I remember the fiasco shortly after Modern Warfare 2 was released: everybody and their uncles were running the Javelin and Semtex so they could exploit a glitch that automatically killed any player who killed the exploiter. We had to wait days for a patch to come through, and we weren't waiting for the patch to be written; we were waiting for it to be certified by the console providers. This is just one example: Battlefield 3 and an endless list of other games have seen patches and updates held up simply because of certification issues. The certification process is designed to put patches and updates through levels of testing so that all software is held to consistent standards and remains robust, and so customer perception of the console remains unsullied (yellow lights, red rings, and melting processors notwithstanding). But does the certification process accomplish that goal, and is it really worth keeping around?

Time and Money Wasted by Developers
We may not pay much attention to it or realize how much it affects us, but developers are forced to jump through a lot of hoops to meet certification standards, even for the original release of a game. When discussing this area of development, the creator of Braid specifically pointed out the obligatory opening screen in nearly every game warning us not to shut down the console while a save symbol is on screen.

Developers spend hours building this one little required bit (the Braid developer claims it adds up to millions, if not billions, of hours across the spectrum of released games). Granted, given the way the save system works on all of the major consoles, this warning, and the habit of not powering off mid-save, is important and prevents corrupted save files. It is therefore in Microsoft's, Sony's, and Nintendo's best interests to make sure it is included in all software released on their consoles. But if console providers care so much about the end-user experience, why not just change the way save files work?

Without getting horribly technical about it all, corrupted saves are created when the auto-save feature is interrupted. Because the save system writes on top of the existing save file, any break in the procedure can be disastrous: many hours of hard work in a game disappear in a flash. But why couldn't console providers change the way save files work? Why can't consoles switch to a method whereby a new save file is created and the old one is not removed until the integrity of the new file has been confirmed? Or why not keep two save files, alternating back and forth so one always remains as a failsafe?
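To make the first idea concrete, here is a minimal sketch in Python of the "write new, verify, then swap" scheme (consoles obviously don't run Python, and the function names here are my own invention): the new save is written to a temporary file, checked for integrity, and only then atomically swapped over the old one, so an interruption at any point leaves the previous save intact.

```python
import json
import os
import tempfile

def safe_save(path, data):
    """Write save data to a temp file, verify it, then atomically
    replace the old save. An interrupted write leaves the old file intact."""
    dirname = os.path.dirname(os.path.abspath(path))
    # Write the new save to a temporary file in the same directory
    fd, tmp_path = tempfile.mkstemp(dir=dirname, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(data, f)
            f.flush()
            os.fsync(f.fileno())  # force the bytes to disk before the swap
        # Confirm the new file's integrity before touching the old one
        with open(tmp_path) as f:
            json.load(f)
        # Atomic replace: the old save exists right up until this one step
        os.replace(tmp_path, path)
    except Exception:
        os.remove(tmp_path)  # clean up the half-written file
        raise

# A crash anywhere before os.replace() leaves the previous save untouched
safe_save(os.path.join(tempfile.gettempdir(), "progress.sav"),
          {"level": 4, "gold": 120})
```

The alternating-two-files idea works the same way in spirit: you only ever overwrite the older of the two slots, so the newer one always survives as the failsafe.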

It should be known that I am no technical savant, but I have heard these arguments made by people with real technical expertise. So, if the end-user experience is the point of the certification process, shouldn't console manufacturers stitch the proverbial wound rather than just apply a bandage? Fixing the save system at the platform level could save developers hundreds of hours, hours better spent making a game profitable, or simply worth playing, than on meeting a certification checklist.

Expenses with No Offsetting Income
In one of my early blogs, I wrote about the costs required to develop, market, and sell a video game. What I did not discuss much were the costs that come after a game has already been sold. On top of the royalties publishers pay to console providers for the privilege of releasing a game on a console, publishers must also pay to have that software certified, and every future patch or update costs money twice over: once to develop it and again to certify it. In some cases, those costs are so large that a publisher decides the patch is not important enough to release at all.

Recently, the game Fez encountered a problem whereby a small percentage of users would lose their save file and have to start all over again. Doing the responsible thing, the developers prepared a patch to address the problem. Unfortunately, Microsoft's astronomically large certification fees were so high that Fez developer Polytron decided it would have to withhold the patch.

Preventing Patches that Need Patching
On occasion, the certification process actually does help: it prevents the release of hasty patches that themselves need patching. But even this has seen problems recently; take Skyrim. Skyrim had a number of patches released shortly after launch, and each successive patch seemed to break something else. Most of us uninstalled the patches and played without them until a fix was found, developed, and certified all over again. It leaves us asking: does the certification process really help all that much? Android and plenty of other platforms seem to do just fine without one, so why can't we leave it in the developers' and publishers' hands to make the game they want to make?

What the Future Holds
Console providers have admitted in the past that the certification process is archaic and have stated their intention to change it in future consoles. Perhaps the massive support behind Ouya will wake console providers up to the idea that players don't care as much about a certification process as they think we do. Personally, I doubt much will change. The process may be streamlined, or more personnel added to push certifications through faster, but I don't foresee any massive overhaul.

What do you think? Is the certification process a necessary one that helps sales and promotes a better gaming experience, or just a waste of time and money when most developers need every dollar to keep from going underwater?