There appears to be a furore building on the Internet at the moment over whether games publishers should be patching games or not. It seems that a disgruntled few in the gaming community believe that the "new world" of online games consoles is allowing publishers and developers to be sloppier in making games, and that some games are shipping half-finished. It's an interesting one for me as I've been a player of MMORPG games for years, so having patches and updates download is nothing new. However, for console gamers this is a whole new world.
So before we look at the arguments and points being raised, let me just say one thing. We're not talking about the release of new game mechanics or additional content here; we're purely talking about fixing problems and removing bugs. So essentially we're looking at game developers discovering problems and applying fixes for them. So let's take a look at some of the arguments here:
Impact on non-online players?
As things stand at the moment, all of these new features are being rolled out via some form of online service, which is all well and good if, like me, your games console (an Xbox 360 in my case) is always connected to the Internet. However, what happens if you can't, or choose not to, connect your console to the Internet? Then you're stuck with a game that, potentially, has game-crippling bugs and problems.
Imagine the frustration of having played a game for 20-30 hours and then finding you can't complete it because of a bug. Very frustrating, and probably very costly for the developers (let's face it, how many people would buy a sequel to a game if they had hit this problem in the original?). So the impact on non-online players needs to be considered here.
MMOs have always had patching
As I mentioned in my introduction, I have played Massively Multiplayer Online games for a number of years. When you buy an MMO, the developer knows that you have an online connection, and therefore knows that any updates that they want to apply can be easily distributed to you. For those not familiar with MMOs, you tend to have a "launchpad" application to start the game. Before you can enter the game world this launchpad will check the versions of the game files that are in use, and if there are any newer files you have to download them.
Refuse to download them and you can't enter the game world, simple as that. This is where things are different with the console world too. Sure, you can stop someone from playing in an online multiplayer game if they don't download patches, but you can't stop them playing through a "campaign" style game by themselves just because they won't apply fixes. With MMOs, patching is a way of life. It's easy to fix problems and easy to roll out these fixes due to the nature of the customer base and the product itself. Not so with a console.
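The version check that a launchpad performs can be sketched roughly like this (a minimal illustration, assuming hypothetical manifest file names and integer version numbers; real launchers use checksums and download the actual files):

```python
def files_to_update(local, remote):
    """Return files whose remote version is newer than the local one,
    plus any files the client doesn't have at all."""
    return sorted(
        name for name, version in remote.items()
        if local.get(name, 0) < version
    )

# Hypothetical manifests: what the client has vs. what the server offers.
local_manifest = {"game.dat": 3, "world.dat": 7}
remote_manifest = {"game.dat": 4, "world.dat": 7, "patch_notes.txt": 1}

outdated = files_to_update(local_manifest, remote_manifest)
if outdated:
    # A real launchpad would download these before launching;
    # refuse the download and you don't get into the game world.
    print("Must download:", outdated)  # → Must download: ['game.dat', 'patch_notes.txt']
else:
    print("Up to date - launching game")
```

The key point is that the check is a gate: the launcher simply never starts the game client until the local manifest matches the server's, which is exactly the enforcement a disc-based single-player game can't have.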
Would Windows users complain?
It's odd, as I can't imagine any other group of users (Microsoft Windows users, for example) complaining about having free fixes applied to the piece of software that they use. However, with gamers it seems that they feel differently (I'm not going to get into a discussion about the merits of the relevant License Agreements at this point and who actually "owns" the product). It seems that gamers feel like they have been "conned" if they have bought a particular game and the developer then admits that there is a problem and supplies a fix for it. I've never bought into that mentality myself, but I do know that it exists. For me, I'd rather have a game that functions as well as possible, even if this means downloading a small fix for a problem.
Is the game finished when it goes gold?
So this is what the crux of the discussion seems to be for some people. When a game goes gold, it is deemed to be ready for retail, and the majority of the bugs are deemed to have been worked out of the product. And there's a key word in that sentence: the "majority" of bugs have been worked out. I can't imagine any game developer (or software developer, for that matter) ever putting their hands up and saying that their software is 100% bug-free. It just never happens.
What some gamers seem to think, though, is that they have paid for a product and therefore it should be perfect. How could developers ever consider rolling something out that has bugs? (Some people even hold the opinion that a game isn't finished if it has bugs in it – these could be the same people who say that development cycles are currently too long!) And do they really believe that other consumer products are rolled out with absolutely zero problems?
So is it good or bad?
Well, I guess it depends on your point of view. For me, I'd say it's absolutely a good thing. I'd much rather have a steady stream of games appearing, and allow the developers to apply fixes as they need to, than wait for months between game releases. And let's face it, it's not every game that falls foul of this. For me it comes down to a matter of expectations. I expect a game that I buy to be at a certain quality level, and I accept that there may be some problems that need fixing. However, others want the utopia of having perfect games available, when they want them, at a price that they want to pay. Obviously these people have never worked in software development!