Rob Fahey, Contributing Editor, Friday 19th December 2014

Recently, my smartphone started acting up. I think the battery is on the way out; it does bizarre things, like shutting itself off entirely when I try to take a picture on 60 per cent battery, or suddenly dropping from fully charged to giving me "10 per cent remaining, plug me in or else" warnings for no reason at all. I can get it fixed free of charge, but it's an incredibly frustrating, bothersome thing, especially given how much money I've paid for this phone. Most of us have probably had an experience like this with a piece of hardware: a shoddy washing machine that mangled your favourite shirt, a shiny new LCD screen with an intensely irritating dead pixel, an Xbox 360 whose Red Ring of Death demanded a lengthy trip back to the service centre. There are few of us who can't identify with the utter frustration of having a consumer product that you've paid good money for simply fail to do its job properly. Sure, it's a #FirstWorldProblem for the most part (unless it's something like a faulty airbag in your Honda, obviously), but it's intensely annoying and certainly makes you less likely to buy anything from that manufacturer again.

Given that we could all probably agree that a piece of hardware being faulty is utterly unacceptable, I'm not sure why software seems to get a free pass sometimes. Sure, there are lots of consumers who complain bitterly about buggy games, but by and large games with awful quality control problems tend to get slapped with labels like "flawed but great", or have their enormous faults explained in a review only to see the final score reflect none of those problems. It's not just the media that does this (and for what it's worth, I don't think this is corruption so much as an ill-considered aspect of media culture itself); for every broken game, there are a host of consumers out there ready to defend it to the hilt, for whatever reason.

"It's not just the media that does this...for every broken game, there are a host of consumers out there ready to defend it to the hilt, for whatever reason"

I raise this problem because, while buggy games have always been with us - often hilariously, especially back in the early days of the PlayStation - the past year or so has seen a spate of high-profile, problematic games being launched, suggesting that even some of the industry's AAA titles are no longer free from truly enormous technical issues. These increasingly prevalent technical problems are causing genuine damage to the industry. From the botched online launches of games like Driveclub and Battlefield through to the horrendous graphical problems that plague some players of Assassin's Creed Unity, they are giving consumers terrible experiences of what should be high points for the medium; they are creating a loud and outspoken group of disgruntled players who act to discourage others; and they are helping to drive a huge wedge between media (who, understandably, want to talk about the experience and context of a game rather than its technical details) and consumers (who consider a failure to address glaring bugs to be a sign of collusion between media and publishers, and a failure on the part of the media to serve their audience).

We can all guess why this is happening. I don't wish in any way to underplay how complex and difficult it is to develop bug-free software; I write software tools to assist in my research work, and given how often those simple tools, developed by two or three people at most, have me tearing my hair out at 3am as I search for the single misplaced character that's causing the whole project to behave oddly, I am absolutely the last person in the world who is going to dismiss the difficulty involved in debugging something as enormous and complex as a modern videogame. Debugging games has inevitably become harder as team sizes and technical complexity have grown; that's to be expected.

"I don't wish in any way to underplay how complex and difficult it is to develop bug-free software"

However, just because something is harder doesn't mean it shouldn't be happening, and that's the second part of this problem. Games are developed to incredibly tight schedules, sometimes even tighter today (given the culture of annual updates to core franchises) than they were in the past. Enormous marketing budgets are preallocated and planned out to support a specific release date. The game can't miss that date; if there are show-stopping bugs, the game will just have to ship with those in place, and with a bit of luck they'll be able to fix them in time to issue a day-one digital patch (and if your console isn't online, tough luck).

Yet this situation is artificial in itself. It's entirely possible to structure your company's various divisions around the notion that a game will launch when it's actually ready, and ensure that you only turn out high-quality software; Nintendo, in particular, manages this admirably. Certainly, some people criticise the company for delaying software and it does open up gaps in the release schedule, but compared to the enormous opprobrium which would be heaped upon the company if it turned out a Mario Kart game where players kept falling through the track, or a Legend of Zelda where Link's face kept disappearing, leaving only eyes and teeth floating ghoulishly in negative space (sleep well, kids!), an occasional delay is a corporate cultural decision that makes absolute sense - not only for Nintendo, but for game companies in general.

It doesn't even have to go as far as delaying games on a regular basis. There is a strong sense that some of the worst offenders in terms of buggy games simply aren't taking QA seriously, which is something that absolutely needs to be fixed - and if not, deserves significant punishment from consumers and critics alike. Quality control has a bit of an image problem; there's a standard stereotype of a load of pizza-fuelled youngsters in their late teens testing games for a few years as they try to break into a "real" games industry job. The image doesn't come from thin air; for some companies, this is absolutely a reality. It is, however, utterly false to think that every company sees its QA in those terms. For companies that take QA seriously, it's a division that's respected and well-treated, with its own career progression tracks, all founded on the basic understanding that a truly good QA engineer is worth his or her weight in gold.

"Not prioritising your QA department - not ensuring that it's a division that's filling up with talented, devoted people who see QA as potentially being a real career and not just a stepping stone - is exactly the same thing as not prioritising your consumers"

Not prioritising your QA department - not ensuring that it's a division that's filling up with talented, devoted people who see QA as potentially being a real career and not just a stepping stone - is exactly the same thing as not prioritising your consumers. Not building time for proper QA into your schedules, or failing to enact processes which ensure that QA is being properly listened to and involved, is nothing short of a middle finger raised to your entire consumer base - and you only get to do that so many times before your consumers start giving the gesture right back to you and your precious franchises.

Media does absolutely have a role to play in this - one to which it has, by and large, not lived up. Games with serious QA problems do not deserve critical acclaim. I understand fully that reviewers want to engage with more interesting topics than technical issues, but I think it's worth thinking about how film reviewers would treat a movie with unfinished special effects or audio mixed such that voices can't be heard; or perhaps how music reviewers would treat an album with a nasty recording hiss in the background, or with certain tracks accidentally dropping out or skipping. Regardless of the good intentions of the creative people involved in these projects, the resulting product would be slammed, and rightly so. It's perhaps the very knowledge of the drubbing they would receive that means such awful movies and albums almost never see the light of day (and when they do, they become legendary in their awfulness; consider the unfinished CGI of the Scorpion King at the end of "The Mummy Returns", which remains a watchword for terrible special effects many years later).

Game companies, by contrast, seem to feel unpleasantly comfortable with releasing games that don't work and aren't properly tested. Certain technical aspects probably contribute to this; journalists may be wary of slamming a game for bugs that may be fixed in a day-one patch, for instance. Yet it seems there's little choice but to make the criteria stricter in this regard. If media and consumers alike do not take to punishing companies severely for failing to pay proper respect to QA procedures for their games, this problem will only worsen as firms realise that they can get away with launching unfinished software.

"We all want a world where technical issues are nothing but a footnote in the discussion of games; that will be the ultimate triumph of game technology"

We all want a world where technical issues are nothing but a footnote in the discussion of games; that will be the ultimate triumph of game technology, when it truly becomes transparent. We do not, however, live in that time yet, and the regular launches of games that don't live up to even the most basic standards of quality are something nobody should be asked to tolerate. The move by some websites to stop reviewing online games until the servers are live and populated with real players is a good start; but the overall tolerance for bugs and willingness to forgive publishers for such transgressions ("we know the last game was a buggy mess, but we're still going to publish half a dozen puff pieces that will push our readers to pre-order the sequel!") needs to change. If we want to talk about the things that are important about games (and we do!), it's essential that we first fix the culture that ignores QA and technical issues.