When we game critics get together and talk at our clandestine meetings underneath our secret mountain base, we often debate whether our reviews actually have a tangible impact on how many people buy a game. Some will point to low-selling critical darlings like Psychonauts and healthy sales for crappy licensed games as proof that the public doesn't, as a whole, listen to reviewers all that much. Those on the other side will point to strong reviews that helped propel games like Super Meat Boy to sales success, or critical drubbings that seem to have hurt sales for games like Too Human.

Regardless of how much the gamers themselves seem to care, many publishers seem to give outsized attention to review scores, going so far as to use them in determining bonuses for developers. It's a practice that puts too much focus on those scores as a final arbiter of a game's quality, and one that really needs to stop.

Obsidian's near miss

Linking a developer's bonus to a game's Metacritic score is nothing new—the practice has been widely reported in and around the industry for years. But the issue constantly pops back into the public attention as new instances of the practice come to light.

Today, the focus of that attention is developer Obsidian Entertainment, whose chief creative officer Chris Avellone recently tweeted that the developer didn't receive a bonus for 2010's Fallout: New Vegas from publisher Bethesda because the game didn't reach a target Metacritic score of 85. The actual Metacritic average for the game currently stands at 84 (that's for the PC and Xbox 360 versions; the PS3 version sits at a slightly worse 82).

Obsidian reportedly let go of 20 to 30 people this week, after a similar round of layoffs a year ago. While it's not clear that you can draw a straight line from the company's missed bonus to its recent staffing difficulties, it's not hard to see that Obsidian definitely could have used that extra money.

Improper tool use

By tying a bonus to a specific Metacritic score, publishers seem to be saying that they think positive reviews will have a direct, positive impact on their bottom line through increased sales. To be fair, there is some evidence that review scores do broadly have an effect on game sales, both in a laboratory setting and in real-world examples.

But it's hardly a direct, repeatable relationship, as demonstrated by the numerous situations where review scores and sales fail to line up. A game with an 85 average on Metacritic is definitely not guaranteed to sell better than a game that scores an 84, even if things like genre, franchise popularity, and marketing budget were all somehow held exactly equal.

You could argue that review scores are the best way to judge a developer's output independently of other factors that might influence sales, but that's not always the case. Any critic will tell you that being forced to distill the complex and involved experience of playing a game into a single number is often a fool's errand, and that the difference between a game that gets a 7 out of 10 and one that gets an 8 is often incredibly slight. Aggregating these scores into a single weighted average, as Metacritic does, just adds another layer of abstraction from the critic's actual determination.

More broadly speaking, though, focusing on a single number can obscure a success that's sitting right in front of one's nose. Out of 81 reviews for the Xbox 360 version of Fallout: New Vegas listed on Metacritic, 72 are characterized as "positive," with nine said to be "mixed" and zero "negative." The game went on to ship 5 million copies and attract $300 million in revenue in its first month. If that isn't a critical and commercial success worthy of a bonus for the developer, I don't know what is.

To be clear, I don't have a problem with Metacritic itself in this matter. The site is just a tool, and one that can be useful as a quick, rough heuristic guide to the general critical consensus about a game. It's the publishers that are using this tool improperly, as some sort of final, objective arbiter of the quality of the games their developers are putting out. While it's tempting to look to a third-party source for this kind of important determination, Metacritic isn't it.