
Why Resolution Matters – Even When it Doesn’t
Posted on October 31, 2014, by Ron Whitaker

There’s been a lot of talk lately about the importance of screen resolutions.

Assassin’s Creed: Unity is locked to 900p (that is, 900 lines of “progressive scan” vertical resolution, if you’re unfamiliar). Halo 2 won’t be 1080p in the Master Chief Collection. But why is a game’s resolution a big deal? It’s just a number, right? The truth is, resolution matters a lot more than you might think, and it’s not because of what it means to the gameplay — it’s what it means to gaming as a whole.
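To put some numbers behind "it's just a number": here's a quick back-of-the-envelope comparison of the raw pixel counts involved, assuming the common 16:9 frame widths of 1600 (for 900p) and 1920 (for 1080p).

```python
# Pixel counts for common 16:9 resolutions
# (widths of 1600 and 1920 assumed for illustration)
resolutions = {
    "900p": (1600, 900),
    "1080p": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels["900p"])   # 1440000 pixels per frame
print(pixels["1080p"])  # 2073600 pixels per frame

# A 1080p frame contains ~44% more pixels than a 900p frame
ratio = pixels["1080p"] / pixels["900p"]
print(round(ratio, 2))  # 1.44
```

So 1080p asks the hardware to render roughly 44% more pixels every frame than 900p does, which is why developers trade resolution away to hold frame rate or visual effects.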

First off, let me acknowledge that anyone who says “Games are about gameplay, not graphics” is completely correct. Graphics don’t make a game. However, you can’t simply dismiss the people who are upset about games not hitting 1080p. After all, where did those expectations come from, if not the very companies who are now saying that resolution doesn’t matter?

Around the launch of the PlayStation 3 and Xbox 360, both Sony and Microsoft were showing off tech demos with amazing graphics. Back then, those consoles were a giant leap forward in graphical quality. But the same thing happened around the PS4 / Xbox One launches, and those consoles were not huge upgrades. That’s when the grumbling started.

Gamers were (understandably) upset to discover that this amazing “next-generation” machine for which they’d paid $400 to $500 couldn’t always max out the resolution on their four-year-old TV sets. It wasn’t that they thought games were necessarily worse in 900p; they just wondered what they had shelled out all that money for.

We’ve been marketed to for years on the idea that better graphics equal better games. Sure, we all know that’s not true, and that gameplay matters more. But at the same time, you can’t deny that you’d rather play a game that looks like Battlefield 4 than one that looks like Battlefield 1942, assuming that the gameplay was of roughly the same quality (which, let’s be fair, it often is). Developers and publishers know this, and so every time they release an upgraded version of a title (Battlefield and Call of Duty are great examples of this), one of the biggest things that gets highlighted is new graphics, or a new engine.

Now that games aren’t able to consistently deliver that graphical upgrade anymore, gamers are rightly feeling a little let down, and the publishers have no one to blame but themselves. But instead of having a realistic discussion about how hardware has begun to plateau, we see comments like those from Far Cry 4 Creative Director Alex Hutchinson, who recently said, “It feels weird to me that people are cool about playing a sort of retro pixel game, and yet the resolution somehow matters.” Which is a fair statement, but kind of misses the point.

When you’re sitting down to play a “retro pixel game,” you’re inherently aware that the game isn’t going to be some sort of high-resolution masterpiece. You’re playing it for the gameplay, or the art design, or whatever you like. That’s not the case for current-gen, AAA releases. You’ve been bombarded with high-quality trailers with mind-blowing graphics, only to find that the reality is Assassin’s Creed: Unity in 900p. Your expectations have been built up by the marketing thrown at you for over a decade, and now the publishers who did it say that resolution doesn’t matter. Is it any wonder gamers are frustrated?

The exponential gains we’ve seen in the hardware of past generations simply aren’t there for the taking anymore, and that’s a reality to which developers, publishers, and platform owners are going to have to adjust. If they want to put the “It’s not 1080p” discussion to rest, it’s time to stop showing off trailers that games can’t possibly live up to. It’s time to stop making excuses like “30 frames per second is more cinematic.” It’s time to admit that it’s not necessary to try to overwhelm gamers with graphics, because you can’t do it anymore.

If you, the developers and publishers, really want your games to be “next-gen,” then you’re going to have to do it with gameplay. Middle-earth: Shadow of Mordor takes a step in that direction with its Nemesis System, but that’s just an inkling of what could be possible if developers commit to it.

Let’s focus more on the games. After all, you’re all saying it’s about gameplay now, right? Time to prove it.