

It surprises me how many gamers complain about the prices of consoles and their accessories, and it’s not the “casual gamers” I hear crying. It’s the ones who demand a smoother gaming experience from console manufacturers while sinking over $1,000 into their PCs for the sake of “true gaming experiences.” Up until a few weeks ago I kept asking myself why consoles are even a viable part of video games today. I mean, really, in terms of raw performance PCs are obviously superior, so that absolutely could not be the reason. This is magnified by the test of time that consoles are constantly struggling against, and in a world where technology advances faster with every generation, that problem will only get worse. As far as game libraries go, PCs arguably have the upper hand as well, since platforms like Steam or (dare I say) Origin host a huge variety of games that span most consoles through third-party development. So then, what do consoles have that is so immensely important to the gaming industry that the video game world as we know it revolves around their future? Simply put: accessibility.

PCs are vicious monsters that take pride in the horrific greed they hold deep within their protruding guts, but they clean up well. By comparison, consoles are like weak heroic knights that hide their vulnerabilities underneath their armor, but still kinda suck balls. Yet consoles make up for it by providing a manageable bridge between the average consumer and that oh-so-glorious PC gaming. If it weren’t for consoles, this industry would not even exist, because PC gaming is not only complex to the average gamer but also crazy expensive. Console-to-PC gaming is kind of like a really twisted free-to-play psychology system: every time I feel like I’m having a great time being a blind, but enthused, console gamer, I get disturbed by just how much better the game I’m playing looks in its PC port. So now you’re probably saying, “But Josh! Graphics don’t matter!” or “Keep lapping up those tears, you dirty graphics whore.” And I’d have to say you’re probably right on both counts, but keep in mind that video games today focus on cinematic presentation more than they ever have before.

Take this example: a popular new movie has just opened at your local cinema, but it just so happens that your local cinema is terribly run down, slightly overpriced, and doesn’t play the movie you want to see in 3D. Fortunately, the better-funded cinema several miles away eliminates all of the problems your older theater has, and it has IMAX capabilities to boot. The only problem is the significantly longer drive, which costs you more in the long run; still, the experience you’ll have while watching your movie makes it worth it. I don’t want an inferior experience. What’s slightly twisted about this whole situation is that many video games today depend on their cinematic presentation in order to deliver “fun” experiences, which should be inherently false… but looking at the lineup of critically and commercially successful games of the past decade, I’m not so sure anymore. Games like Uncharted, Assassin’s Creed, Batman, Fable, and God of War are absolutely, 100%, undeniably defined by their presentation first and foremost. What is so perplexing about these games is that they are all regarded as generation-defining titles, and although that may be true, it still doesn’t make it a good thing. The gameplay in each of the previously mentioned titles just happens to be kinda poop. Imagine playing Assassin’s Creed as a colorless polygon: you’d be left with an experience that practically jumps and progresses through the game for you, with no real sense of innovation or platforming. The same goes for the Uncharted series, where jumps and platforming puzzles are constructed purely to deliver more “cinematic experiences,” even though they accomplish the exact opposite. Cinematic experiences are exactly what video games should avoid chasing in their development; they strip away what makes video games interactive and unique. I mean, they take the game out of video games.

Ever wonder why some retro games from the ’80s and ’90s manage to keep their fun and their value? It’s not because of cinematic experiences, that’s for sure. Twenty years from now, the technology driving games forward will be leagues above the graphical fidelity we see in AAA titles today, and when the pretty colors we see in Fable suddenly start looking not so pretty, there won’t be much left to that game at all. I mean, really, even by today’s standards Fable 2 is already looking rough, and don’t even get me started on how Fable 1 looks now. Which brings me back to my point: in a cinematic video game world, PCs unfortunately are the ideal way to play video games. They immerse the gamer in a world that was developed specifically for that purpose, and I’m all for immersion, but come on, AAA third-party developers… give me something that can be just as amazing regardless of its resolution.