Four years is a long time by any measure, and in gaming terms, it's practically an eternity. But four years ago, I remember huddling with a friend at his place on a rainy evening and streaming The Samaritan tech demo. Even though the HD video stopped to buffer every ten seconds on our 512 Kbps connection, I distinctly remember my jaw dropping to the floor. And then some. It was epiphanic. This was it. This was the future. This was next gen, rendered in real time at a glorious 60 FPS on three GTX 580s in SLI. I didn't have a PC of note back then, and coming from Just Cause 2, Infamous, and Battlefield: Bad Company on the PS3, The Samaritan was as big a perceived leap in graphics quality as those seventh-gen titles were over the PS2. Although we saw Infiltrator later, and the Elemental demo for Unreal Engine 4 right before the PS4 and Xbox One hit the market, The Samaritan had always set my standard of expectations for the next gen.

Infiltrator: This is what next-gen is supposed to look like

Now, I hate to be a doomsayer, but I don't think we're going to see even a single multi-platform title this generation that approaches the bar set by The Samaritan in terms of graphics fidelity. Those graphics are, in all likelihood, going to be out of reach for years, and the most disappointing part is that it's not even because the hardware isn't available. It's simply because Sony and Microsoft (particularly Microsoft) decided to stuff bargain-basement parts into little black boxes and call that a generational upgrade.

The kind of generational leap we saw between the PS2 and the PS3, or even between the original PlayStation and the PS2, has categorically not happened this time.

PS2 to PS3 on the left; PS3 to PS4 on the right: The first is a real generational leap. The second, not so much.

Nine years ago, the PS3 was a pretty awe-inspiring piece of kit from whichever angle you looked at it: the Cell processor was powerful enough that the US Air Force used arrays of PS3s to run simulations, and the RSX GPU was an upper mid-range part for 2006. Consoles back then were a genuine value proposition. It reportedly cost Sony approximately $800 to build each PS3 around launch; the machines were sold at a loss, the idea being that game sales would recoup it and turn a profit over time. You had to build a $1,000 monster rig to best the consoles. Today, that value proposition has been turned on its head. You can build a PC that will best the PS4 and Xbox One for as little as $400, and even the Alienware Alpha, with a laptop GPU and an i3, outdoes them. Numbers aside (though those numbers are pretty damning by themselves), what does all this mean for gaming?

CD Projekt Red’s Marcin Iwinski said that The Witcher 3, a 2015 title, is “close to maxing out the consoles,” and that about sums it up.

The PS3 and Xbox 360 have been going strong all these years precisely because they were high-end machines at launch. That's why devs still manage to squeeze out that little extra bit of juice to give us wonders like GTA 5 (which you should totally check out on PC). Not so with the "next gen."

With their lower mid-range GPUs and truly appalling processors (in specific number-crunching workloads, the PS3's Cell actually beats the PS4's CPU), these consoles are going to run into a brick wall any time now. By some accounts, they already have. The PS4 and the Xbox One are supposedly "1080p native" consoles, but even at launch, many titles ran at 900p or, in the case of the Xbox One, even 720p. The debate was dubbed 'Resolutiongate,' and any number of threads and articles online weighed the relative merits of 1080p versus 900p, but the bottom line is this: eighth-generation consoles lack the hardware grunt to run even launch-era titles at native resolution. And that is very, very bad news going forward.
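For a sense of scale, the gap between those resolutions is larger than the labels suggest. A quick back-of-the-envelope calculation (plain arithmetic, not benchmark data):

```python
# Pixel counts for the resolutions at the heart of 'Resolutiongate'.
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# Dropping resolution cuts the per-frame shading workload roughly in proportion.
print(f"1080p vs 900p: {pixels['1080p'] / pixels['900p']:.2f}x the pixels")  # 1.44x
print(f"1080p vs 720p: {pixels['1080p'] / pixels['720p']:.2f}x the pixels")  # 2.25x
```

In other words, a "900p" game is shading roughly 44% fewer pixels per frame than a true 1080p one, and a 720p game less than half as many, which is why the downgrade was such an attractive escape hatch for struggling hardware.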

Each console generation has certain defining traits that categorically set its games apart from the previous generation, because these are things the previous generation simply could not do. In the seventh gen, almost all PS3 and Xbox 360 titles did HDR lighting, dynamic shadows, and ragdoll physics at the very least. For the eighth gen, physically based rendering (PBR) is supposedly one such defining trait, but the few games that have actually implemented it, such as AC Unity and Ryse, are plagued by performance issues. Next-gen graphics don't count for much if your game's running at 20 FPS.
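PBR is less a single feature than a family of physically grounded shading models, and its per-pixel cost is part of why those games struggle. As a rough illustration only (not any engine's actual code), here is a toy Python sketch of the Cook-Torrance specular term with the GGX microfacet distribution that PBR pipelines commonly build on; the function name and the k = α/2 geometry approximation are my own illustrative choices:

```python
import math

def ggx_specular(n_dot_h, n_dot_v, n_dot_l, v_dot_h, roughness, f0=0.04):
    """Toy Cook-Torrance specular term: GGX distribution, Schlick Fresnel,
    and a Schlick-GGX geometry approximation. Inputs are clamped dot products
    of the normal (N), view (V), light (L), and half (H) vectors."""
    a = roughness * roughness  # common remapping: alpha = roughness^2
    # D: how tightly microfacets concentrate around the half vector
    d = (a * a) / (math.pi * ((n_dot_h * n_dot_h) * (a * a - 1) + 1) ** 2)
    # F: reflectance rises toward grazing angles (Schlick's approximation)
    f = f0 + (1 - f0) * (1 - v_dot_h) ** 5
    # G: microfacet self-shadowing (one common choice of k)
    k = a / 2
    g = (n_dot_v / (n_dot_v * (1 - k) + k)) * (n_dot_l / (n_dot_l * (1 - k) + k))
    return d * f * g / (4 * n_dot_v * n_dot_l)

# Smoother surfaces produce a sharper, brighter highlight straight-on:
print(ggx_specular(1.0, 1.0, 1.0, 1.0, roughness=0.2))
print(ggx_specular(1.0, 1.0, 1.0, 1.0, roughness=0.8))
```

Evaluating something like this (plus the diffuse term, image-based lighting, and so on) for every pixel, every frame, is exactly the kind of workload that a lower mid-range GPU chokes on at native 1080p.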

AC Unity delivers next-gen lighting—though it barely runs on consoles

In the end, the only people who are going to suffer are gamers. Those on consoles are going to be disappointed when games in 2016, 2017, and 2018 look…more or less the same as they have for years. But as is often the case, PC gamers are going to get the short end of the stick too. The irony is that, with behemoths like the GTX 970 already at affordable price points and DirectX 12 right around the corner, PC gamers have an enormous amount of spare horsepower on hand right now. Yet unless you like to game at ludicrous resolutions on 4K or multi-monitor displays, all of that extra horsepower is going to sit idle for the foreseeable future. Devs who used to be PC-centric, like CD Projekt Red and Crytek, are increasingly shifting toward multi-platform releases. Economics dictates that games have to work on the lowest common denominator, the consoles, and that means accepting all of their inherent limitations.

All those years ago, when I saw The Samaritan for the first time, I was hoping for a next-gen miracle: real graphics, real environments, and at least slightly smarter AI. I truly was. But with eighth-gen consoles the way they are, The Samaritan remains little more than a far-off dream. On the plus side, if you're planning on buying a high-end graphics card this year, you probably won't have to upgrade until sometime in the 2020s, as long as you're content with 1080p/30.