The Evil Within and Shadow of Mordor are both asking for insane amounts of VRAM.

WTF is going on?

It comes down to how the Xbox One and PS4 are architected, and how games built for them are being ported to the PC.

The next generation of consoles isn't that powerful. The AMD CPU and GPU combination that powers them both is basically a cut-down version of the Radeon 7870, a card you can buy for about $130.

But there’s a difference. The consoles have a single pool of RAM, 8 gigabytes, that can be shared by both the CPU and the graphics hardware. PCs tend to have a similar or larger total amount of memory, but that RAM cannot be pooled: your VRAM and your system RAM are not additive.

On consoles, developers are free to use their RAM allocation as they see fit. Understandably, that means increasing the size and quality of the game’s video assets; they want to get the best from the resources in front of them. On PC, care must be taken that the assets needed to render a frame do not exceed the total VRAM.

Your PC might have 8 gigabytes of system RAM and 1GB of VRAM, but that system RAM isn’t accessible to a game’s renderer without some complicated, and slow, movement of data around the system. Moving data between the GPU and system RAM is far slower than moving data between the GPU and its own VRAM. To work around this, developers can compress, shift, delay and stream data to avoid the hard limit, but eventually they will run up against it.
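To see how quickly console-quality assets blow past a 1GB card, here is a rough back-of-the-envelope sketch. The texture counts and sizes are illustrative assumptions, not figures from any shipped game; real engines use block compression and streaming, which shrink these numbers considerably.

```python
# Illustrative arithmetic only: uncompressed RGBA textures,
# with a full mipmap chain adding roughly one third on top
# of the base level. Not numbers from any real game.
def texture_bytes(width, height, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM footprint of one uncompressed texture."""
    base = width * height * bytes_per_texel
    return int(base * 4 / 3) if mipmaps else base

# One uncompressed 2048x2048 RGBA texture with mips: ~21 MB.
one_2k = texture_bytes(2048, 2048)

# A hypothetical scene keeping 300 such textures resident:
working_set = 300 * one_2k

print(f"2K texture: {one_2k / 2**20:.1f} MB")
print(f"300-texture working set: {working_set / 2**30:.2f} GB")
```

Even with generous compression, a working set built around console-sized art assets lands in the multi-gigabyte range, which is exactly where these new minimum specs are coming from.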

You can blame the PS4, if you like. Sony’s decision to deliver a console with 8GB of shared GDDR5 (read: very, very fast) memory will, over the next few years, create a clear space between their platform and the Xbox One. Developers are going to absolutely gorge themselves on that memory.

They may already be doing so.

What I, and a few friends, suspect is going on, particularly with The Evil Within, is that the game has been developed primarily for the consoles, and the PC port is probably a bit of an afterthought. Over the last few years, cross-generational games like Assassin’s Creed: Black Flag, or FIFA, or even Call of Duty, have benefited from the range of work artists and developers are required to do to cover last-gen ports. Essentially, the PC version could pick and choose from the textures built for both the PS3 and PS4 versions of the game. The Evil Within is not being developed for previous-gen consoles; it’s just for the Xbox One and PS4. Meanwhile, survival horror games have traditionally not done exceptionally well on PC.

It may be that Tango Gameworks, with their console background, decided that creating lower-resolution textures, or doing the optimisation and compression work required to get the game looking great on lower-spec PCs, simply wasn’t worth the return on investment. I’ve asked to chat to someone from Tango, but haven’t yet had a response.

Mordor, meanwhile, is a more interesting case. Shadow of Mordor wants 6GB (!) of VRAM to run with a freely downloadable, optional ultra texture pack. Monolith have a PC heritage, understand the size of the PC audience, and have done the optimisation and texture work to make it work. They’ve then gone above and beyond to essentially say: “here’s a version of the game, the Ultra version, which is an amalgam of the Xbox and PS4 memory usage, plus very high-end textures. Here you go: see if it brings your PC to its knees.”

So how big a problem is this VRAM stuff?

Let’s look at the latest Steam Hardware Survey results. Only 1.6% of the sampled Steam user base has 4GB of VRAM – the recommended requirement for The Evil Within.

So how do we fix it?

The simple answer is that you can’t. Don’t assume that doubling up GPUs, via Crossfire or SLI, will theoretically double your VRAM and solve the problem. VRAM cannot be pooled that way: each card holds its own copy of the frame’s assets.

Now, this won’t be as much of a problem for games built for the PC first. It’s only going to show up when games are architected for the consoles, or more likely the PS4, first.

The only real option is for Nvidia and AMD to increase the VRAM available on the cards they’re selling, and to do it quickly. Note that even a high-end card like the GTX 780 only has 3GB of addressable VRAM. If you’re in the market for a new graphics card, look very, very carefully not just at the synthetic benchmarks, but at the real-world results of the card under stress in modern console ports.

It’s worth considering upgrading to Windows 8.1. In that version of Windows, DirectX was updated to support tiled resources: a bit of a hack that helps with texture streaming. Again: wait for benchmarks to see whether it makes a real difference.

I’d also hold off on buying any major console ports until at least a few days after release. We regularly run inspections of the quality of a port, and we’ll continue to do so.

These are not easy problems to solve, particularly as developers start cutting support for the last generation. I think we’re going to see some wild benchmarks, particularly from this year’s selection of open world games. Digital refunds are rare. Buyer beware.

And prepare for the drama llama.

With thanks to Adam Oxford