This article was originally published in the January 2020 issue of PC Gamer magazine. For more quality articles about all things PC gaming, you can subscribe now in the UK and the US.

For almost as long as I can remember I've tried to push games past the limits of my PC. These days that means trying to run them at 4K on a graphics card that's not up to it. It means bringing framerates to a stumbling halt whenever anything interesting happens, never lowering detail levels beyond 'medium'.

It's a compulsion that goes back a long way. I was one of the lucky kids who got to use the PC their mum bought for word processing and 'homework' to play games on: a 286 with 1MB RAM, later upgraded to a 386 with 4MB, a CD-ROM drive and a Sound Blaster 16 sound card.

On this latter machine I attempted to run Doom, a game that really wanted a 486. Graphics cards hadn't been invented yet, but neither had proper detail levels. Doom ran in hi-res and lo-res modes, and if you wanted to reduce the strain on that passively cooled CPU any further, you could reduce the size of the viewable area by literally walling it in, grey-green bricks filling the space between the game and the edges of the screen. This had the desired effect, but on a 14-inch monitor it was a dismal experience.

So it was lo-res mode in fullscreen. I became intimately familiar with every frame of the shotgun reload animation while cacodemons jerkily canted left and right, their fireballs appearing halfway through their trajectories and taking me by surprise.

For the majority of the first episode—the shareware one—it was a playable experience, a few zombies or imps proving less of a strain on resources. Only when the Shores of Hell and Inferno were attempted did my mind turn to upgrades.

Stutter island

Another game, supplied on CD-ROM for a whizzy loading speed that gave no indication of the jerky horrors to come, was Syndicate. The signs were there from the start, as the intro took a deviation from smoothness. The camera lifted off the ground and the West Gate opened to reveal the Leonardo building... it all went by in three jerks. Looking at that intro movie today, though, it doesn't seem to run exactly smoothly for anyone.

Syndicate is a game in which four tiny men run around a city shooting other tiny men, whom you need to identify as civilians, police, or enemy agents. The PC version had the graphical edge over the Amiga here—at least you could tell vehicles apart without having to check their descriptions—but that meant the game had to run at a decent resolution so all the important detail could be appreciated. So when lots of firing was happening at once, or when bombs went off, it once again slowed to a crawl.


But it didn't matter. I could still play the damn thing even if my friends were getting their jollies from something called 'Mario Kart'.

Automatic Execution

The tuning of autoexec.bat files to squeeze a game into memory used to be a common thing. PC RAM was once split into conventional, extended, and expanded memory, and knowing what a game wanted of each was key to getting it to run nicely. Autoexec files run at PC startup and contain commands such as LOADHIGH that would push device drivers and the like into upper memory, freeing up conventional memory for the game. Remember, 640KB is all the RAM you'll ever need.

You could also use them to set the interrupt request (IRQ) and direct memory access (DMA) channels used by sound cards—some games let you tweak these settings, others didn't mind what you used, and still others demanded precise values or they wouldn't work. See, in my day, you used to have to earn your gaming time. People these days don't know how good they've got it...
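A minimal sketch of the kind of autoexec.bat this tuning produced—the driver path and the Sound Blaster values here are illustrative examples, not a recommended setup:

```bat
@ECHO OFF
REM Load the mouse driver into the upper memory area, keeping
REM conventional memory (that precious 640KB) free for the game.
LOADHIGH C:\DOS\MOUSE.COM

REM Tell games where the sound card lives: port 220h, IRQ 5,
REM 8-bit DMA channel 1, 16-bit DMA channel 5, card type 6
REM (a Sound Blaster 16).
SET BLASTER=A220 I5 D1 H5 T6
```

Games that read the BLASTER environment variable would pick these settings up automatically; fussier ones made you type the same numbers into their own setup programs.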

Playing on inadequate equipment somehow gets to the heart of PC gaming: the fiddling required—back then with autoexec files, these days with graphics settings—to get games running in a satisfactory way, or maybe to get them running at all. Being prepared to put up with ridiculous framerates and detail levels rather than buy a PlayStation is surely the mark of the dedicated PC gamer... though perhaps not the most intelligent one. We play games this way not only because we want the best-looking games but because we love to tinker.

And it's a trend that continues. A rumour went round that hardware transform and lighting could be enabled on the Voodoo 3 for a performance boost. I tried it—it didn't seem to make any difference. Morrowind whooshed past in the muddy brown haze that was the best a GeForce4 MX could provide. I did, however, play World of Warcraft on Core 2 Duo integrated graphics, surprisingly successfully.

This century, with various GTX 680s, 970s and a 1070 Ti in my PC case, I still do it. 30fps holds no fears if it unlocks a higher detail level, and 25fps might not be too bad... it's good enough for the movies. I keep a frame counter visible at the top of my screen to check things aren't plunging too low. And I'm very appreciative of games such as Gears 5 that manage settings for you, where plugging in a resolution and a maximum frame rate is all that's needed to get a bright, sharp 4K picture that doesn't judder too much. If it ever hits 60 I'm ecstatic, but it's much more likely to say 30, with drops into the teens, because when there's an Ultra option, I can never bring myself to choose 'just' High.