4K, or UHD as it's otherwise known, has come a long way since the Asus PQ321 monitor launched at an eye-watering $3,500 (~£2,300) back in 2013. These days, adding a decent 60Hz 3840×2160 monitor to your gaming setup can cost as little as £350 (~$500), provided you can live with a lesser TN panel rather than its mostly superior, but far pricier, IPS and IGZO counterparts.

Aside from the benefits of a larger workspace, or increased sharpness with desktop scaling, the PC—and not the console—is the only place where you can game at native 4K, and at a distance where such a high resolution makes a visible difference.

While you can technically play almost any PC game at 4K, doing so is an enormous strain on resources. Despite huge advances in GPU technology, 4K is still very much the realm of the enthusiast, where £500 (~$700) graphics cards are all but required to play the latest games. That's not to say you can't play in 4K with a mid-range card, but it all depends on the sacrifices you're willing to make to rendering quality and frame rate in exchange for all those extra pixels.
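To put that strain in numbers, a quick back-of-the-envelope sketch (Python, purely illustrative) shows how many pixels the GPU has to fill each frame at 4K versus lower resolutions:

```python
# Pixel counts per frame: 4K pushes exactly four times as many
# pixels as 1080p, so fill-rate and shading costs scale accordingly.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x 1080p)")
```

At four times the pixels of 1080p, a card that comfortably renders a game at 1080p60 has little headroom left at 4K unless settings come down.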

Hitting the magical 4K at 60Hz

For a lot of people, myself included, part of the joy of playing games on a PC is a smooth 60 FPS-or-higher frame rate, something that the Xbox One and PlayStation 4 have famously had a hard time hitting at just 1080p. While budget GPUs like Nvidia's GTX 750 Ti do an admirable job of 1080p60 in many games, even high-end cards like the GTX 980 can struggle with 4K at 60 FPS. For a solid 60 FPS, an SLI or Crossfire setup is the way to go. The folks over at Digital Foundry tested high-end single-card solutions with a range of games at high and ultra settings, with only the $1000 Titan X managing to push an average frame rate of over 60 FPS in some games. Note that's an average frame rate. As a proud owner of a Titan X and a 4K monitor, I've experienced drops down as low as 30 FPS during gameplay.
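Those FPS targets translate directly into per-frame time budgets, which is why averages can be misleading: a single heavy frame that blows past its window shows up as a visible hitch. A quick sketch:

```python
# Per-frame render budgets: at 60 FPS the GPU has roughly 16.7ms
# to finish each frame; a dip to 30 FPS doubles the window but
# halves perceived smoothness.
budgets = {fps: 1000 / fps for fps in (30, 60, 120)}
for fps, ms in budgets.items():
    print(f"{fps} FPS -> {ms:.2f} ms per frame")
```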

Unfortunately for fans of team red, AMD's high-end R9 290X is some way off even that level of performance, which is no surprise given the card is based on the company's rapidly ageing 2013 Hawaii architecture: only in Call of Duty: Advanced Warfare does it manage to push over 60 FPS. Those wanting an AMD card for 4K would be advised to hold out for the company's long overdue architecture refresh, rumoured to arrive later this year in the form of the R9 390 and 390X. The arrival of high-bandwidth memory (HBM) may allow for single-GPU 4K gaming, but we'll have to wait until a new AMD graphics card arrives at the Ars Orbiting HQ for testing.

Stepping up to a Crossfire or SLI setup will net you the required performance for 4K gaming, but there are some trade-offs aside from the higher cost.

More GPUs means more heat to be funnelled out of your PC case, which can affect the performance of other components, particularly if you're using cards that vent air inside the case and your case fans aren't moving enough air through to expel it. You'll also need to make sure your power supply can handle the increased power consumption of multiple GPUs, and that your motherboard and processor supply enough PCIe lanes for the number of cards you're using. That's not really a concern for two or even three-way setups, but for those hell-bent on running four GPUs—despite the diminishing returns on performance—or adding PCIe-hungry M.2 SSDs, note that the latest Nvidia cards require at least an x8 slot in order to work in SLI.

At around £500 (~$620), AMD's R9 295X2 (which combines two R9 290Xs onto a single card) is the cheapest dual-GPU option, and it works wonders in 4K for games that are Crossfire-optimized, although you'll have to make room for its 30.7cm length and 120mm watercooling radiator, as well as adhere to strict power supply requirements for its monster 500W TDP. A pair of GTX 970s will net you similar performance for around £530, and with only a 165W TDP per card. Just bear in mind that the 970's famous VRAM issues do begin to have an effect when gaming at 4K—resulting in larger frame-time variance and thus the dreaded microstutter—thanks to the far larger textures being loaded into memory.
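As a rough way to sanity-check a power supply against a build like this, you can tot up component TDPs and add headroom. The wattages below are illustrative assumptions rather than measured figures, and card makers' own PSU recommendations should take precedence:

```python
# Back-of-the-envelope PSU sizing (all wattages are assumptions).
gpu_tdp = 500    # e.g. an R9 295X2's 500W TDP
cpu_tdp = 95     # a typical quad-core desktop CPU
other = 75       # motherboard, RAM, drives, fans: rough estimate
headroom = 1.3   # ~30% margin so the PSU isn't run flat out

recommended_psu = (gpu_tdp + cpu_tdp + other) * headroom
print(f"Recommended PSU: at least {recommended_psu:.0f}W")
```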

There are some steps you can take to increase performance, though. Turning down texture resolution and expensive effects like HBAO+ will naturally net you a few extra frames. If you're not keen on dropping the image quality, then switching off antialiasing is a good option. Not only is AA one of the most expensive post-processing effects you can use, it's also one of the least noticeable when gaming at 4K, thanks to the intrinsic effect of high pixel density smoothing out jaggies.

If you can live with 30 FPS, then the likes of a single GTX 970 or higher paired with a decent quad-core CPU and fast SSD will do the trick.

What should I look for in a 4K monitor?

While you won't currently find a panel that runs above 60Hz (at least until they start to come equipped with DisplayPort 1.3 or higher), you can take your pick from a range of manufacturers and panel types.

Generally, TN will give you the best grey-to-grey response times in exchange for poorer colour reproduction and viewing angles, while the opposite is true for IPS: great colour reproduction and viewing angles, but slower response times.

Keep an eye on inputs too: until recently, few 4K monitors came equipped with the HDMI 2.0 ports required for 60Hz. This isn't so much of an issue for the PC, where DisplayPort is king, but it's worth thinking about if you'll want to add a 4K streaming box or similar down the line.

Watch out for 4K monitors that are actually two 1920×2160 displays stitched together. This was common in the early days of 4K, when the requisite scalers weren't available. While these monitors work well, their reliance on DisplayPort's Multi-Stream Transport (MST) standard cuts down on the number of monitors you can attach to your PC. They also don't behave well with games that fix menus and UI elements to a particular display, where those elements end up looking squished.

Finally, there's variable refresh rate technology to consider. This matches the refresh rate of the display to the frame rate of your game, eliminating artefacts such as screen tearing when v-sync is turned off, and judder and input lag when it's turned on. Nvidia brands its tech as G-Sync, while AMD's is called FreeSync. Both have their positives and negatives, but on the whole, Nvidia's tech results in an overall smoother experience, thanks to the way it handles lower frame rates. Given how difficult it is to run 4K games at 60 FPS, variable refresh rate technology can provide excellent results, giving you much smoother gameplay when running above 30 FPS but below 60 FPS.
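To see why fixed-refresh v-sync stumbles at in-between frame rates (the exact problem G-Sync and FreeSync solve), consider a game rendering at a steady 45 FPS on a 60Hz panel. A small sketch, using exact fractions to avoid rounding noise:

```python
from fractions import Fraction
import math

# On a fixed 60Hz display with v-sync, a finished frame is held
# until the next refresh boundary. A steady 45 FPS game (one frame
# every 22.2ms) therefore alternates between frames shown for one
# refresh (16.7ms) and two refreshes (33.3ms): visible judder.
refresh = Fraction(1000, 60)     # ms per refresh at 60Hz
frame_time = Fraction(1000, 45)  # ms per rendered frame at 45 FPS

shown_at = [math.ceil(i * frame_time / refresh) * refresh
            for i in range(1, 9)]
intervals = [float(b - a) for a, b in zip(shown_at, shown_at[1:])]
print([round(x, 1) for x in intervals])
# -> [16.7, 16.7, 33.3, 16.7, 16.7, 33.3, 16.7]
```

With variable refresh, each frame would instead be scanned out as soon as it's ready, giving an even 22.2ms cadence.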

At the moment, the only 4K monitors you can buy are G-Sync models, but Samsung is due to release a range of 4K monitors with FreeSync over the coming months.