The speed of innovation in the graphics technology space shows few signs of slowing down, and the recent releases of Nvidia's top-tier GTX 1080 and Titan X Pascal are highly significant - both of these cards are capable of handing in good performance at 4K resolution. However, there's still some uncertainty about just how powerful the new Titan X actually is, and whether a locked 60fps is genuinely achievable. We're into uncharted territory here: benchmarks only go so far and are highly limited in nature. What's the actual experience like?

First up, a bit of background. The next-gen consoles are seemingly targeting 4K displays, for a few reasons. Firstly, there's no denying that 4K UHD TVs are gaining momentum in terms of sales and dropping hard in price-point. Secondly, GPU technology is scaling, while CPU performance is remaining mostly static - for the consoles, this translates into a continuation of the status quo in terms of frame-rates. And finally, both Microsoft and Sony don't want to leave their existing userbase behind, so they are seeking to use image quality and resolution as the selling factor for their new hardware - not performance.

And that's not exactly what the core PC gamer wants. They want better visuals than console and higher performance. That's exactly the premise on which we put together our 1080p60 gaming PC, pairing a Core i5 6600K with a GTX 1060. But the real question is - can we get a similar experience at 4K using PC hardware? Is the Titan X Pascal powerful enough to deliver both the image quality and the resolution boost required?

A hands-on playtest with the Titan X Pascal on a bunch of demanding titles. This is the closest yet to a compromise-free 4K PC gaming experience.

Now, of course, Nvidia's latest and greatest is supremely expensive at $1200. But this level of performance won't always be so prohibitively costly. There's a certain element of speculation here, but expect a GTX 1080 Ti with at least a third of the price lopped off within six months. And in a world where the GTX 1070 at $380 offers slightly improved performance compared to the last-gen Titan X, which originally sold for $1000, it's safe to assume that patience will reap rewards. And those rewards are significant.

Star Wars Battlefront locks to 4K at 60fps with headroom to spare at ultra settings on a Titan X Pascal system. The Witcher 3 - sans HairWorks of course - almost follows suit, dropping a few frames when heavy alpha effects are in play (easily solved, if it bothers you, by dropping back foliage proliferation by 'one notch'). Grand Theft Auto 5 is also a silky smooth experience with all of its standard settings ratcheted up, bar MSAA and grass quality (again, dropping the latter to the next preset down solved performance issues). Overwatch? No trouble whatsoever. In fact, a 4K60 lock is possible on GTX 1080.

Thus far, the experience - and the kinds of compromises required to achieve a locked 60fps - are similar to the settings tweakery faced by GTX 970 owners running at 1080p. And that's a significant comparison. Nvidia's last-gen card is generally considered to be a superb GPU for full HD action - the most popular graphics product for running at the most popular gaming resolution. To achieve something approaching equivalence at 4K is a genuine milestone.

Nvidia's GTX 970 is widely acknowledged to be a great 1080p card, so what happens if we compare its 1080p performance with Titan X Pascal's 4K frame-rates? It's an eye-opener.

| Game and settings | GTX 970 1080p | Titan X Pascal 4K | GTX 1080 4K |
| --- | --- | --- | --- |
| Assassin's Creed Unity, Ultra High, FXAA | 51.3 | 43.1 | 32.5 |
| Ashes of the Singularity, Extreme, 0x MSAA, DX12 | 40.5 | 63.7 | 53.6 |
| Crysis 3, Very High, SMAA T2x | 72.5 | 50.0 | 40.1 |
| The Division, Ultra, SMAA | 50.2 | 49.6 | 37.3 |
| Far Cry Primal, Ultra, SMAA | 56.2 | 54.7 | 42.4 |
| Hitman, Ultra, SMAA, DX12 | 59.0 | 62.1 | 46.7 |
| Rise of the Tomb Raider, Very High, High Textures, SMAA, DX12 | 69.7 | 57.1 | 44.8 |
| The Witcher 3, Ultra, Post AA, No HairWorks | 60.7 | 63.1 | 47.5 |
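As a quick illustration of how close the two experiences are, a few lines of Python (purely a sketch, using the averages from the table above) compute the percentage difference between Titan X Pascal at 4K and GTX 970 at 1080p for each title:

```python
# Average frame-rates from the benchmark table: (GTX 970 at 1080p, Titan X Pascal at 4K)
benchmarks = {
    "Assassin's Creed Unity": (51.3, 43.1),
    "Ashes of the Singularity": (40.5, 63.7),
    "Crysis 3": (72.5, 50.0),
    "The Division": (50.2, 49.6),
    "Far Cry Primal": (56.2, 54.7),
    "Hitman": (59.0, 62.1),
    "Rise of the Tomb Raider": (69.7, 57.1),
    "The Witcher 3": (60.7, 63.1),
}

for title, (gtx970, titan_x) in benchmarks.items():
    # Positive means Titan X Pascal at 4K is ahead of GTX 970 at 1080p
    delta = (titan_x - gtx970) / gtx970 * 100
    print(f"{title}: {delta:+.1f}%")
```

Run against the table, the deltas mostly land within a few per cent either way, with Ashes of the Singularity and Crysis 3 as the outliers in each direction - which is exactly the 'ballpark equivalence' the article describes.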

There are issues, however. Rise of the Tomb Raider piles on the effects work and while much of the gameplay remains at our target frame-rate, sustaining it completely is extremely challenging. We even tried it with GTX 1080 SLI and encountered similar issues: 50 to 60fps is the best we can do, dropping down into the 40fps area for cut-scenes, where only disabling Lara's 'PureHair' improves matters. Similarly, Deus Ex: Mankind Divided gameplay at high settings (essentially equivalent to the console experience) stutters between 48-60fps, generally feeling rather uneven. Scaling up resolution clearly introduces challenges on certain titles, but by and large, we have reached the point where 4K on Titan X Pascal is indeed equivalent to that ballpark GTX 970 experience.

Eager to see if my perceptions matched up with reality, I broke out the FCAT data. As the benchmarks rendered out, the results proved fascinating. Titan X is faster in Ashes of the Singularity and The Witcher 3, while average frame-rates are broadly equivalent in The Division, Hitman and Far Cry Primal. However, watching the benchmarks play out, it's interesting to note that the differential shifts on a per-scene basis - sometimes GTX 970 is faster, sometimes it's Titan X Pascal at 4K. The vast resolution delta incurs a very different load at any given point, a marked departure from the usual 'parallel lines' we tend to see in benchmark data taken at the same resolution on different cards. And even in the title where GTX 970 is far ahead - Crysis 3 - a very good 60fps experience is still possible on our 4K set-up.

The bottom line is that the arrival of GTX 1080 and Titan X Pascal is significant in that, for the first time, 4K gaming is something I actually want - the experience comes across as a substantial upgrade over 1080p. There are two reasons for this. First of all, both GPUs are capable of providing a worthwhile experience at UHD, without being compromised to the point where the experience actually suffers - either in terms of frame-rate or quality settings. Titan X irons out performance issues and allows more titles to hit higher frame-rates, but GTX 1080 is still in the game.

Our first 4K hands-on demo with GTX 1080 - it demonstrated that we'd finally found a GPU that could support 4K gaming without too many compromises.

Secondly, while I still question the point of 4K resolution on a traditional 24 or 27-inch PC monitor, I've been using a 40-inch Samsung 4K UHD TV recently as a monitor replacement. To be frank, it's been a game-changer - pixel density is still immense, but I finally feel that I've got the extra size that makes the most of the resolution. In terms of physical dimensions, the Samsung UHD screen is slightly wider than a 1440p ultra-wide display like the Acer Predator X34, but obviously it also has a lot more vertical real estate.

Rise of the Tomb Raider and Deus Ex: Mankind Divided demonstrate that even the Titan X Pascal finds its limits on some titles, but even here there is a workaround: UHD TVs are designed to work at both 50Hz and 60Hz (to accommodate content from different territories - we're still affected by the PAL legacy in Europe). Switch to the slower refresh and your GPU requirement for a locked, consistent, high frame-rate drops significantly. Frame-rate drops too of course, but gaming at a locked 50fps is still beautifully smooth - the consistency is there. At the end of the day, the difference between 50fps and 60fps is a mere 3.3ms per frame - 16.7ms vs 20ms - and I can live with that if it eliminates stuttering performance. The downside is that stutter, should you dip below 50fps, feels markedly worse (v-sync stutter is 40ms vs 33.3ms), but the solution here is to mitigate it as much as possible by using Nvidia's adaptive v-sync control panel option.
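The frame-time arithmetic behind the 50Hz workaround can be verified in a few lines of Python (a simple sketch of the figures quoted above):

```python
def frame_time_ms(fps):
    """Time budget per frame in milliseconds at a given frame-rate."""
    return 1000.0 / fps

# A locked 50fps costs only 3.3ms per frame over a locked 60fps
print(f"60fps frame time: {frame_time_ms(60):.1f}ms")  # 16.7ms
print(f"50fps frame time: {frame_time_ms(50):.1f}ms")  # 20.0ms
print(f"difference: {frame_time_ms(50) - frame_time_ms(60):.1f}ms")  # 3.3ms

# With v-sync engaged, a missed refresh doubles the frame time, which is
# why a stutter at 50Hz (40ms) reads as worse than one at 60Hz (33.3ms)
print(f"v-sync stutter at 60Hz: {2 * frame_time_ms(60):.1f}ms")  # 33.3ms
print(f"v-sync stutter at 50Hz: {2 * frame_time_ms(50):.1f}ms")  # 40.0ms
```

The doubling in the second half is the standard v-sync behaviour: a frame that misses its refresh window is held until the next one, so the displayed frame time jumps to two full refresh intervals.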

As for the experience itself? Well, it's extremely impressive. Game-changingly so, in fact. The detail resolved is immense, and obviously in the desktop environment, view distance isn't really an issue - you're getting your bang for the buck. New games look beautiful, but even older titles produce some great results - it's been fascinating to replay titles like Crysis 3 and the Tomb Raider reboot in order to sample a 'UHD remaster'. Select older titles feel genuinely refreshed. The effectiveness of extreme resolution may well change for the negative in the living room (though DF's John Linneman is reporting superb results from his brand new 55-inch OLED screen) and it may well be further impacted by the upcoming console experiences, where upscaling along with other necessary compromises get factored into the equation. But in the here and now, I've finally found a 4K set-up that works for me - and it's stunning.