It's still Call of Duty, but how fine it looks. On all formats, Advanced Warfare brings the generational leap we wished we had seen in Ghosts - one that benefits not only the PS4 and Xbox One versions but also the maxed-out PC release. Post-processing ingenuity, a rebuilt lighting model, plus the use of best-in-class motion-capture tech make this the most photo-realistic entry yet. But while each version has its advantages, exactly what improvements does the PC release bring to the table, and what hardware does it take to run it?

First off, let's address some unfinished business on console; namely, the resolution issue. For campaign only, the majority of the gun-toting action runs at 1360x1080 on Xbox One, while PS4 charges in with a full 1920x1080. However, courtesy of a dynamic frame-buffer, specific areas are designed to render at a full, true 1080p on Xbox One - given the headroom. For example, the interior of a besieged Atlas control room in the Fission stage runs at the lower resolution, while the bus-hopping set-piece closing the Nigeria level runs at the full value.

The transition itself is never obvious. But after sampling as many static shots as we could find, we've yet to encounter any horizontal pixel counts between 1360 and 1920. Advanced Warfare may briefly flick past intermediate numbers, but in practice the game is mostly rendered at one of these two resolutions, the higher mode kicking in when it can be afforded. It's unlike the dynamic model seen in the likes of Rage or Wipeout HD, where pixel counts scale across a range of middle values based on the on-screen action.
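To illustrate the distinction, here's a minimal sketch of how a two-state dynamic frame-buffer like this might be driven, assuming a simple GPU-time heuristic against a 60fps budget. The structure and names are ours for illustration, not Sledgehammer's.

```cpp
#include <cstdint>

// Hypothetical two-state dynamic frame-buffer: the renderer picks one of
// two fixed resolutions per area, rather than scaling smoothly across a
// range as Rage or Wipeout HD do. All names here are ours for illustration.
struct RenderResolution { uint32_t width, height; };

constexpr RenderResolution kReduced = {1360, 1080};  // Xbox One fallback
constexpr RenderResolution kFull    = {1920, 1080};  // full 1080p

// gpuMsAtReduced: measured GPU frame time at the reduced resolution.
// budgetMs: ~16.67ms for a 60fps target. A shipping engine would also apply
// hysteresis so the mode doesn't flicker between the two states.
RenderResolution ChooseResolution(double gpuMsAtReduced, double budgetMs = 16.67) {
    // Estimate the cost of the full-resolution frame from the pixel-count
    // ratio, and only step up if the estimate still fits the frame budget.
    const double pixelRatio = double(kFull.width) * kFull.height
                            / (double(kReduced.width) * kReduced.height);
    return (gpuMsAtReduced * pixelRatio <= budgetMs) ? kFull : kReduced;
}
```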

For Xbox One's multiplayer, the scenario is different. The 1360x1080 resolution is a fixture in this mode, with the 1920x1080 boost simply never kicking in as it does in campaign - even on small maps. Having tested all 13 available stages, the result is always the same: Xbox One sports a blurrier presentation than PS4's, particularly across distant textures and transparency effects. Added to that, the cut-down post-processing in multiplayer, such as reduced motion blur, makes the gap in console resolutions easier to catch in competitive modes.

The PS4 and Xbox One versions of Call of Duty: Advanced Warfare put side by side. Below are matching comparisons with the PC release at maximum settings.


Through our like-for-like captures of PC, PS4 and Xbox One, we're able to see how image quality holds up in campaign mode across all three. The PC release is replete with modes to tackle aliasing, including FXAA, a SMAA setting that cranks up to T2X, and a 'filmic' variant of SMAA that changes its effective threshold when in motion. As a costly alternative, we have the option of super-sampling too, where much higher resolutions are downscaled to the display output - an option best reserved for cards with high memory bandwidth. For our tests, we opt for a mixture: 2x SSAA plus SMAA T2X for strong, all-round coverage of each frame, while both PS4 and Xbox One appear to utilise the lesser FXAA post-process.
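For a sense of why super-sampling is reserved for the high end, the sketch below works out the internal render-target sizes and a rough memory footprint. We're assuming the SSAA factor multiplies the total pixel count; if the game scales each axis instead, the figures are even steeper.

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>

// Rough cost model for super-sampling: render internally at a higher
// resolution, then downscale to the 1920x1080 output. We assume the SSAA
// factor multiplies the total pixel count - an assumption, as the game
// doesn't document it.
int main() {
    const uint32_t outW = 1920, outH = 1080;
    const double factors[] = {2.0, 4.0};
    for (double factor : factors) {
        const double axisScale = std::sqrt(factor);
        const uint32_t w = uint32_t(outW * axisScale);
        const uint32_t h = uint32_t(outH * axisScale);
        // Memory for a single RGBA8 colour target at the internal size -
        // and a real frame needs several such buffers, plus depth.
        const double mb = double(w) * h * 4.0 / (1024.0 * 1024.0);
        std::printf("%.0fx SSAA: %ux%u internal, ~%.0fMB per RGBA8 target\n",
                    factor, w, h, mb);
    }
    return 0;
}
```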

With so many post-processing effects on the go, it's a challenge to pick out many visual contrasts. The Xbox One's deficit in pixel count is, to start, far less apparent in campaign as a result of this deluge of effects - unlike the state of affairs in multiplayer. However, on its default brightness setting (matched at 3.3 notches with the PS4 and PC), we notice black crush affecting visibility in dark areas. It can be rectified, to an extent, by cranking the in-game brightness up by four points in Advanced Warfare's menu, but the image begins to wash out if we go any further - a trade-off we sketch out below. Not ideal. In our initial campaign performance analysis, we found an area where PS4 has pared-back shadows - however, this appears to be a one-off, with every other comparison demonstrating parity in this area.
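As a rough model of that trade-off, consider a simple brightness 'lift': each notch raises the black floor, recovering crushed shadow detail while compressing the remaining range toward white. The per-notch step is an assumption on our part; the game's actual response curve isn't documented.

```cpp
// Illustrative only: a 'lift' model of a brightness slider. Each notch
// raises the black floor (recovering crushed shadow detail) while
// compressing the rest of the range toward white - hence the wash-out
// when pushed too far. The 0.02-per-notch step is assumed, not measured.
float ApplyBrightness(float linear01, int notchesAboveDefault) {
    const float lift = notchesAboveDefault * 0.02f;  // assumed step size
    return lift + (1.0f - lift) * linear01;          // raise floor, squeeze range
}
```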

On PC, we're handed a bevy of graphics options to tinker with. At the top, we get a field of view (FOV) selector, affording us a range between 60 and 90 degrees - where consoles are fixed at a strict 65. However, in the interest of framing cut-scenes correctly, campaign mode locks the PC to this console value too, with no official means to broaden it.
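For context on what that slider controls, here's how an FOV value typically feeds a perspective projection - a wider angle fits more of the scene on screen. We're assuming the game's 60-90 range is a horizontal FOV, converted below to the vertical angle a projection matrix is usually built from; Sledgehammer doesn't document this.

```cpp
#include <cmath>
#include <cstdio>

// Standard hor+ conversion from a horizontal FOV to the vertical FOV a
// typical perspective matrix wants. The 60-90 slider range is assumed to
// be horizontal degrees.
double VerticalFovDeg(double horizontalFovDeg, double aspect /* width/height */) {
    const double kPi = 3.14159265358979323846;
    const double halfH = horizontalFovDeg * kPi / 360.0;  // half-angle in radians
    return 2.0 * std::atan(std::tan(halfH) / aspect) * 180.0 / kPi;
}

int main() {
    const double fovs[] = {60.0, 65.0, 90.0};  // 65 is the fixed console value
    for (double fov : fovs)
        std::printf("hFOV %.0f -> vFOV %.1f at 16:9\n",
                    fov, VerticalFovDeg(fov, 16.0 / 9.0));
    return 0;
}
```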

Between PS4, Xbox One and PC, texture detail on the lavish Instinct stage matches the PC's top 'extra' setting. The only let-down on consoles is the use of what is equivalent to the PC's low anisotropic filtering setting. Shadow draw distance is curtailed on consoles too, as seen around the rocky enclosure at centre-back.

And what of image quality? Compared to PC running with 2x SSAA in concert with SMAA T2X, the PS4 holds up well enough with its FXAA. The Xbox One uses the same post-process method as PS4 - but its 1360x1080 native resolution pales next to the full 1080p of the other two.

Shadow filtering on consoles suffers compared to the maxed PC experience, and even culls entire blankets of shade on distant buildings - such as the structure to the right. PS4 and Xbox One have much in common with the PC's normal quality shadow map setting, matching that preset's shadow draw distance.

Subsurface scattering makes it into the console releases, allowing overlapping shaders to operate in tandem to produce an opaque effect on skin. This allows light to interact more realistically, creating a more convincing, pink glow to faces in daylight.

Reflections are a match on all three platforms, except inside Atlas' labs in the campaign. It's a rare spot where these light details appear sharper on PC.

To the naked eye, it's difficult to pinpoint the quality grade of depth of field or motion blur used on consoles. Next to the PC on its maximum settings, these effects appear almost identical in static shots, save for the extra use of DOF on the crane to the back here. Alpha effects are also a very tight match on PS4, Xbox One and PC.

The maxed-out PC setting uses a horizon-based ambient occlusion approach to shading known as HBAO+. In effect, this produces subtler increments of shade around bends in character models, and near background objects. By comparison, the PS4 and Xbox One approach is more heavy-handed, falling closer to the PC's high setting for ambient occlusion.

The lighting model is unchanged, and character geometry is identical between all three releases. And once again, to the top-right we see normal map quality is like-for-like between all three.

Surprisingly, the console versions rank closely next to the PC at max, but miss out on a few visual treats. Looking at the graphics settings, texture and normal map options are present, where the PS4 and Xbox One impressively match the highest 'extra' setting PC has to offer. However, in the interiors of Atlas' labs, specular mapping runs one notch lower on console, producing less defined spotlight reflections across its glossy floors.

Frustratingly, while console texture quality runs at this premium grade, the anisotropic filtering backing it up is set to an equivalent of PC's low setting. As a result, we get an obvious blurring to angled surfaces just a few metres away, where PC's remain crisp far into the distance. Likewise, this weaker filtering has an impact on shadow aliasing a few paces ahead on console.
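For the uninitiated, anisotropic filtering is essentially a dial on each texture sampler: the maximum number of samples taken along a surface's slope, with higher values keeping oblique surfaces sharp further into the distance. The sketch below shows the idea via OpenGL's anisotropic filtering extension - purely illustrative, since Advanced Warfare itself renders through DirectX 11 - and the notion that the consoles sit at roughly 2x is our reading of the comparison shots, not a confirmed figure.

```cpp
// Illustration only: the anisotropic filtering dial, shown here in OpenGL
// form via the EXT_texture_filter_anisotropic extension. The consoles'
// 'low'-equivalent level is assumed to be around 2x; a maxed PC setup
// typically behaves like 16x.
#include <GL/glew.h>

void SetAnisotropy(GLuint texture, float requested) {
    float maxSupported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxSupported);
    glBindTexture(GL_TEXTURE_2D, texture);
    // Clamp to what the hardware reports; higher values keep angled
    // surfaces (floors, roads) crisp much further into the distance.
    const float level = requested < maxSupported ? requested : maxSupported;
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, level);
}
```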

But between Xbox One and PS4, shadow quality is largely indistinguishable. Each uses the same dithered patterning for dynamic shadows on characters and environments; an artefact seen outdoors on PC too. But to the PC's advantage, at maximum settings we have HBAO+, delivering subtler pockets of shade around, for example, the bends in Jack Mitchell's fingers and the cracks in walls. The consoles deliver a cheaper screen-space variation of this same effect, tending to exaggerate depth with thicker plumes of shade.

Our brand new PS4 vs Xbox One frame-rate test shows footage from a bit further into the campaign. Both platforms show drops at synced points here, but as before the PS4 takes the bigger hits - while Xbox One drops to a lesser extent, with a touch of screen-tear in tow.


The PC release also excels in shadow draw distance; shadow maps on buildings render from further away in the Nigeria stage, for example. Otherwise, the console versions get the full deal - up to and including subsurface scattering on characters, and full-resolution alpha. Curiously, effects used for waterfalls, fountains and flames are rendered at 30fps even on the highest PC setting - a tad distracting when playing at the game's intended 60Hz output.
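Our best guess at the mechanism: those particle systems are simply stepped on every other rendered frame, halving their update cost. A minimal sketch of that pattern, with names and structure entirely our own:

```cpp
// How effects can animate at 30fps inside a 60Hz game: the particle
// systems are only stepped on alternate rendered frames. This is a
// sketch of the pattern, not Sledgehammer's actual code.
struct EffectSystem {
    int frameCounter = 0;

    void Update(double dtSeconds) {
        // Advance waterfalls, fountains and flame sprites by dtSeconds.
    }

    void Tick(double dtSeconds) {
        // At 60Hz, two ~16.6ms frames are folded into one ~33.3ms step.
        if (++frameCounter % 2 == 0)
            Update(dtSeconds * 2.0);
    }
};
```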

On the performance front, the PS4 campaign drops as low as 46fps in our newest tests, but typically sticks between the 50-60fps lines when threatened with heavy alpha. From shielding against a drone swarm to riding hover-bikes through a warzone, these peak stress points simply manifest more heavily in PS4's campaign. In matching set-pieces, Microsoft's platform delivers the smoother frame-rate on average, but does so while introducing tearing to the top third of the screen.

The story is different in multiplayer. Here, both PS4 and Xbox One keep a keen grip on the 60fps line. Pushing each version with an 18-player Ground War battle on the demanding Instinct stage, we provoke only minor drops when blasting the EM1 laser rifle on Xbox One - and the only hiccups on PS4 come about during kill-cam replays. The takeaway: performance is better tuned for the multiplayer experience, and frame-rate metrics hold very strong on both Sony and Microsoft hardware.

Our multiplayer frame-rate test on PS4 shows a sturdy 60fps performer, quite unlike the campaign.


But what of PC performance? Right away, we're pleased to see Sledgehammer Games factoring in a range of setups. For example, at the top end, a Core i7 3770K system with 16GB of RAM matched with a £350 GTX 780 Ti is capable of 60fps in both campaign and multiplayer at max settings. This is with 2x SSAA and SMAA T2X enabled, though stepping up to 4x SSAA knocks the read-out down to the 35-45fps range in campaign cut-scenes. In this case, actual gameplay (such as the later encounter with a drone swarm) tends to run between 45 and 60fps.

However, top-end GPUs do seem to spend most of their resources rendering at extreme resolutions, as opposed to generating additional effects. At a straight 1080p, you can get great results even with low-end hardware. For example, the much cheaper GTX 750 Ti - at present costing around £100 or less - capably runs the opening Seoul stage at 50-60fps with max settings engaged, even when paired with a relatively modest Core i3 processor. That said, the taxing 2x SSAA is exchanged for SMAA T2X to tackle the rough edges, but otherwise the settings are a match. That's phenomenal performance for entry-level enthusiast GPU technology, while we find that the £150 GTX 760 and the £130 Radeon R9 280 can comfortably keep you at a maxed 1080p during gameplay, with just the odd wobble in cut-scenes. However, using an AMD card appears to incur an additional CPU load you simply don't get when gaming with an Nvidia equivalent.

You can see that below as we run a Core i3 4130 at stock speeds compared with a Core i7 3770K overclocked to 4.3GHz. There's little difference in the performance results posted by Nvidia's GTX 760, while the Radeon R9 280 - a more capable card overall - has a firm lead when powered by the i7, but collapses significantly in draw-intensive areas when paired with the i3. We're still looking into PC performance, but it hasn't been easy: as much as we love the optimisation that's gone into this version of Advanced Warfare, loading times are horrific (even using an SSD), while adjusting settings causes yet more reloading. It's extremely difficult to fine-tune your presets when you're spending most of the time looking at loading screens.

A Core i3 4130 is a budget £80 CPU that runs a GTX 760 nicely - just a little slower than an overclocked Core i7. However, check out the AMD results - an R9 280 running with the i7 rules the roost, but there are dramatic dips in performance when the same GPU is paired with the i3. We'd recommend a quad-core Intel CPU with enthusiast-level AMD graphics cards.

However, we can report that we see the same issue to different degrees on other AMD GPUs too, though the lower down the stack you go, the less noticeable the impact (as you're hitting GPU, rather than CPU, limitations) - as we saw when testing with an R7 265, essentially an overclocked version of the classic Radeon HD 7850. Generally speaking, though, if you're running anything at the R9 270 level or better, we'd recommend an Intel quad-core processor or the equivalent for best performance with an AMD card, while a Core i3 is potent enough to deal with Nvidia equivalents in this performance range.
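For readers profiling their own setups, a crude way to spot this kind of CPU-side (largely driver-side) bottleneck is to time the processor's per-frame submission work in isolation. The sketch below is generic and assumes nothing about the game's internals; RenderScene stands in for whatever issues the frame's draw calls.

```cpp
#include <chrono>
#include <cstdio>

// If this figure climbs in draw-call-heavy scenes while the GPU sits
// under-occupied, the CPU - not the graphics card - is the ceiling,
// much like the R9 280 + Core i3 pairing above.
using Clock = std::chrono::steady_clock;

void RenderScene() { /* issue draw calls here */ }

void Frame() {
    const auto start = Clock::now();
    RenderScene();
    const double cpuMs =
        std::chrono::duration<double, std::milli>(Clock::now() - start).count();
    if (cpuMs > 16.67)  // over the 60fps budget
        std::printf("CPU-bound frame: %.2fms of submission work\n", cpuMs);
}
```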