Early testing for higher end GPUs

We tested a handful of AMD and NVIDIA graphics cards in the brand new Rise of the Tomb Raider, released this week!

UPDATE 2/5/16: Nixxes released a new version of Rise of the Tomb Raider today with some significant changes. I have added another page at the end of this story that looks at results with the new version of the game, a new AMD driver and I've also included some SLI and CrossFire results.

I will fully admit to being jaded by the industry on many occasions. I love my PC games and I love hardware, but it takes a lot for me to get genuinely excited about anything. After hearing game reviewers talk up the newest installment of the Tomb Raider franchise, Rise of the Tomb Raider, since its release on the Xbox One last year, I've been waiting for its PC release to give it a shot with real hardware. As you'll see in the screenshots and video in this story, the game doesn't appear to disappoint.

Rise of the Tomb Raider takes the exploration and "tomb raiding" aspects that made the earliest games in the series successful and applies them to the visual quality and character design brought in with the reboot of the series a couple of years back. The result is a PC game that looks stunning at any resolution, and even more so at 4K, and that pushes your hardware to its limits. For single GPU performance, even the GTX 980 Ti and Fury X struggle to keep their heads above water.

In this short article we'll look at the performance of Rise of the Tomb Raider with a handful of GPUs, leaning towards the high end of the product stack, and offer up my view on whether each hardware vendor is living up to expectations.

Image Quality Settings Discussion

First, let's talk a bit about visuals, image quality settings and the dreaded topic of NVIDIA GameWorks. Unlike the 2013 Tomb Raider title, Rise of the Tomb Raider is part of the NVIDIA "The Way It's Meant To Be Played" program and implements GameWorks to some capacity.

As far as I can tell from published blog posts by NVIDIA, the only feature that RoTR implements from the GameWorks library is HBAO+. Here is how NVIDIA describes the feature:

NVIDIA HBAO+ adds realistic Ambient Occlusion shadowing around objects and surfaces, with higher visual fidelity compared to previous real-time AO techniques. HBAO+ adds to the shadows, which adds definition to items in a scene, dramatically enhancing the image quality. HBAO+ is a super-efficient method of modeling occlusion shadows, and the performance hit is negligible when compared to other Ambient Occlusion implementations.

The in-game settings allow for options of Off, On and HBAO+ on all hardware. To be quite frank, any kind of ambient occlusion is hard to detect in a game while in motion, though the differences in still images are more noticeable. RoTR is perhaps the BEST implementation of AO that I have seen in a shipping game, and thanks to the large, open, variably lit environments it takes place in, it seems to be a poster child for the lighting technology.

That being said, in our testing for this story I set Ambient Occlusion to "On" rather than HBAO+. Why? Mainly to help dispel the idea that the performance of AMD GPUs is being hindered by the NVIDIA GameWorks software platform. I'm sure this won't silence all of the conspiracy theorists, but hopefully it will help.

Other than that, we went with the Very High quality preset, which turns out to be very strenuous on graphics hardware. If you don't have a GTX 980 or R9 390 GPU (or better), chances are good you'll have to step down from that preset, even at 2560×1440 or 1920×1080, to get playable and consistent frame times. Our graphs on the following pages will demonstrate that point.

Testing Setup

For this short sample of performance we are comparing six different graphics cards from AMD and NVIDIA, paired at matching price points.

$650: NVIDIA GeForce GTX 980 Ti 6GB vs. AMD Radeon R9 Fury X 4GB
$500: NVIDIA GeForce GTX 980 4GB vs. AMD Radeon R9 Nano 4GB
$350: NVIDIA GeForce GTX 970 4GB vs. AMD Radeon R9 390 8GB



I tested in an early part of the Syria campaign at both 2560×1440 and 3840×2160 resolutions, both of which were hard on even the most expensive cards in the comparison. Will the 6GB vs. 4GB frame buffer gap help the GTX 980 Ti in any particular areas? How will the R9 390 with 8GB of memory compare to the GTX 970, whose 4GB memory configuration has long been under attack?

This also marks the first use of our updated GPU testbed hardware, seen in the photo above.