NVIDIA has sneakily enabled DirectX 12 on Fermi-based graphics cards, which include the GeForce 400 and GeForce 500 series, with its latest GeForce 384.76 drivers. I was going to write a simple news story, but we had an old Zotac GTX 580 lying around, and since that is the flagship of the Fermi lineup, I thought it would be a good idea to take it out for a spin using NVIDIA's new DX12-capable drivers and write something a little more fun.

NVIDIA GTX 580 DirectX 12 Gaming Performance Benchmarked Under GeForce 384.76 Drivers

I have vocally protested NVIDIA's failure to make good on the promise of bringing DX12 to the Fermi GPUs many times in the past, so kudos to them for finally taking the plunge. Of course, DX12 capability and DX12 optimization are very different things. While Direct3D 12 does indeed work on Fermi GPUs from 384.76 onwards, it does not appear to be optimized - and I doubt NVIDIA will waste resources optimizing DX12 for such legacy hardware. Without any further ado, let's get to the benchmarking.

The test system that I used was a Sandy Bridge one with the following specifications:

CPU: Core i7 2600K (4.2 GHz on all cores)

Motherboard: ASRock Z77 Extreme6/TB4

RAM: 8 GB G.Skill RipJaws 1866 MHz DDR3

SSD: Samsung EVO 512 GB

GPU: Zotac GTX 580 1.5 GB (850 MHz / 1002 MHz)

Drivers: GeForce 384.76

3DMark Time Spy (DX12) and Fire Strike (DX11)

The first thing I did was run 3DMark Time Spy. This is going to be one of the most demanding DX12 applications you can run on an old card, primarily because of the heavy memory usage involved. The meager 1.5 GB of VRAM that the GTX 580 possesses will almost certainly bottleneck the results.

The GTX 580 managed to score 786 points in the Time Spy benchmark (for reference, my ASUS STRIX GTX 1070 scores around 6,113 points). Needless to say, this is sub-par DX12 performance, but something that was completely expected from such an old card. I also ran the Fire Strike benchmark, and here the oldie fared much better. It scored 4,806 points (my GTX 1070 scores around 18,478), which is pretty decent DX11 performance considering the age of the card. So basically, in DX12 synthetics the 580 got approximately 1/8th of the GTX 1070's performance, while in DX11 synthetics it was able to achieve 1/4th. In other words, DX12 performance is, as expected, not optimized on the Fermi cards - not to mention the huge bottleneck the small VRAM presents.
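If you want to sanity-check the scaling math, here is a quick back-of-the-envelope calculation in Python. The scores are the ones from this test; the rounding to "1/8th" and "1/4th" is approximate:

```python
# 3DMark scores measured in this test (GTX 580 on GeForce 384.76
# vs. a reference ASUS STRIX GTX 1070).
time_spy = {"GTX 580": 786, "GTX 1070": 6113}       # DX12 synthetic
fire_strike = {"GTX 580": 4806, "GTX 1070": 18478}  # DX11 synthetic

# Fraction of the GTX 1070's score that the GTX 580 achieves in each API.
dx12_ratio = time_spy["GTX 580"] / time_spy["GTX 1070"]
dx11_ratio = fire_strike["GTX 580"] / fire_strike["GTX 1070"]

print(f"DX12 (Time Spy):    {dx12_ratio:.3f} of a GTX 1070 (~1/{1 / dx12_ratio:.0f})")
print(f"DX11 (Fire Strike): {dx11_ratio:.3f} of a GTX 1070 (~1/{1 / dx11_ratio:.0f})")
```

The exact ratios come out to about 0.129 (roughly 1/8) under DX12 and about 0.260 (roughly 1/4) under DX11, which is where the gap between "working" and "optimized" DX12 support shows up.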

DirectX 12 vs DirectX 11 Performance Comparison

The only DX12 game I had installed on this test bench was Battlefield 1, so I booted it up and got it running at the 1080p High preset. The GTX 580 put in a pretty impressive showing, easily achieving a playable frame rate of around 38 FPS, with the minimum and maximum being 35 FPS and 43 FPS respectively. In this DX11 mode, Battlefield 1 looked absolutely gorgeous.

After turning DX12 on, the frame rate took an instant hit with no major quality improvement. The minimum FPS dropped to 25 and the maximum was around 34 FPS. The GTX 580 managed to maintain an average of around 28 FPS, which is barely playable and just above the mark where stuttering becomes visible. Gamers who are used to playing at very high frame rates can drop the quality setting to Medium to make the game much more playable. In other words, the graphics card can sustain the minimum 1080p/30 FPS standard in DX12 AAA games, provided your quality settings are in the medium range.
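For readers who think in frame times rather than frame rates, the DX11-to-DX12 drop is easier to appreciate in milliseconds. A minimal sketch (the FPS figures are the ones from this Battlefield 1 test; the conversion assumes perfectly even frame pacing, which real games never achieve):

```python
def frame_time_ms(fps: float) -> float:
    """Average time budget per frame at a given frame rate."""
    return 1000.0 / fps

# Battlefield 1 at 1080p High on the GTX 580, as measured above.
for label, fps in [("DX11 average", 38), ("DX12 average", 28), ("DX12 minimum", 25)]:
    print(f"{label}: {fps} FPS -> {frame_time_ms(fps):.1f} ms per frame")
```

The 30 FPS standard corresponds to a 33.3 ms budget per frame; at 28 FPS average the card needs about 35.7 ms, which is why the DX12 result sits just on the wrong side of that line.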

GTX 580, AAA Titles Performance (DX11), GeForce 384.76 - The fallen king holds its own in 1080p gaming

The DirectX 12 portion of the review was technically over with that test, but I couldn't help loading up some more AAA titles for the sake of completeness, and I got some very interesting results. It would appear that even though seven years have passed since its release, the GTX 580 can still hold its own in 1080p gaming - it was, after all, the flagship GPU of its time.

The GTX 580 reaches very playable frame rates at the High preset in most AAA titles at 1080p. In fact, the only game in which I had to drop the quality preset to Medium was Mass Effect: Andromeda. In Mirror's Edge Catalyst, at the High preset, the GTX 580 sustains an average of 40 FPS, with the minimum and maximum at 36 FPS and 46 FPS respectively. In Mass Effect: Andromeda, at the Medium preset, the card sustains an average of 42 FPS, with the minimum and maximum at 35 and 49 respectively. I have already covered Battlefield 1, where the GTX 580 sustains an average of 38 FPS, with the minimum and maximum at 35 and 43 respectively.

As a bonus, I have also included screenshots of how the games look at these settings: