When we tested the Destiny 2 beta back in August, we found a game that delivered on its promise of high framerates and resolutions for PC gamers. In the ensuing months, Bungie, AMD, and Nvidia worked to further tune performance, though most of the gains were relatively small. GTX 1080 Ti performance improved by as much as 20 percent, but only at 1080p—1440p framerates were within a few percent of our beta results, and 4k performance was basically tied. GTX 1070 and GTX 1060 3GB results, in contrast, were mostly unchanged, regardless of resolution.

Yesterday, Nvidia released its new 388.31 drivers, which are Game Ready for Battlefront II, but Nvidia also made some bold claims about improved Destiny 2 performance. In its own internal testing, Nvidia reports gains of up to 53 percent in some cases, with increases of 30 percent or more on nearly all of the cards tested at both 1440p and 4k. That's an extremely impressive jump in performance, assuming it's actually true and not limited to edge cases.

I set about testing Destiny 2 using the same benchmark sequence I used in the beta. It's the very first part of the single-player campaign, which appears to be more demanding than many other areas—limited testing in the EDZ, for instance, showed 10-20 percent higher framerates. All the flames and other pyrotechnics of the opening sequence appear to be keeping performance in check. In other words, this sequence gives a good baseline for performance improvements, though other parts of the game may show larger or smaller gains.


By the numbers, GTX 1080 Ti performance improves by 35 percent at 1440p and 41 percent at 4k. The GTX 1070 improves by 30 percent at 1440p and 35 percent at 4k. And the GTX 1060 3GB—a card that really has no right to perform this well at higher resolutions—improves by 27 percent at 1440p and 34 percent at 4k. That's enough to almost reach 30 fps at 4k, which is what console gamers typically get. The new drivers also push the 1070 well above 60 fps at 1440p, and the 1080 Ti now easily handles 4k, whereas previously both were a bit off the mark, particularly with regard to minimum fps.

Besides 1440p and 4k, I also ran tests at 1080p medium and 1080p highest. Surprisingly, there's no improvement at 1080p, and in some cases performance is actually worse. I can see the 1080 Ti framerates not changing at 1080p due to CPU limitations (even though I'm using an i7-8700K for testing), but I figured the 1060 3GB would show improvements at 1080p as well. It doesn't, yet at 1440p and 4k it does.

I'm perplexed by the 1080p results, and I've reached out to Nvidia for comment. So far the only word on how the improvements were achieved is "Christmas magic," which perhaps means 1080p was naughty this year and only got coal in its stocking. I'll update this article if additional details or comments are provided.