The heated debate over Nvidia's GameWorks developer program is still at full throttle. Yesterday AMD's Richard Huddy went on the offensive in an interview with PCR. Huddy, AMD's Chief Gaming Scientist, had more than a few things to say about the competition's program.

Four days ago we published an exclusive four-thousand-word report on Nvidia's GameWorks program and many of its intricacies, covering both the Nvidia and AMD sides of the story, which you should definitely read before proceeding.

Going back to the issue of the day, when asked about Nvidia's GameWorks program, Huddy had this to say:

"If it was just that, then people could say: I’ll take my choice and turn it off if I’m with AMD and leave it on if I’m with Nvidia. But I think it’s more negative than that – and I’ll point to two facts here,"

"Number one: Nvidia GameWorks typically damages the performance on Nvidia hardware as well, which is a bit tragic really. It certainly feels like it’s about reducing the performance, even on high-end graphics cards, so that people have to buy something new."

"That’s the consequence of it, whether it’s intended or not - and I guess I can’t read anyone’s minds so I can’t tell you what their intention is. But the consequence of it is it brings PCs to their knees when it’s unnecessary. And if you look at Crysis 2 in particular, you see that they’re tessellating water that’s not visible to millions of triangles every frame, and they’re tessellating blocks of concrete – essentially large rectangular objects – and generating millions of triangles per frame which are useless."

Huddy is referring to the over-tessellation that was discovered by TechReport.com in Crysis 2 a few years ago, where invisible bodies of water and blocks of concrete were needlessly subjected to huge amounts of tessellation that did not contribute to the game's visual quality but negatively impacted performance.

Here's what TechReport's Scott Wasson had to say about it at the time:

"That's right. The tessellated water mesh remains in the scene, apparently ebbing and flowing beneath the land throughout, even though it's not visible. The GPU is doing the work of creating the mesh, despite the fact that the water will be completely occluded by other objects in the final, rendered frame.

That's true here, and we've found that it's also the case in other outdoor areas of the game with a coastline nearby.

Obviously, that's quite a bit of needless GPU geometry processing load. We'd have expected the game engine to include a simple optimization that would set a boundary for the water at or near the coastline, so the GPU isn't doing this tessellation work unnecessarily."
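The boundary optimization Wasson describes can be sketched in a few lines. This is purely illustrative pseudologic, not CryEngine code; all names, the patch representation, and the triangle counts are assumptions made up for the example. The idea is simply to test each water patch against the coastline before submitting it for tessellation, so water hidden under the terrain generates no triangles at all.

```python
# Illustrative sketch (not CryEngine code): cull water patches that lie
# entirely inland of the coastline, so occluded water costs the GPU nothing.

def should_tessellate(patch_bounds, coastline_x):
    """Return True only if the patch overlaps the visible water region.

    patch_bounds: (min_x, max_x) extent of the water patch
    coastline_x:  x coordinate where land begins; water is only visible
                  for x < coastline_x (an assumed, simplified world layout)
    """
    min_x, _max_x = patch_bounds
    # A patch starting past the coastline sits under the terrain, never seen.
    return min_x < coastline_x

def tessellation_workload(patches, coastline_x, tris_per_patch=4096):
    # Count triangles actually generated once hidden patches are culled.
    return sum(tris_per_patch for p in patches if should_tessellate(p, coastline_x))

# Example: four patches, coastline at x = 100; two patches start inland.
patches = [(0, 50), (50, 100), (100, 150), (150, 200)]
print(tessellation_workload(patches, coastline_x=100))  # 8192 instead of 16384
```

With the check in place, half the patches in this toy scene are skipped before the GPU ever subdivides them, which is exactly the kind of cheap engine-side optimization Wasson argues was missing.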


"Yes, folks, this is some truly inspiring geometric detail, well beyond what one might expect to see in an object that could easily be constructed from a few hundred polygons. This model may well be the most complex representation of a concrete traffic barrier ever used in any video game, movie, or any other computer graphics-related enterprise.

The question is: Why?

Why did Crytek decide to tessellate the heck out of this object that has no apparent need for it?

Yes, there are some rounded corners that require a little bit of polygon detail, but recall that the DX9 version of the same object without any tessellation at all appears to have the exact same contours. The only difference is those little metal "handles" along the top surface. Yet the flat interior surfaces of this concrete slab, which could be represented with just a handful of large triangles, are instead subdivided into thousands of tiny polygons."
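Wasson's point that flat interior surfaces need only a handful of triangles, while rounded corners deserve more, is the core idea of adaptive tessellation: scale the subdivision factor with surface curvature instead of subdividing everything uniformly. Here's a minimal sketch of that mapping; the function name, the linear curvature-to-factor mapping, and the cap of 64 (Direct3D 11's maximum tessellation factor) are illustrative choices, not code from any real engine.

```python
# Illustrative adaptive tessellation: flat geometry gets no subdivision,
# curved geometry gets progressively more, up to a hardware-style cap.

def tess_factor(curvature, max_factor=64):
    """Map surface curvature (0.0 = perfectly flat) to a tessellation factor.

    Flat faces return 1 (the patch is left as-is), while highly curved
    regions approach max_factor. The linear mapping is an assumption;
    real engines use distance, screen-space error, or displacement-based
    heuristics instead.
    """
    return max(1, min(max_factor, round(curvature * max_factor)))

# The flat interior of a concrete slab stays a handful of triangles...
print(tess_factor(0.0))   # 1
# ...while a rounded corner is subdivided much more finely.
print(tess_factor(0.5))   # 32
```

Under a scheme like this, the barrier's flat faces would cost almost nothing, which is why uniformly tessellating them into thousands of tiny polygons struck reviewers as so wasteful.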

TechReport offered this in their conclusion: "One potential answer is developer laziness or lack of time. We already know the history here, with the delay of the DX11 upgrade and the half-baked nature of the initial PC release of this game. We've heard whispers that pressure from the game's publisher, EA, forced Crytek to release this game before the PC version was truly ready. If true, we could easily see the time and budget left to add PC-exclusive DX11 features after the fact being rather limited.

There is another possible explanation. Let's connect the dots on that one. As you may know, the two major GPU vendors tend to identify the most promising upcoming PC games and partner up with the publishers and developers of those games in various ways, including offering engineering support and striking co-marketing agreements. As a very high-profile title, Crysis 2 has gotten lots of support from Nvidia in various forms. In and of itself, such support is generally a good thing for PC gaming. In fact, we doubt the DX11 patch for this game would even exist without Nvidia's urging. We know for a fact that folks at Nvidia were disappointed about how the initial Crysis 2 release played out, just as many PC gamers were. The trouble comes when, as sometimes happens, the game developer and GPU maker conspire to add a little special sauce to a game in a way that doesn't benefit the larger PC gaming community. There is precedent for this sort of thing in the DX11 era. Both the Unigine Heaven demo and Tom Clancy's HAWX 2 cranked up the polygon counts in questionable ways that seemed to inflate the geometry processing load without providing a proportionate increase in visual quality."

Back to the PCR interview with Huddy:

"Now, bringing down AMD’s performance is pretty dodgy, but when they bring down their own consumers’ performance, then it makes you wonder what they’re up to. Their QA must be appalling if it’s a mistake, and if it’s not a mistake, it makes you wonder what their motivation must be. So I think it’s very unhelpful for the business."

Here Huddy is likely referring to Nvidia's HairWorks effect, as most other GameWorks features aren't actually too demanding to run. Nvidia's HBAO+ ambient occlusion feature for example, while slower on AMD, only costs a frame or two of performance. HairWorks on the other hand - as we've detailed in our previously mentioned exclusive report - is significantly slower on AMD than on Nvidia. We found no obvious reason for this disparity other than the closed nature of the HairWorks code making it inaccessible to AMD and thus quite difficult to optimize.

"If you look at the way the performance metrics come out, it’s damaging to both Nvidia’s consumers and ours, though I guess they choose it because it’s most damaging to ours. That’s my guess."

With regard to performance on Nvidia GPUs, we've seen a pattern emerge in a couple of GameWorks-enabled games, like The Witcher 3 and Project Cars, where older Nvidia GPUs - GTX 700 series and older - show uncharacteristically poor performance. These games performed significantly worse on the older Nvidia cards for no obvious reason other than the inclusion of features from the Nvidia GameWorks library.

"[Nvidia] don’t seem to care what the impact of GameWorks has on games either. If you look through the Metacritic scores of the games that Nvidia works with, they’re often quite damaged by the GameWorks inclusion, or at least the games themselves don’t score as well as you’d hope."

"So I think it’s unhealthy for PC gaming. And I wish they would go back to the way everyone else develops their SDKs – give it a source code, let the games developer work with it as they see fit, and let us take the industry as a whole forward. That would be a better place to play."

I'm going to end this with a quote from my four-thousand-word report about GameWorks, because I think it sums up the debate fairly well.