Last week, we covered Ashes of the Singularity and how the game’s DirectX 12 performance has evolved between AMD and Nvidia. This week, Microsoft launched the PC version of Gears of War Ultimate Edition, and the two titles’ DX12 debuts couldn’t be more different. The new Gears of War is catastrophically broken on Radeon cards.

Jason Evangelho of Forbes has details on the exceedingly strange performance results, of which there are many. The Radeon R9 Fury X is incapable of holding a steady frame rate, with multiple multi-second pauses throughout the benchmark run. The same problem struck the 4GB R9 380 and the 4GB R9 Nano as well. Meanwhile, an R7 370 — a midrange card based on a four-year-old graphics architecture, which also ships with 4GB of RAM — runs just fine.

Here’s the R9 Nano running in 4K at High Quality.

The Forbes tests show two trends. First, GCN 1.0 cards perform smoothly, while GCN 1.1 and 1.2 cards stutter and struggle. Second, AMD GPUs with more than 4GB of RAM show marked improvement. I spoke to Jason about his results; he indicated this is not the case on Nvidia hardware, where 4GB of RAM gives the GTX 980 all the headroom it needs at settings and resolutions that cripple AMD. Last year, we surveyed 15 titles to determine whether gamers needed more than 4GB of VRAM to play in 4K, and found they did not. The fact that AMD is hammered even at 1440p and High detail suggests that memory management in Gears of War Ultimate Edition is fundamentally broken as far as AMD GPUs are concerned.

One of the historical differences between AMD and Nvidia has been their Day 1 driver support. AMD has put a great deal of work into closing that gap in recent years, but Nvidia is still widely perceived to have an edge when it comes to launch-day optimizations.

In this case, however, the problems go far beyond performance profiling. The game isn’t slower on AMD — it’s unplayable on many AMD GPUs. Hawaii / GCN 1.1 is now more than two years old, Tonga is 18 months old, and Fiji has been in-market for nine months. None of these are new products.

Developer or driver?

There are several reasons to suspect this is a developer issue rather than a driver problem. First, DirectX 12 is designed to give developers far more control over how a game is rendered. This can be a double-edged sword. DX12 allows for better resource allocation, multi-threaded command buffers, asynchronous compute, and better performance tuning — but it also makes it harder for the IHV (that’s AMD or Nvidia) to optimize in-driver. There are optimizations that AMD and Nvidia could perform under DX11 that simply can’t be done in DX12.

Unlike Ashes of the Singularity or Fable Legends, Gears of War Ultimate Edition was never designed to be a DX12 — or even a DX11 — title. When Digital Foundry reviewed the game last August, it noted:

While Gears of War 4 is in development using Unreal Engine 4, Gears Ultimate instead opts for more familiar ground – the original 2006 source code. From the beginning, the Ultimate Edition was designed to capture the original experience as accurately as possible while updating its presentation for the current generation. More recent versions of Unreal Engine 3, and even UE4, were considered early in development, but the decision to stick with the original codebase was made in order to preserve the original simulation. (emphasis added)

Gears of War Ultimate Edition isn’t a new implementation of a classic game; it’s built on the same source code and engine as its 10-year-old predecessor. That means everything The Coalition did to bring the game into the modern age, like adding 4K support and higher-quality textures, was done with a version of the Unreal Engine that was barely out of diapers. Not even the latest version of UE3 supports DX12 — but Microsoft decided to stuff it into a decade-old title and shove it into the Windows Store. However the engine was hacked to implement DirectX 12, there’s no way the 2006-era Unreal Engine could ever be considered a good candidate for the process.

As this chart from our demystifying DirectX 12 article shows, there are differences between GCN 1.0 and 1.1’s support for specific DX12 features. If we had to guess, we’d say that the performance differences, and the massive amount of GPU memory AMD hardware needs for smooth performance, are related to one of these variations.

Shades of Arkham Knight

The game runs so poorly on mid-to-high-end AMD hardware that it’s hard to believe The Coalition did any testing on AMD GPUs at all. It’s reminiscent of the Batman: Arkham Knight debacle. Granted, performance on Nvidia hardware seems largely unaffected in this case, but AMD cards still account for roughly 25% of the GPU install base according to Steam, and AMD has been far more aggressive than Nvidia when it comes to talking up and marketing DirectX 12. From a marketing standpoint, if nothing else, it’s been AMD taking the lead on low-overhead APIs, stretching back to Mantle’s debut.

Microsoft has made an official statement on the problem, telling Forbes:

“We are working closely with AMD to address a few issues that users of some AMD Radeon Hardware are experiencing while playing Gears of War: Ultimate Edition for Windows 10 and expect that they will be addressed quickly in an upcoming update.”

When we reviewed Ashes of the Singularity, we recommended that readers wait for additional data points on DirectX 12 before deciding which vendor held a performance advantage. Given this debacle of a debut, I’m doubling down on that. In its current state, this game is far too broken to serve as a performance test between AMD and Nvidia. Given the severity of the issues, I’m not even sure the Nvidia results should be considered representative.

As if everything we’ve already noted wasn’t enough, the game is a Windows Store title — which means it inherits all the limitations of that distribution method.

If watching the fallout from the Super Tuesday primaries isn’t masochistic enough for you and you plan to install this title, we recommend consulting the official forums for technical support. The developers recommend disabling ambient occlusion altogether on AMD cards, and state that G-Sync causes “significant performance issues.” According to Nvidia, G-Sync works perfectly well in-game, but cutscenes may not render properly with G-Sync enabled; an upcoming driver from Team Green should resolve that issue.