Advanced Micro Devices said that characteristics of its Graphics Core Next (GCN) architecture make it particularly efficient at rendering virtual reality games. Specifically, the asynchronous compute engines (ACEs) in AMD Radeon graphics processing units provide capabilities for low-latency VR rendering.

“It has been something of a well-kept secret that the GCN Architecture is a low-latency powerhouse, which is ideal for VR,” said Robert Hallock, global technical marketing lead at AMD, in an interview with Wccftech. “Contemporary AMD Radeon hardware already supports low-latency VR rendering through ‘asynchronous timewarp.’ Asynchronous timewarp is a technique that can be exposed in AMD Radeon hardware via the Asynchronous Compute Engines (ACE), which can schedule and execute compute and display operations independently of the graphics queue.”
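The idea behind asynchronous timewarp can be illustrated with a simplified model (this is a conceptual sketch in Python, not AMD's actual GPU implementation): one thread stands in for the graphics queue, rendering frames at its own pace, while an independent thread stands in for a compute queue that reprojects the most recently completed frame at every display refresh, so the headset never waits on a frame still in flight.

```python
import threading
import time

# Hypothetical simplified model: a "graphics queue" thread renders frames
# slowly, while an independent "timewarp" thread picks up the newest
# completed frame at each display refresh, mirroring how an ACE can
# schedule work independently of the graphics queue.

latest_frame = {"id": 0}   # last fully rendered frame (0 = nothing yet)
lock = threading.Lock()
displayed = []             # frame id shown at each display refresh

def graphics_queue(num_frames, render_time):
    """Render frames sequentially; each takes render_time seconds."""
    for i in range(1, num_frames + 1):
        time.sleep(render_time)           # simulate rendering cost
        with lock:
            latest_frame["id"] = i

def timewarp_queue(refreshes, refresh_interval):
    """At every refresh, reproject whatever frame is newest."""
    for _ in range(refreshes):
        time.sleep(refresh_interval)      # simulate display refresh
        with lock:
            displayed.append(latest_frame["id"])

gfx = threading.Thread(target=graphics_queue, args=(3, 0.05))
warp = threading.Thread(target=timewarp_queue, args=(10, 0.01))
gfx.start(); warp.start()
gfx.join(); warp.join()

# The display never stalls: every refresh shows some frame, repeating
# the previous one while a new frame is still being rendered.
print(displayed)
```

Because the two threads are decoupled, the displayed sequence simply repeats the last finished frame while rendering lags, which is the low-latency property the quote describes.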

Modern AMD Radeon graphics processors – Hawaii (Radeon R9 290) and Tonga (Radeon R9 285) – feature eight ACE units, whereas AMD’s previous-generation Tahiti GPU (Radeon R9 280) sports only two, so those planning to use their graphics cards for VR gaming should eventually prefer the latest Radeon products.

“The ACE is a fundamental architectural building block of AMD Radeon GPUs utilizing the Graphics Core Next architecture,” said Mr. Hallock. “We have also been demonstrating Alien: Isolation, an AMD Gaming Evolved title, on the Oculus Rift DK2 at select shows and events. It runs like a dream, and people have loved it!”

In theory, AMD Radeon hardware may hold a serious advantage over competing GeForce graphics cards. Still, it remains to be seen whether these architectural features translate into actual performance benefits in real games when compared to Nvidia GeForce hardware.

KitGuru Says: It will be extremely interesting to see the real-life advantages of AMD’s hardware versus the competition. While AMD’s latest hardware may be great, game developers unfortunately do not take advantage of all of a GPU’s architectural features…