Almost everything we know about NVIDIA's next-generation GeForce GPUs is based on rumors and speculation. That will change sometime this summer, by August at the latest, as NVIDIA is scheduled to talk about its next-gen GPUs at the Hot Chips conference. In the meantime, the latest chatter is that NVIDIA's upcoming Turing GPUs will support real-time ray tracing at the hardware level, which could elevate gaming in a big way.





For anyone who is not familiar with ray tracing, it's a method for rendering graphics by which images are created by tracing rays or paths of light as they bounce in and around an object (or objects) in a scene. It's sort of the Holy Grail of graphics in gaming (and other applications), as it's capable of creating photorealistic imagery with correctly cast shadows and lighting effects that truly mimic the real world. The problem is that real-time ray tracing is demanding, requiring a level of computation beyond the capabilities of today's hardware.
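To make the idea concrete, here is a minimal, purely illustrative sketch of the core of a ray tracer: firing a single ray into a scene, testing it against a sphere, and shading the hit point based on a light's position. This is a toy example in Python for clarity, not anything resembling NVIDIA's hardware implementation; all names and scene values are made up for the demonstration.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t,
    # the distance along the ray to the sphere's surface.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearest intersection
    return t if t > 0 else None

def shade(t, origin, direction, center, light):
    # Simple Lambertian shading: brightness depends on the angle
    # between the surface normal and the direction to the light.
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [h - c for h, c in zip(hit, center)]
    n_len = math.sqrt(sum(n * n for n in normal))
    normal = [n / n_len for n in normal]
    to_light = [l - h for l, h in zip(light, hit)]
    l_len = math.sqrt(sum(l * l for l in to_light))
    to_light = [l / l_len for l in to_light]
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))

# Cast one ray from the eye straight down the z-axis at a unit
# sphere centered at (0, 0, 3), with a light above the camera.
t = intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 3), 1.0)
brightness = shade(t, (0, 0, 0), (0, 0, 1), (0, 0, 3), (0, 5, 0))
```

A real renderer repeats this per pixel, bounces secondary rays for reflections and shadows, and tests against millions of triangles rather than one sphere, which is why doing it in real time is so computationally expensive.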





Microsoft has figured out an end-around to this by baking ray tracing support into its recently introduced DirectX Raytracing (DXR) API, which is a natural extension of DirectX 12. Rather than replace today's rasterization and compute rendering techniques, it supplements them. Since it's not a full-on replacement, it can run on modern DX12 hardware.

NVIDIA's RTX technology is a "high-performance implementation that will power all ray tracing APIs supported by NVIDIA on Volta and future GPUs," the company announced earlier this year at GDC 2018. The company's Titan V and Quadro GV100 both support RTX, and by extension real-time ray tracing, and supposedly so will NVIDIA's upcoming Turing GPUs.





While obviously nothing is yet official, it makes sense that NVIDIA's next-generation GeForce cards will support ray tracing, because they will be running on a new architecture, perhaps Volta or a variant of Volta, and not Pascal like the company's existing cards. Ray tracing is likely to be the big 'gee-whiz' feature that NVIDIA promotes when it finally introduces its GeForce 1180 or 2080 (or whatever it ends up calling its next graphics card).





Turing is also said to usher in support for HDMI 2.1, as the spec was finalized last year. HDMI 2.1 brings additional features to consumers, including Dynamic HDR, variable refresh rates, and a bunch of resolutions scaling all the way up to 10K at 120Hz.





In addition, rumor has it that NVIDIA's next batch of consumer GPUs will feature a new iteration of its GPU Boost algorithm, or a brand-new GPU clocking mechanism, to enable clockspeeds higher than those of today's cards. Also look for Turing cards to sport GDDR6 memory chips, which likewise run at higher clocks.





The bottom line is that exciting times are ahead, folks.

