Okay, I'll admit, I just like saying "foveated rendering" because it makes me sound smart. But it's a real thing, and it's probably going to play an important role in the next stage of this little VR journey we're all on. Foveated rendering saves on GPU work by rendering sharply only whatever you're looking at, while leaving the periphery blurry. Basically, it imitates what the eye already does, doing less work where detail isn't needed.
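To make the idea concrete, here's a toy sketch of the concept (my own illustration, not any headset vendor's actual renderer): keep full detail inside a circle around the gaze point, and swap in a cheap low-resolution version of the frame everywhere else. The function name and parameters are hypothetical.

```python
import numpy as np

def foveated_blend(frame, gaze_xy, fovea_radius):
    """Toy foveated-rendering sketch: full detail near the gaze
    point, a cheap 4x-downsampled copy in the periphery."""
    h, w = frame.shape[:2]
    # "Cheap" periphery: 4x downsample, then nearest-neighbour upsample.
    low = frame[::4, ::4]
    low = np.repeat(np.repeat(low, 4, axis=0), 4, axis=1)[:h, :w]
    # Radial mask: 1.0 inside the fovea, 0.0 outside.
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])
    mask = (dist < fovea_radius).astype(float)
    if frame.ndim == 3:          # broadcast mask over color channels
        mask = mask[..., None]
    return mask * frame + (1 - mask) * low
```

In a real engine the saving comes from never shading the peripheral pixels at full rate in the first place, rather than blending two finished images like this toy does.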

Of course, this only works if you know where someone is looking, and none of the major VR headsets currently available do this. However, there are eye-tracking solutions on the market. The high-end StarVR headset from Starbreeze, for instance, will include a Tobii sensor for eyeball tracking.

Well, there's another player in the game: SMI. SMI has integrated eye tracking into the Oculus Rift DK2 and the Gear VR, and at Siggraph this year it's showing off an eye-tracking developer's kit for the HTC Vive.

Once you know where someone is looking, how do you foveate? (Lol, "foveate," such a great word). That's where Nvidia comes in. At Siggraph it's using SMI's sensors to show off a new and improved foveation technique that can blur the periphery of an image while still maintaining the things humans do perceive at the edges of their vision — color, contrast, edges and motion. Nvidia claims this allows it to substantially reduce rendering "effort" compared with other foveated rendering techniques, with no discernible drop in quality.
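Nvidia hasn't published its method in this article, but one way to picture the idea of "blur without losing perceived contrast" is to blur the periphery and then re-amplify whatever local contrast survived, so edges stay visible even though fine detail is gone. This is a rough stand-in for the concept, not Nvidia's algorithm; the functions and parameters here are my own assumptions.

```python
import numpy as np

def blur(img, k=5):
    # Naive k x k box blur: average of all k*k shifted copies.
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def contrast_preserving_periphery(img, k=5, gain=1.0):
    """Blur the image, then push local contrast back up, so edges
    remain perceptible in peripheral vision. Illustration only."""
    blurred = blur(img, k)
    local_mean = blur(blurred, k)
    # Re-amplify the contrast that survived the blur.
    return local_mean + (1 + gain) * (blurred - local_mean)
```

Around a sharp edge this deliberately over- and undershoots a plain blur, which is roughly the effect you want: the periphery loses detail but not its sense of edges and contrast.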

Foveated rendering won't solve the biggest problem with current VR graphics: low-resolution screens. But it might give GPUs a better chance of powering higher-resolution screens when they do arrive.