TimeWarp, Spacewarp, Reprojection, Motion Smoothing. Asynchronous, Interleaved.

You may have heard these terms or seen them in the settings of your VR headset, but what do they do, and what’s the difference?

Timewarp

The idea of Timewarp has been around in VR research for decades, but the specific feature was added to the Oculus software in April 2014 by John Carmack. Carmack first wrote about the idea in early 2013, before even the Oculus DK1 had shipped.

Standard Timewarp in itself did not actually help with framerate, nor was it intended to. It was made to lower the perceived latency of VR. Before the Oculus DK1, VR had much higher latency than today, mostly due to software rather than hardware. Timewarp is one of multiple software techniques Oculus used to get latency low enough to go unnoticed.

Timewarp reprojects an already rendered frame just before sending it to the headset to account for the change in head rotation.

That is, it warps the image geometrically in the direction you rotated your head between the time the frame started and finished rendering. Since this takes a fraction of the time that re-rendering would and the frame is sent to the headset immediately after, the perceived latency is lower since the result is closer to what you should be seeing.
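To make the idea concrete, here is a deliberately simplified sketch. Real timewarp re-samples every pixel using the full head rotation and the lens distortion mesh; this toy version only handles yaw and assumes a linear projection, and the function name and parameters are illustrative, not from any real SDK:

```python
def timewarp_shift(render_yaw_deg, latest_yaw_deg, hfov_deg=90.0, width_px=1080):
    """Toy yaw-only Timewarp: how many pixels to shift a finished frame
    so it lines up with the head's latest yaw (small-angle, linear lens)."""
    delta_deg = latest_yaw_deg - render_yaw_deg   # rotation while the frame rendered
    px_per_deg = width_px / hfov_deg              # crude linear-projection assumption
    return delta_deg * px_per_deg                 # shift to apply just before scan-out
```

Because this shift is a cheap post-process rather than a full re-render, it can run in the last moment before the frame is sent to the display.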

The concept of Timewarp is used today by all major VR platforms. So contrary to common belief, even when you’re hitting full framerate you’re still seeing reprojected frames.

Asynchronous Timewarp (ATW)

Asynchronous Timewarp takes the same concept of geometric warping and uses it to compensate for dropped frames. If the current frame doesn’t finish rendering in time, ATW reprojects the previous frame with the latest tracking data instead.

It is called “asynchronous” because it occurs in parallel to rendering rather than after it. The synthetic frame is ready before it’s known whether or not the real frame will finish rendering on time.
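The compositor's per-vsync decision can be sketched like this. This is a toy model, not real compositor code; the function name and the `warp` callback are hypothetical stand-ins:

```python
def compose_frame(app_frame_ready, app_frame, last_frame, latest_pose, warp):
    """Toy compositor step, run at every vsync in parallel with app rendering:
    if the app's new frame made the deadline, warp and show it; otherwise
    re-warp the previous frame with the newest tracking data (ATW)."""
    source = app_frame if app_frame_ready else last_frame
    return warp(source, latest_pose)
```

Note that the `warp` step is applied either way: as described above, even on-time frames get reprojected to cut latency.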



Diagram from Oculus.com

ATW first shipped on the Gear VR Innovator Edition in late 2014. It was not available on PC, however, until the Rift consumer launch in March 2016. The feature's reliance on hardware capabilities added in recent GPUs was one of the reasons the Rift doesn't support GeForce 7-series cards or AMD cards predating the R9 series.

In October 2016, Valve added a similar feature to SteamVR, which they call Asynchronous Reprojection. The feature originally only supported NVIDIA GPUs, but in April 2017 support for AMD GPUs was added.

Interleaved Reprojection

Before the addition of Asynchronous Reprojection to SteamVR, Valve’s platform had Interleaved Reprojection (IR). Rather than being an always-on system like ATW, IR was automatically toggled on and off by the compositor.

When an app was consistently dropping multiple frames over a few seconds, IR forced the application to run at half framerate (45 FPS) and then synthetically generated every second frame, hence "interleaved". Interleaved Reprojection actually had a perceptual advantage over asynchronous reprojection: it made any double-image artifacts appear spatially consistent.
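The half-rate interleaving pattern itself is simple enough to sketch. This toy function (names are illustrative) shows how 45 real frames per second become 90 displayed frames, with every second one synthesized:

```python
def interleave(real_frames, synthesize):
    """Toy half-rate interleaving: after each real frame (rendered at 45 FPS),
    emit one synthetic frame, producing a 90 FPS output stream."""
    output = []
    for frame in real_frames:
        output.append(frame)             # real frame from the app
        output.append(synthesize(frame)) # reprojected/synthetic in-between frame
    return output
```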

With the release of SteamVR Motion Smoothing in 2018, Interleaved Reprojection became obsolete.

ASW / Motion Smoothing

Timewarp (at current) and Reprojection only account for rotational tracking. They do not account for positional head movement, or for the movement of other objects in the scene.

In December 2016 Oculus released Asynchronous Spacewarp (ASW) to tackle this problem. ASW is essentially a fast extrapolation algorithm which uses the differences (i.e. the motion) between the previous frames to estimate what the next frame should look like.
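At its core, the extrapolation idea is: whatever moved between the last two frames is assumed to keep moving the same way. The real algorithm estimates per-pixel motion vectors from the color images; this toy version works on bare lists of values and is only meant to show the principle:

```python
def extrapolate_frame(frame_a, frame_b):
    """Toy ASW-style extrapolation: each value is assumed to keep changing
    by the same delta it changed between the previous two frames."""
    return [b + (b - a) for a, b in zip(frame_a, frame_b)]
```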

Despite the name, ASW is not always enabled. Like SteamVR's former Interleaved Reprojection, ASW is automatically enabled when an app is consistently dropping multiple frames over a few seconds. It then forces the application to run at half framerate (45 FPS) and synthetically generates every second frame.

Because of this, ASW doesn't replace ATW: ATW is always active, while ASW kicks in when needed.



Diagram from Oculus Connect 3

Because ASW only has the color information of the frame with no understanding of the depth of objects, there are often noticeable artifacts (imperfections) in the image.

In November 2018, Valve added a similar feature to SteamVR, which they call Motion Smoothing. The feature currently only works on NVIDIA GPUs, but Valve says AMD support “is coming”.

ASW 2.0

Asynchronous Spacewarp 2.0 is an upcoming update to ASW which greatly enhances the quality of the technique by incorporating an understanding of depth. When announcing the technique, Oculus showed examples of the visual artifacts the 2.0 update will eliminate.

Unlike all the other techniques so far, however, ASW 2.0 won't work on just any app. The developer has to submit their depth buffer each frame; otherwise it falls back to ASW 1.0.

Thankfully, both Unity and Unreal Engine, which together power the vast majority of VR apps, now submit depth by default when using their Oculus integrations.

Positional Timewarp (PTW)

PTW is an upcoming update to Asynchronous Timewarp (ATW) which will use the same depth buffer that ASW 2.0 uses to add high quality positional correction. Like ATW today, the PTW update will still be always enabled, so as soon as a frame drops a synthetic one is ready in time.

Facebook claims that PTW makes the transition of ASW enabling or disabling much more seamless, since there’s no longer positional judder beforehand. But just like ASW 2.0, PTW will only work for apps which submit their depth buffer.

It is believed that PTW will arrive in the same update as ASW 2.0, since ASW 2.0 will no longer account for headset movement, leaving this entirely to PTW.

Summary & Comparison

So put simply, here’s what each technique does for you:

Timewarp: lowers perceived latency

Asynchronous Timewarp/Reprojection (ATW): rotationally compensates for dropped frames

Asynchronous Spacewarp (ASW) / Motion Smoothing: when framerate is low, drops the app to 45 FPS and synthesizes every 2nd frame by extrapolating the motion of past frames

And here’s how each technique compares:

‘Auto toggled’ techniques are not always on. Instead, when the compositor notices that the framerate has been low for more than a few seconds, one of these modes is enabled. When enabled, the compositor forces the running application to render at half framerate (45 FPS for current headsets). The compositor generates every other frame synthetically, based on analysing the previous frames and incorporating the headset tracking data. When the GPU utilization is low again, the compositor disables the mode and returns the app to 90 FPS.
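That auto-toggle behavior can be sketched as a small state machine. The window size and thresholds below are illustrative guesses, not values from any real compositor, and the class name is hypothetical:

```python
from collections import deque

class HalfRateToggle:
    """Toy auto-toggle: enable half-rate frame synthesis when the app has
    been missing frame deadlines for a while, disable it once it recovers."""
    def __init__(self, window=270, enable_ratio=0.3, disable_ratio=0.05):
        self.history = deque(maxlen=window)   # recent hit/miss record
        self.enable_ratio = enable_ratio      # miss rate that trips half-rate mode
        self.disable_ratio = disable_ratio    # miss rate at which 90 FPS resumes
        self.enabled = False

    def record(self, missed_deadline):
        self.history.append(missed_deadline)
        miss_rate = sum(self.history) / len(self.history)
        if not self.enabled and miss_rate >= self.enable_ratio:
            self.enabled = True               # force app to 45 FPS, synthesize frames
        elif self.enabled and miss_rate <= self.disable_ratio:
            self.enabled = False              # app recovered: back to full framerate
        return self.enabled
```

The hysteresis (separate enable and disable thresholds) keeps the mode from flickering on and off when the app hovers right at the framerate boundary.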