No matter how fast your Internet connection is, streaming game services like OnLive and PlayStation Now always bump up against a hard latency limit based on the total round-trip time (RTT) it takes to send user input to a remote server and receive a frame of game data from that server. The hope for these systems is that broadband speeds and server connections will eventually improve enough to make that trip nearly unnoticeable for end users. Until then, a team at Microsoft Research seems to have done an end run around the RTT latency limit, using predictive modeling to improve apparent performance even when the server trip takes a full quarter of a second.

Late last week, Microsoft released a paper detailing the development and testing of DeLorean, a system that uses a number of techniques to mask the inherent latency between the server running a streaming game and the user giving inputs at home. The main technique involves future input prediction: by analyzing previous inputs in a Markov chain, DeLorean tries to predict the most likely choices for the user's next input (or series of inputs) and then generates speculative frames that fit those inputs and sends them back to the user.
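To make that concrete, here is a toy sketch of Markov-chain input prediction. The class names and structure are my own illustration, not the paper's actual model (which is richer): it simply counts how often each input follows the previous one and speculates on the most likely next inputs.

```python
from collections import defaultdict, Counter

class InputPredictor:
    """Toy Markov-chain input predictor (illustrative only).
    Tracks how often each input follows the previous one and
    guesses the most likely next inputs to speculate frames for."""

    def __init__(self):
        self.transitions = defaultdict(Counter)
        self.prev = None

    def observe(self, inp):
        # Record the transition from the previous input to this one.
        if self.prev is not None:
            self.transitions[self.prev][inp] += 1
        self.prev = inp

    def predict(self, k=2):
        # Return up to k most likely next inputs given the last one seen;
        # the server would render a speculative frame for each.
        return [inp for inp, _ in self.transitions[self.prev].most_common(k)]

# A player who tends to move forward after turning left:
p = InputPredictor()
for inp in ["left", "forward", "left", "forward", "left", "jump"]:
    p.observe(inp)
print(p.predict(k=2))  # → ['forward', 'jump']
```

A real system would also condition on longer input histories and continuous controller state, but the core idea is the same: rank likely futures and render ahead for the top few.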

By the time those predicted frames get back to the user, the system can see which input was actually entered, then immediately show the appropriate predicted frame for that situation rather than waiting for another round-trip to the server. The DeLorean system also improves performance by "supersampling" inputs at a faster rate than the game normally does, and it applies a Kalman filter to reduce the shakiness of the predicted frames.
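The client-side selection step is simple in principle. This sketch (not the paper's API; the function and frame names are invented for illustration) shows the lookup: the server streams a small set of frames keyed by the inputs it speculated on, and the client displays whichever one matches the input the player actually entered, falling back to the previous frame on a miss.

```python
def select_frame(speculative_frames, actual_input, fallback_frame):
    """Pick the pre-rendered frame matching the player's real input,
    avoiding a fresh round-trip; fall back if speculation missed."""
    return speculative_frames.get(actual_input, fallback_frame)

# The server predicted "forward" and "left" and rendered both ahead of time:
frames = {"forward": "frame_fwd.png", "left": "frame_left.png"}
print(select_frame(frames, "left", "frame_prev.png"))  # speculation hit
print(select_frame(frames, "jump", "frame_prev.png"))  # miss → fallback
```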

Even if the input doesn't match the prediction precisely, the DeLorean system uses a misprediction compensation technique to alter a predicted frame so that it matches what should actually be shown. By sending extra depth and rotational environment information with each frame, the local machine can immediately tweak the predicted image to match the actual inputs: showing the scene a bit farther to the left if the user turned farther left than expected, for instance.
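A heavily simplified sketch of the rotational part of that compensation is below. The paper pairs this with per-pixel depth for a proper reprojection; here I only approximate a pure camera rotation by sliding a wider-than-viewport predicted frame sideways in proportion to the angular error, with rows represented as strings purely for illustration.

```python
def compensate(frame, predicted_yaw, actual_yaw, fov_deg=90.0):
    """Toy rotational misprediction compensation: slide the predicted
    frame sideways by the fraction of the field of view the prediction
    missed by. `frame` is a list of pixel rows (strings here)."""
    width = len(frame[0])
    error_deg = actual_yaw - predicted_yaw
    shift = int(round(width * error_deg / fov_deg)) % width
    # Slide every row so the view lines up with where the player
    # actually turned; a real client would warp using depth, too.
    return [row[shift:] + row[:shift] for row in frame]

# Server predicted the player facing 0 degrees; they actually turned 22.5:
print(compensate(["abcdefgh"], predicted_yaw=0.0, actual_yaw=22.5))
# → ['cdefghab'] (the view slides a quarter of the 90-degree field)
```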

The result is a streaming experience that's much closer to that of a game running on a local machine, according to testers. Even when the actual round-trip time between input and server response was 256 ms, double-blind testers reported that both the gameplay responsiveness and the graphical quality of the DeLorean system were comparable to a locally played version of the game. Conversely, tester opinions of a standard streaming client degraded very quickly as round-trip server travel time went up.

Testers also performed measurably better when playing Doom 3 through DeLorean rather than through normal streaming, finishing the test level with more health and completing in-game tasks more quickly thanks to the predictive modeling.

The improvement in performance doesn't come for free; sending those extra predictive frames and information adds a bandwidth overhead of anywhere from 1.5 to 4 times that of a normal streaming game client, according to Microsoft Research (those numbers would be worse if not for compression, owing to the similarity of most predicted frames). Getting the system to work also required special coding on top of the tested versions of Doom 3 and Fable 3, which were modified to support the new predictive streaming system.
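That compression point is easy to demonstrate. The snippet below is my own illustration (not the paper's codec): because a speculative frame is nearly identical to its sibling, compressing the two together costs far less than compressing them independently, which is why the overhead stays well under a full N-times multiplier for N speculative frames.

```python
import os
import zlib

frame = os.urandom(16 * 1024)               # stand-in for one encoded frame
speculative = frame[:-64] + os.urandom(64)  # near-duplicate speculative frame

# Cost of sending each frame compressed on its own:
independent = len(zlib.compress(frame)) + len(zlib.compress(speculative))
# Cost when the compressor can exploit the redundancy between them:
joint = len(zlib.compress(frame + speculative))

print(joint < independent)  # True: shared content compresses away
```

Real video codecs exploit the same redundancy far more aggressively via inter-frame prediction, but the principle carries over.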

Still, these are the kinds of outside-the-box improvements that streaming gaming services are going to need if we want them to be viable before we all have symmetrical gigabit connections running directly to gaming server farms. Microsoft's research into these solutions is especially interesting, considering that the company said last November that streaming gaming was "really cool and really problematic, all at the same time, insofar as it’s really super cool if you happen to have the world’s most awesome Internet connection." This doesn't necessarily mean we'll see game streaming on the Xbox One any time soon or anything, but it's still interesting that one arm of Microsoft is apparently thinking about how best to solve the latency problem.