
AT&T, one of the world’s largest telecom companies, this week announced plans to demonstrate cloud-rendered VR content streaming over a 5G network. The company says its 5G Lab, which is exploring use cases for 5G and edge networking, will host the demonstration next month, showing SteamVR content rendered in the cloud and delivered over a 5G network with low enough latency for a full 6DOF VR experience.

Next-gen ‘5G’ connectivity technology aims to deliver a leap in bandwidth and latency compared to existing mobile connections and a majority of in-home internet connections. While the increased bandwidth stands to enhance static content like 360 video streaming to VR headsets, the addition of ultra-low latency could potentially open the door to fully interactive VR content that’s streamed to a headset from the cloud instead of being locally rendered on a powerful PC.

To that end, AT&T says it has developed a proof-of-concept demonstration that renders SteamVR content in the cloud and delivers it to a VR headset quickly enough to stay within the critical latency thresholds necessary for a visually comfortable VR experience.

The company says that the demonstration will use a “5G 39GHz mmwave radio connected to a GPU-accelerated gaming server,” and will also rely on the use of ‘edge computing’ (ensuring that the VR content is rendered and served from a datacenter as physically near to the user as possible to minimize latency).
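To see why physical proximity matters, consider just the speed-of-light propagation delay over fiber, before any rendering, encoding, or radio latency is added. The distances and constants below are illustrative assumptions, not AT&T figures:

```python
# Rough round-trip propagation delay over fiber, illustrating why
# edge computing matters for cloud-rendered VR.

SPEED_OF_LIGHT_KM_S = 300_000   # speed of light in vacuum, km/s
FIBER_FACTOR = 2 / 3            # light travels at roughly 2/3 c in optical fiber

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a one-way distance."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# A distant datacenter ~2,000 km away burns roughly 20 ms on propagation
# alone; an edge site ~50 km away costs well under 1 ms.
print(f"2,000 km datacenter: {round_trip_ms(2000):.1f} ms round trip")
print(f"50 km edge site:     {round_trip_ms(50):.2f} ms round trip")
```

With motion-to-photon budgets in the low tens of milliseconds, a far-away datacenter can consume most of the budget before a single frame is rendered, which is the case for edge computing in a nutshell.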

While cloud-rendered VR content has been discussed at length for its many potential upsides (mainly: lowering the barrier to entry for high-end VR content), there are still many pieces of the pipeline that need to come together to make it work well and reliably, as AT&T notes:

Separating the [render] server and [VR] display by a wireless network means introducing new confounding factors into the pipeline, such as encoding and decoding delays, transmission delay, packet loss and jitter. This is especially challenging because networks and media streaming protocols weren’t optimized for real-time, interactive content. These experiences have different requirements than static, file-based media, so we can’t treat them the same way. Plus, the content capturing, rendering and display processes in 3D gaming engines were not originally designed to be hosted in the cloud. In order to democratize access to 3D experiences, we need to merge elements of the two models and redesign the process from the ground-up. That means in addition to optimizing the performance of our network, we must work with our technology partners to reimagine and re-architect how these applications are designed and implemented.
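The pipeline AT&T describes can be thought of as a motion-to-photon budget: every stage between head movement and displayed pixels eats into a fixed comfort threshold. The stage names and millisecond figures below are illustrative assumptions for the sketch, not measured values:

```python
# Illustrative motion-to-photon latency budget for cloud-rendered VR.
# Stage timings are assumptions, not measured values from AT&T's demo.

BUDGET_MS = 25  # a commonly cited comfort threshold is roughly 20-25 ms

pipeline_ms = {
    "pose sample + uplink": 3,
    "cloud render": 7,
    "video encode": 4,
    "5G downlink": 4,
    "video decode": 3,
    "display scan-out": 3,
}

total = sum(pipeline_ms.values())
print(f"total: {total} ms (budget: {BUDGET_MS} ms)")
for stage, ms in pipeline_ms.items():
    print(f"  {stage}: {ms} ms")
```

The point of the exercise: encode, transmit, and decode stages that don't exist in a locally rendered pipeline can claim nearly half the budget, which is why AT&T argues the streaming protocols and the rendering architecture both need to be redesigned rather than treated like file-based media delivery.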

AT&T is setting the bar somewhat low for this initial proof of concept: it targets a “3K resolution” (which we expect means total rather than per-eye) at a 75Hz refresh rate, in line with today’s mobile VR headset specs. Questions remain, however, about how such technology will scale to the higher resolutions and frame rates of future headsets.
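To put the “3K at 75Hz” target in perspective, here is a back-of-envelope bandwidth estimate. The specific resolution interpretation (2880×1600 total) and the compression ratio are assumptions for the sketch, since AT&T hasn’t published these details:

```python
# Back-of-envelope bandwidth for an assumed "3K" (2880x1600 total) stream
# at 75 Hz. Resolution interpretation and compression ratio are assumptions.

width, height = 2880, 1600   # assumed total pixels across both eyes
fps = 75
bits_per_pixel = 24          # 8-bit RGB, uncompressed
compression_ratio = 100      # assumed for low-latency hardware video encoding

raw_gbps = width * height * bits_per_pixel * fps / 1e9
compressed_mbps = raw_gbps * 1000 / compression_ratio

print(f"uncompressed: {raw_gbps:.1f} Gbps")
print(f"compressed:   {compressed_mbps:.0f} Mbps")
```

Even with heavy compression, the stream lands in the tens of megabits per second; doubling resolution per axis and moving to 90Hz or 120Hz multiplies that figure several times over, which is where 5G’s bandwidth headroom becomes relevant to the scalability question.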

There’s been a lot of buzz about VR cloud rendering, but so far we haven’t seen a complete and compelling demonstration of the entire pipeline in action. AT&T’s upcoming proof of concept could show that VR cloud rendering over 5G is a viable pathway to high-end VR, or it could reinforce skeptics who argue it won’t happen for a variety of reasons.