
Over the last 12 months or so we’ve been keeping an eye on Google’s aggressive hiring for its AR/VR team which, among other things, suggested that the company was working on new “mass production” hardware that would go beyond the simple smartphone snap-in Daydream headsets. This week we learned that there was indeed something to all that hiring, as Google has announced new ‘standalone’ headsets coming to the Daydream platform, fully self-contained VR devices, the first of which will come from HTC and Lenovo. Until those headsets launch later this year, Road to VR was among a select group that got to see one of Google’s early Daydream standalone prototypes.

At Google’s Mountain View campus, a stone’s throw away from the hustle and bustle of I/O 2017, I sat at a literal round-table with core members of Google’s AR/VR team. No pictures were allowed. There, I learned about the latest Daydream and Tango developments that have happened since last year’s I/O. Core to the discussion was the new ‘standalone’ VR headsets coming to Daydream.

And while Google has announced that HTC and Lenovo are working on consumer versions of the device, and that the company has built a reference version with Qualcomm (for other companies to use as a foundation for their own designs), the device I got my head into was even earlier than that: an internal prototype roughly a year old. Lots of caveats were given: ‘the displays are more than two years old, the latency still needs to be optimized, you might see some dropped frames due to the old hardware and software’, I was told. And yet I was still very impressed with what I saw.

In the same conference room were two large circular carpets, maybe 10 feet in diameter. At the center I was handed an entirely black headset with a ‘halo’-style head strap (like PSVR’s) which tightened with a knob in the back and rested comfortably on the head. I could see that the lenses were shaped similarly to PSVR’s, except they used a Fresnel design like the Vive’s. I could also see a clear indication that these headsets had eye-tracking hardware on board, but Google’s AR/VR team didn’t talk about that feature and it isn’t clear yet if we can expect eye-tracking in the final products.

As I put the headset on and tightened the strap for comfort I found myself immersed in an underwater scene with translucent jellyfish floating about and a curious sea turtle who would swim by from time to time. Though Google hasn’t said much about the headset’s specs (and they are likely to differ for the consumer versions), the field of view looked close to what I’d expect from PSVR (~100 degrees diagonally), which is a huge improvement over the Daydream View’s more limited field of view.

Unlike prior Daydream headsets, which can only track rotation, the standalone Daydream headsets will use Google’s newly announced ‘WorldSense’ inside-out tracking to achieve positional tracking too, allowing you to walk around your environment just like you’d expect with the Vive, except with no external beacons for tracking.

I was able to comfortably roam the entire area of the circular carpet, getting up close to inspect the jellyfish floating around me. If I stepped toward the edge of the carpet, the virtual world would fade to black to let me know I was leaving the designated playspace. Theoretically WorldSense tracking could go much further than the carpet, but Google seems to be positioning it for now as a room-scale-capable system that you can use anywhere, rather than something you might roam your entire house with in one session. It isn’t clear yet if this is a design or technical limitation.
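The fade-to-black boundary behavior is easy to picture in code. Here’s a minimal sketch, assuming a circular playspace like the ~10-foot carpets in the demo; the function name, radius, and fade band are all illustrative assumptions on my part, not anything Google disclosed:

```python
import math

# Assumed values: a ~10 ft (~3 m) diameter carpet and a 0.3 m fade band.
PLAYSPACE_RADIUS_M = 1.5
FADE_BAND_M = 0.3

def boundary_fade_alpha(x: float, z: float) -> float:
    """Return 0.0 (scene fully visible) to 1.0 (fully black) based on the
    headset's horizontal distance from the playspace center."""
    dist = math.hypot(x, z)
    fade_start = PLAYSPACE_RADIUS_M - FADE_BAND_M
    if dist <= fade_start:
        return 0.0
    # Linear ramp from the start of the fade band out to the edge, clamped.
    return min(1.0, (dist - fade_start) / FADE_BAND_M)
```

A compositor would multiply the rendered frame toward black by this alpha each frame, which matches the gradual fade I experienced as I neared the carpet’s edge.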

Though I only had my head inside this particular demo for five minutes or so, I was very impressed with the accuracy and latency of the tracking. At least for the relatively slow-moving case of walking around and looking at things, it worked very well and felt much like I’d expect from a Vive. However, the demo had few static near-field objects to get up close to for a good assessment of jitter, and it’ll take more time with the headset to see how it handles faster motions like ducking and dodging.

To achieve the inside-out tracking, Google tapped their Tango team to create a version of their optical inside-out tracking that’s optimized for VR, the result of which is WorldSense. On this prototype standalone Daydream headset, the tracking was achieved with two forward-facing cameras which are the only sensors—save for the usual on-board IMUs—that derive the positional information; Google confirmed the system doesn’t rely on a depth sensor.
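Systems built from cameras plus IMUs typically work by fusing the two: the IMU dead-reckons position at high rate (but drifts), while camera-derived poses periodically correct that drift. Google hasn’t published how WorldSense does this, so the sketch below is a generic visual-inertial fusion toy (a simple complementary filter), with every function name and gain value being my own assumption:

```python
def imu_predict(pos, vel, accel, dt):
    """Dead-reckon position and velocity from accelerometer readings.
    Fast (hundreds of Hz) but accumulates drift over time."""
    new_vel = [v + a * dt for v, a in zip(vel, accel)]
    new_pos = [p + v * dt for p, v in zip(pos, new_vel)]
    return new_pos, new_vel

def camera_correct(pos, camera_pos, gain=0.2):
    """Pull the IMU estimate toward the camera-derived pose to cancel drift.
    Runs at the (slower) camera frame rate."""
    return [p + gain * (c - p) for p, c in zip(pos, camera_pos)]
```

Real systems use far more sophisticated filters over tracked image features, but this captures why no depth sensor is strictly required: two ordinary cameras and an IMU provide enough constraints to recover position.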

To see how well the tracking could keep up, I tried covering one of the two cameras on the front, and much to my surprise it still worked relatively well. Google said there’s a monoscopic mode which can kick in if occlusions like that were to happen, but generally the two cameras are working in tandem to do the tracking. I also tried looking straight down at the ground with my head just a foot or two above it (trying to give the cameras less distinct visuals to work with), and found that the tracking held up perfectly.
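That monoscopic fallback presumably amounts to a mode switch based on how much usable imagery each camera is getting. Here’s a hypothetical sketch of such a decision, where the feature-count threshold and mode names are my assumptions, not anything Google described:

```python
# Assumed minimum number of trackable image features for a usable camera feed.
MIN_FEATURES = 30

def select_tracking_mode(left_features: int, right_features: int) -> str:
    """Pick a tracking mode from per-camera feature counts."""
    left_ok = left_features >= MIN_FEATURES
    right_ok = right_features >= MIN_FEATURES
    if left_ok and right_ok:
        return "stereo"       # normal case: both cameras working in tandem
    if left_ok or right_ok:
        return "monoscopic"   # one camera occluded, as when I covered one
    return "imu_only"         # brief total dropout: coast on the IMU alone
```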

Google said the room was not pre-mapped and that this demo was specifically designed to wipe any WorldSense data so that I was experiencing the headset as if it was the first time it saw the room. That said, the company did confirm that WorldSense will learn over time and get even better if you use it in the same place repeatedly.

For a year-old device, the Daydream standalone VR headset prototype is very promising. Newer iterations in the works from Google’s partners will run Android O (which is further optimized for VR) on newer, more powerful hardware (based on Qualcomm’s Snapdragon 835). They’re likely to deliver a significantly more ‘premium’ experience than the snap-in smartphone Daydream headsets, thanks to their made-for-VR optimization and positional tracking, which adds an entirely new dimension to the mobile VR experience.