Project Alloy from Intel is a prototype VR headset with important new features from one of the world’s most influential tech companies, and we’ve just tried the first hands-on demo of the hardware at CES.

Alloy sits in the same standalone category as the Santa Cruz prototype from Facebook’s Oculus, meaning the hardware you wear on your head includes not just the display, but the rendering and positional tracking technologies that are fundamental parts of making VR work. Unlike the Oculus Rift and HTC Vive, no outside hardware, sensors, or cameras are needed.

Developer kits for Alloy are already in the hands of Intel’s partners and the company expects it to be turned into a consumer product by the end of the year. It is a bit of a heated race for Intel, because Microsoft already announced partners working with the tracking technology it developed for HoloLens. This critical technology is a prize for Microsoft, developed over a number of years, and both Facebook and Google (along with many others) are racing to match it. Heading into an era of mixed reality, if Intel is to retain its position as a supplier of fundamental technology for a wide range of manufacturers, it needs Project Alloy and its tracking technology to be a solid platform upon which partners can build.

So how did it work? I am one of the few people in the world to have tried both Facebook's prototype and Intel's, so I have some perspective others don't. That said, my time in each headset was extremely limited, the prototype is in ongoing development, and my impressions are totally subjective. So keep that in mind as you read on.

Intel Merged Reality

I can’t make too many conclusive statements about Intel’s technology, except to say that it won’t deliver an experience that feels anything like the one depicted in the “merged reality” video below anytime soon.

Instead, the Project Alloy demo I experienced “drifted” considerably. If it had been me wearing the Project Alloy prototype in the video above, I would’ve walked into a door.

Intel said it has the technology to scan a room while the headset is on, but in my demo the room had been scanned beforehand. This scanning process should allow software to dynamically mold itself to the physical geometry of the room.

[gfycat data_id="GiantMarvelousGalapagosdove"]

For my demo, a physical table in the center of the room became some kind of a glowing energy portal in the world I saw inside the headset, while the furniture around the perimeter became crates. The walls were gone and the environment turned into a platform in the desert, similar to the VR game Hover Junkers, and I was free to walk around the platform.

I don’t recall a physical object in the middle of the room in the Santa Cruz demo, so there is no direct point of comparison here, but in the Project Alloy demo this object’s location drifted considerably as I moved around the room. In other words, the physical table and its virtual counterpart gradually slid apart, and as a result I bumped into the table. I could reach down and touch it with my hands, but the spot where my sense of touch told me the object was turned out to be about a foot off from where my eyes placed it. Intel suggested the drift might have been caused by the number of people in the room.

The controller I held in my hand featured limited tracking, on par with the Daydream controller’s three degrees of freedom: rotation is tracked, but position is not. I could point it left, right, up and down, but if I moved it back and forth through space that movement wasn’t reflected in the virtual world. Unlike Daydream, however, my head could move back and forth. So in the game I tried, airborne attackers approached and I had to point the controller at the invaders and pull the trigger to shoot them.

Additional ammo drops arrived when I ran out, and I should have been able to reach out and grab them, but the controller’s lack of positional tracking ruled that out. Instead, I had to walk closer to the ammo to pick it up, which wasn’t very intuitive. Intel said it is offering a separate demo at CES with fully tracked controllers, but we haven’t had the chance to try it yet.

The weight of the headset was well balanced, with the battery in back and the processing up front, and I felt no pressure on the front of my face. I could feel the heat coming off the headset by hovering my hand about half an inch above its front surface. After a very short session in VR, my face was sweating even though I wasn’t very active. Intel attributed this to the placeholder facial interface, which didn’t let air circulate around my face.

Facebook’s Santa Cruz versus Intel’s Alloy

Santa Cruz had only one small noticeable tracking hiccup during my demo, though spotting drift was difficult without the clear point of reference the Intel demo provided. Overall, I found myself being pretty timid in Alloy for fear of bumping into something, while in Santa Cruz I walked confidently from one side of the room to the other because the room was empty.

Intel says the technology will get lighter and improve in pretty much every aspect on its path to becoming a consumer product, with “hours” of battery life being the target. We’ll track down more demos to see the other pieces of Intel’s VR-related technology, and we will follow up with a longer video as soon as possible, but I will frankly be (pleasantly) surprised if it all comes together in a consumer product this year.