You know how, when you first put someone in VR who doesn’t understand the technology or know much about it, one of the first things they always do is reach out with their hands and try to touch the digital world? VR headsets like the Oculus Rift, HTC Vive, and Samsung Gear VR don’t track your hand movement at all, relying on handheld peripheral controllers instead, but that doesn’t stop our brains from reacting that way. The simulation is so believable that, as impressionable creatures, we want to reach out and touch things.

The ZED Mini, a recently released attachable AR camera (it’s still basically a dev kit, not a consumer product just yet), essentially lets you do just that. Attach the camera to the front of an Oculus Rift or HTC Vive and it transforms your otherwise insulated VR headset, which can only transport you to other worlds, into an AR headset that augments the world around you.

Here’s a video showing a few of the things that are possible right now:

The technology behind the device is impressive. The team at StereoLabs built the device so that it mimics how our own two eyes perceive the world around us. Using both lenses on the front of the camera, it records and displays 3D video to the headset’s display in real time, so it feels more like wearing a visor than an enclosed helmet. The 110-degree field of view is also dramatically bigger than that of other AR devices we’ve tried, such as the Microsoft HoloLens’ paltry 35 degrees, and it’s basically on par with the FOV found in the Rift and Vive already.
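The article doesn’t get into the math, but dual-lens depth cameras like this one typically recover distance by triangulation: the same point lands at slightly different spots in the left and right images (the disparity), and depth falls out of the focal length and the spacing between the lenses. Here’s a minimal sketch of that idea, using illustrative numbers rather than actual ZED Mini specs:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulate depth from stereo disparity: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # no parallax: the point is effectively at infinity
    return focal_px * baseline_m / disparity_px

# Illustrative values (hypothetical, not official ZED Mini specs):
# a 700 px focal length and a 63 mm lens baseline, roughly human eye spacing.
z = depth_from_disparity(focal_px=700.0, baseline_m=0.063, disparity_px=22.05)
print(round(z, 2))  # 2.0 metres; a larger disparity would mean a closer point
```

The intuition is the same one the hardware borrows from our own eyes: nearby objects shift more between the two views than distant ones do.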

And since it uses the external camera for inside-out tracking (as opposed to the outside-in tracking that the Rift and Vive usually rely on), you’re free to walk around, jump, crouch, and do pretty much whatever you want without worrying about whether or not your tracking sensors can see you.
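Loosely speaking, inside-out tracking means the headset estimates its own motion from what its camera sees, chaining each frame-to-frame movement onto the last, rather than having external base stations watch the headset. A toy 2D sketch of that pose-chaining idea (with hand-written motion steps standing in for the real visual estimates a headset would compute):

```python
import math

def make_pose(x, y, theta):
    """2D homogeneous transform: rotate by theta, translate by (x, y)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def compose(a, b):
    """Matrix product a @ b: apply motion b in the frame of pose a."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Start at the origin and accumulate per-frame motion estimates.
# Each step: move 0.1 m forward, then turn 90 degrees.
pose = make_pose(0.0, 0.0, 0.0)
for _ in range(4):
    pose = compose(pose, make_pose(0.1, 0.0, math.pi / 2))

# Four such steps trace a small square, ending back near the origin,
# with the heading back where it started.
print(round(pose[0][2], 6), round(pose[1][2], 6))
```

A real system derives each step from camera imagery (and fuses it with inertial data), but the bookkeeping is this same accumulation of relative motions.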

I got the chance to check out the ZED Mini in action in San Francisco a couple of weeks ago and came away very impressed. I put on the properly equipped Oculus Rift just as I usually would, but this time, instead of standing at the Oculus Home interface, it was like having a form of x-ray vision that let me see through the headset via the ZED Mini camera.

There was a very minor, almost unnoticeable delay between how I moved and what I saw, a matter of milliseconds, but I got used to it after a few minutes.

I tried out three different demos. The first one, shown in the GIF above and featured at the start of the video at the top of the article, was a solar system simulation. They gave me a PlayStation Navigation controller (it wasn’t being tracked or anything; it was just for button inputs) and mapped a few commands to the buttons. Using it, I could resize the planets, move the system around, and, of course, physically walk around the room to experience it.

What’s most remarkable about this particular demo is the dynamic lighting and occlusion. If my hand got close to the sun, for example, I’d see the light shining on my hand, and for a split second my brain told my palm it should feel warmer even though it didn’t. And the 3D depth sensing was so accurate that I could move my hands all around the planets, just as you see in the GIF above. If I went too fast or got too close to an object there was some clipping, but it certainly worked far better than I expected.
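Occlusion like this generally comes down to a per-pixel depth test: the camera’s depth map tells the renderer how far away the real world is at each pixel, and a virtual object is drawn only where it’s closer than the real surface. A minimal, illustrative sketch of that compositing rule (not StereoLabs’ actual pipeline):

```python
def composite(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel occlusion: keep the virtual pixel only where the
    virtual surface is closer to the camera than the real one."""
    return [v if vd < rd else r
            for r, rd, v, vd in zip(real_rgb, real_depth, virt_rgb, virt_depth)]

# Toy 4-pixel row: 'R' = camera passthrough, 'V' = virtual planet.
real = ["R"] * 4
rdep = [1.0, 1.0, 0.5, 0.5]   # metres; a hand enters the frame at 0.5 m
virt = ["V"] * 4
vdep = [0.8, 0.8, 0.8, 0.8]   # the planet is rendered at 0.8 m
print(composite(real, rdep, virt, vdep))  # ['V', 'V', 'R', 'R']
```

The hand pixels win the depth test, so the planet is hidden behind them, which is exactly why waving your hands in front of the virtual planets looks convincing, and why moving too fast (stale depth data) causes the clipping described above.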

The next two demos featured Face Raiders-style flying droids. They’d spawn into the game world and start shooting at me, so I could either block the lasers with my hands (a callback to my opening comments about people trying to reach out and touch digital worlds while wearing a headset) or dodge them. Hiding behind a wall or a couch, for example, provided cover because the headset identified objects in my environment. Or, if I got bored, I could just shoot them with my own laser.

My final demo was similar to the previous one, but this time the flying droids would dash at my head instead. And instead of shooting lasers, StereoLabs was showing off the hardware’s ability to recognize and augment objects in real time. I was holding a physical lightsaber prop in my hands, basically a toy that looked like a lightsaber, and inside the headset it started to glow and light up with Force-like energy. When the robots dashed at me, I could swing the lightsaber to send them flying against the wall before they exploded into a dozen pieces. You can watch some footage of this in action in the second video near the top of this article.

Throughout all of the demos, the ZED Mini never got confused, lost tracking, or needed any sort of recalibration. It even recognized the other people in the room, each of whom I could use for additional cover if I wanted. That isn’t to say it’s perfect, though; the visual clarity is obviously much lower than real life, and the field of view, while better than that of other AR devices we’ve seen, was still narrower than the native view through the VR headset’s lenses. I could see some black trim along the edges, for example.

But all things considered, the ZED Mini is an impressive little kit that should help democratize AR adoption and development, particularly among existing VR users.

The ZED Mini isn’t the only device doing this sort of thing, but it is one of the only ones currently on the market that can leverage existing hardware. The Intel Alloy, for example, has similar goals, but its hand-recognition feature produces an odd ghosting effect, and no one knows exactly what its status is. The HoloLens is far too expensive and still a dev kit as well, so the ZED Mini may well find a market for itself among existing Rift and Vive users.

You can purchase a ZED Mini right now from StereoLabs for $449. As explained, it’s still very much in the dev kit phase and not intended for a consumer launch just yet. Let us know what you think down in the comments below!