Headset specs

Headset weight: 555 grams (~1.2 lbs) without cables
Display: 2160x1200 (1080x1200 per eye) AMOLED panels
Refresh rate: 90 Hz
Field of view: 110 degrees
Lens spacing: 60.2-74.5mm (adjustable)
Controllers: Two wireless motion-tracked controllers with rechargeable 960mAh batteries
Tracking: SteamVR 1.0 tracking system with two "Lighthouse" IR laser tracking boxes (up to 5m diagonal tracking volume)
Audio: Audio extension dongle for plugging generic headphones into the headset; built-in microphone
PC connection: Three-part multi-cable (HDMI, USB, and power) with a junction box for PC connection
Included games: Job Simulator, Fantastic Contraption, and Tilt Brush
Price: $800

Recommended PC specs

GPU: NVIDIA GTX 970 / AMD R9 290 equivalent or greater
CPU: Intel i5-4590 / AMD FX 8350 equivalent or greater
RAM: 4GB
OS: Windows 7 SP1 or newer
Outputs: 1x HDMI 1.4 or DisplayPort 1.2; 1x USB 2.0
Other: At least 1.5m x 2m of open space for "room-scale" experiences

An entire generation of nerds has now grown up with the sci-fi ideal of the holodeck as the ultimate future of interactive entertainment. The Star Trek universe’s 24th century gave us a view of rooms literally filled with 3D holographic projections that users could touch, feel, smell, and talk to at will. As a way of interacting with a computer simulation, it seems believably hundreds of years beyond the current methods of using a mouse or a finger to dither around on a 2D screen.

We’re still a long way from technology that can suspend visible light (much less physical matter) in empty space as the fictional holodeck can. For now, though, the HTC Vive is a better simulation of key parts of that holodeck ideal than we had any right to expect from the early 21st century. By combining a 3D virtual reality display, position- and motion-sensitive handheld controllers, and a tracking solution that works over the scale of an entire room, the Vive transports you to a convincing simulated world that you can see and touch (even if you can’t convincingly feel it).

The characters in Star Trek didn’t have to deal with uncomfortable, slightly pixelated ski-goggle helmets or mounting tracking boxes around their living space. Still, the Vive’s ability to let you walk around and poke at a computer simulation as if it were a physical space feels like the first step toward a computing future that science fiction has spent decades training us for.

Motion control’s promise fulfilled

Of course, the idea of moving around and waving your hands to control a game or computer isn’t exactly new in consumer technology. Since the Wii became a runaway hit nearly a decade ago, in fact, the idea actually feels more like a fad whose time has come and gone. Through the Vive, though, HTC and Valve have finally delivered on the unfulfilled promise of those early motion control experiments.

Previous mass-market motion controllers all have important technical limitations that keep them from being truly convincing or useful. Nintendo’s Wii Remotes aren’t precise enough for much more than undifferentiated shaking. Microsoft’s Kinect body tracking is still a bit too rough and grainy even in its second generation, and it suffers from a noticeable delay between movement and on-screen reaction. Sony’s hand-tracking PlayStation Move controllers are more precise, but they force users to stay in a smallish area facing a TV-mounted camera.

The Vive’s Lighthouse tracker convincingly solves all of these technical issues. By triangulating with infrared lasers from boxes in opposite corners of the room, the Vive headset and controllers can report their relative positions and angles precisely and quickly in a room spanning up to 15 feet diagonally.
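The triangulation idea can be sketched in simplified 2D form: if each base station reports only the angle at which its laser sweep crossed a sensor, two such angles from stations at known positions pin the sensor down to a single point. (This is an illustrative simplification, not Valve's actual Lighthouse math; the station positions and angles below are made up.)

```python
import math

def triangulate(station_a, angle_a, station_b, angle_b):
    """Intersect two bearing rays (2D) to locate a tracked sensor.

    Each Lighthouse-style station reports only the sweep angle at
    which its laser crossed the sensor; with two stations at known
    positions, the intersection of the two rays gives the position.
    """
    ax, ay = station_a
    bx, by = station_b
    # Unit ray directions from each station toward the sensor
    dax, day = math.cos(angle_a), math.sin(angle_a)
    dbx, dby = math.cos(angle_b), math.sin(angle_b)
    # Solve station_a + t * dir_a == station_b + s * dir_b for t
    denom = dax * dby - day * dbx
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; no unique intersection")
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)

# Stations in opposite corners of a hypothetical 4m-wide room,
# both sighting a sensor at (2, 2)
p = triangulate((0.0, 0.0), math.atan2(2.0, 2.0),
                (4.0, 0.0), math.atan2(2.0, -2.0))
# p recovers (2.0, 2.0)
```

The real system works in three dimensions with timed horizontal and vertical sweeps, but the core geometric trick is the same: angles from known reference points are enough to recover position.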

It’s still not a perfect solution. I ran into some surprising moments where one or both of the Lighthouse tracking boxes would simply turn off for no apparent reason in the middle of a session (perhaps a victim of an overzealous Bluetooth auto-shutoff option). Other times a controller would disappear or begin to virtually float away from me at random. At times, the headset would lose tracking in the middle of the room, clouding my vision in gray for a few seconds.

The glitches are unfortunate, because 99 percent of the time the effect is nothing short of magic. By tracking your head, the two 1080x1200 pixel, 90 Hz panels on your face become a convincing, three-dimensional portal into a fully navigable virtual volume, showing you the appropriate view of the space wherever you stand or look. Having those pixels so close to your face means there’s some expected fuzziness to the simulated image, especially if the underlying game values crowded, highly detailed scenes over general sharpness.
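The head-tracked stereo effect boils down to rendering the scene twice per frame, once from each eye, with the two viewpoints offset horizontally by the lens spacing. A minimal sketch of that geometry, with made-up pose values and simplified math (this is not Valve's renderer):

```python
import numpy as np

def eye_positions(head_pos, head_yaw, ipd_m=0.063):
    """Compute left/right eye positions from a tracked head pose.

    head_pos: (x, y, z) position of the headset in meters.
    head_yaw: rotation around the vertical axis, in radians
              (yaw of 0 means facing down the -z axis).
    ipd_m: interpupillary distance in meters; the Vive's lenses
           adjust over roughly 0.0602-0.0745 m.
    """
    # The axis between the eyes is perpendicular to the facing direction
    right_axis = np.array([np.cos(head_yaw), 0.0, -np.sin(head_yaw)])
    half = (ipd_m / 2.0) * right_axis
    head = np.asarray(head_pos, dtype=float)
    return head - half, head + half  # left eye, right eye

# A head at standing height, facing straight ahead
left, right = eye_positions((0.0, 1.7, 0.0), 0.0)
# left is offset ~3.15 cm to one side of the head, right to the other
```

Each eye position then seeds its own view matrix, and the slight horizontal disparity between the two rendered images is what the brain fuses into depth.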

Still, the effect is convincing enough to make you flinch away when an object comes close to your face. You feel a real sense of terror when you whirl around to see an encroaching zombie practically on top of you or a real sense of vertigo looking down from a simulated mountain peak. There’s technically nothing stopping you from stepping out and hovering in the “open air” in that mountain scenario, but good luck convincing your brain it’s safe to do so.

Say hello to your hands

Being convincingly inside this volumetric 3D world solves the other big problem with previous motion control solutions: the abstraction between your hands and the TV screen. Even in first-person games, using motion controls on a flat panel was like controlling a disembodied puppet on a stage via a complex remote control. The Vive’s virtual reality, on the other hand, allows you to see the controllers perfectly positioned in virtual space, right where you’d expect to see them if you didn’t have a VR helmet blocking your view of the real world.

It’s hard to overstate how much of a difference this makes to the usefulness of accurate, trackable motion controllers. On the Wii, you just wiggle the remote and watch as a Mii character swings a racket at the same time. In a Vive game like #SelfieTennis, you actually see a ball arcing toward you, take a step or two to position yourself, then follow the ball as it makes contact with the virtual racket you've positioned to meet its oncoming trajectory precisely.

There’s still a level of abstraction with the Vive controllers, however, because holding onto them means keeping your hand wrapped around the light but sturdy plastic at all times. That means you can’t really comfortably point, flex your fingers, or make gestures like an open palm in Vive space the way you would with an unencumbered hand.

Instead, you often use the Vive’s limited set of buttons to simulate the natural motion of your fingers and palms. You might squeeze a pleasantly squishy analog index finger trigger to activate a “grabber” on the end of a stick or squeeze two “grip buttons” between your fingers and palm to pick up a virtual object. A large, clickable thumbpad allows for more fine-grained controls in floating menus, or it can simply act as a “thumbs up” detector. All in all, using the Vive controller feels like holding a tiny, prosthetic hand in your own hand and then using that fake hand to poke and grab at the virtual world without really feeling it (subtle controller vibrations notwithstanding).
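The mapping from those buttons to hand metaphors can be sketched as a simple state-to-action function. (The field and action names below are illustrative, not the actual OpenVR API; the grab threshold is an assumed value.)

```python
from dataclasses import dataclass

@dataclass
class ControllerState:
    """Simplified snapshot of a Vive-style wand's inputs.

    Field names are hypothetical, chosen to mirror the hardware
    described in the article, not Valve's real input structures.
    """
    trigger: float        # analog index-finger trigger, 0.0-1.0
    grip_pressed: bool    # side grip buttons squeezed against the palm
    pad_x: float          # thumbpad touch position, -1.0 to 1.0
    pad_y: float
    pad_clicked: bool     # thumbpad pressed all the way down

def hand_action(state, grab_threshold=0.75):
    """Map raw inputs to the hand metaphors the article describes."""
    if state.grip_pressed:
        return "hold-object"      # palm squeeze picks up a virtual object
    if state.trigger >= grab_threshold:
        return "grabber-closed"   # index trigger works a stick-end grabber
    if state.pad_clicked:
        return "menu-select"      # thumbpad clicks drive floating menus
    return "open-hand"

# A mostly-squeezed trigger with nothing else pressed closes the grabber
action = hand_action(ControllerState(0.9, False, 0.0, 0.0, False))
```

In practice each game defines its own mapping, which is exactly why the same two wands can stand in for grabbers, rackets, paintbrushes, and guns.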

Even with this remaining abstraction, it’s amazing how much being able to use your hands more or less directly opens up an entirely new language of interaction, far beyond the limited buttons, joysticks, keyboards, and mice we’ve been stuck with for decades. Instead of following a prompt to know which button pulls an in-game lever, for instance, you just know intuitively how to reach out and pull it.

No one has to tell you how to reach out and tap a button on a virtual computer console. Or how to pick up a virtual hot dog and put it in a virtual bun. Or how to point and aim a virtual gun quickly and accurately with a small twist of the wrist. Or how to throw a virtual knife at a virtual killer robot. Or how to grab both ends of a virtual inflatable support beam and stretch it like taffy by pulling your hands apart.

The controllers in your hand can become a bow and arrow, a golf club, a tennis racket, a sword, a flashlight, or a shield to block incoming attacks with ease. Other games go a little more abstract, transforming your controller into a paintbrush that can suspend color in mid-air or a laser pointer that traces out a landing path for planes. Maybe you’ll hold a spaceship and navigate it through a hail of bullets like a child with a model plane or grab a moon and hurl it into orbit around a planet with a realistic gravity well.

All of these things aren’t just possible on the Vive; they’re intuitive. Actions in this world are imbued with an immediacy and direct impact that can’t be matched by any other controller I’ve ever seen or used. There’s a level of fine-grained, instantaneous, and natural control that you start taking for granted after spending a bit of time in a Vive simulation, and it’s a feeling you start to miss when you go back to using any other controller.

But the main benefit of transitioning from buttons to hands is the lifetime of knowledge you already have regarding how your arms and hands work. Even if you’re not actively looking at the controllers in virtual space, you intuitively know where they are, how far they can reach, and how quickly you can move and twist them. Frankly, using your hands as hands is a control solution you’ve been training for throughout your entire life.