True hand-based input will unlock new mechanics for VR developers and creators alike. Hand tracking on Quest will let people be more expressive in VR and connect on a deeper level in social experiences. Not only will the current community of VR enthusiasts and early adopters benefit from more natural forms of interaction, but hand tracking on Quest will also lower the barriers to entry for people who may not be familiar or comfortable with gaming controllers. Even better, your hands are always with you and always on: you don’t have to grab a controller, keep it charged, or pair it with the headset to jump into VR. From entertainment to education and enterprise, the possibilities are massive.

At OC6, we demonstrated this new technology, which will launch on Quest in early 2020 as an experimental feature for consumers and an SDK for developers. Starting early next year, VR devs will be able to build and ship experiences that let you use your own hands in VR without controllers or other peripheral devices, and the Quest community can get an early taste of what’s in store by opting in to hand tracking.

What began as a research project at Facebook Reality Labs was brought to life through close collaboration with our product and design teams, establishing a new paradigm for VR input. Our computer vision team developed a new deep learning method that understands the position of your fingers using just the monochrome cameras already on Quest: no active depth-sensing cameras, additional sensors, or extra processors required. The technology approximates the shape of your hand and generates a set of 3D points that accurately represent your hand and finger movement in VR. Click here to learn more about the AI magic under the hood.
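To make the output format concrete, here is a minimal sketch of what an app might do with a hand represented as a set of 3D points. This is not the actual Quest SDK API: the class, the keypoint count (21 points per hand, a common convention in hand-tracking literature), the joint indices, and the pinch threshold are all illustrative assumptions.

```python
import math
from dataclasses import dataclass

# Assumed layout (not confirmed by the post): 21 keypoints per hand,
# 1 wrist plus 4 joints per finger. Indices below are illustrative only.
NUM_KEYPOINTS = 21
WRIST, THUMB_TIP, INDEX_TIP = 0, 4, 8

@dataclass
class HandPose:
    """One tracked hand as a list of (x, y, z) points in meters."""
    keypoints: list  # length NUM_KEYPOINTS, each a (float, float, float) tuple

    def distance(self, a: int, b: int) -> float:
        # Euclidean distance between two tracked keypoints.
        ax, ay, az = self.keypoints[a]
        bx, by, bz = self.keypoints[b]
        return math.sqrt((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2)

    def is_pinching(self, threshold_m: float = 0.02) -> bool:
        # Approximate a pinch gesture as thumb tip and index tip
        # coming within ~2 cm of each other.
        return self.distance(THUMB_TIP, INDEX_TIP) < threshold_m

# Usage: tips 1 cm apart register as a pinch.
pose = HandPose(keypoints=[(0.0, 0.0, 0.0)] * NUM_KEYPOINTS)
pose.keypoints[THUMB_TIP] = (0.10, 0.00, 0.30)
pose.keypoints[INDEX_TIP] = (0.10, 0.01, 0.30)
print(pose.is_pinching())  # True
```

Gesture recognition built on keypoint distances like this is one way controller-free input could map to the kinds of mechanics described above, though real systems typically add filtering and per-frame confidence handling.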

We look forward to sharing more in the months ahead.