From gaming to big data, virtual reality gives us the chance to build and explore whole new worlds beyond the screen. As we developed demos and prototypes with the Oculus Rift internally, several UX insights emerged. Now that many of you have received your VR Developer Mounts, we thought we’d share them:

1. Maintain a comfortable distance between the viewer and layered virtual objects. In the real world, we’re not used to objects being within two inches of our noses – in fact, it can be a jarring experience! Perhaps it’s the expectation of physical impact up close, the fear of approaching objects, or the need for personal space, but our testing found this to be a recurring (and jump-inducing) problem. There are workarounds for this, such as applying a force away from the viewport to repel nodes to a comfortable distance.

Plasma Ball VR stays at a comfortable distance while being controlled by your hands, using tracking data combined with raw image passthrough.
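One way to implement the repulsion workaround mentioned above is a simple spring-like push: any object inside a minimum comfort radius gets a velocity nudge directly away from the viewer. This is an illustrative sketch only (not Leap Motion or Oculus API code); the `MIN_DISTANCE` and `strength` values are hypothetical and would need tuning per app.

```python
import math

MIN_DISTANCE = 0.75  # hypothetical comfort radius from the viewer, in metres

def repel_from_viewer(node_pos, viewer_pos, strength=2.0):
    """Return a velocity nudge that pushes a node back outside the comfort radius."""
    dx = node_pos[0] - viewer_pos[0]
    dy = node_pos[1] - viewer_pos[1]
    dz = node_pos[2] - viewer_pos[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-6
    if dist >= MIN_DISTANCE:
        return (0.0, 0.0, 0.0)  # already at a comfortable distance
    # Push outward along the viewer->node direction, harder the closer it is.
    push = strength * (MIN_DISTANCE - dist) / dist
    return (dx * push, dy * push, dz * push)
```

Applied every frame, this keeps floating objects from drifting into the viewer’s face without snapping them away abruptly.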

2. Keep ergonomics and the physical constraints of both technologies in mind. For instance, with the Oculus Rift, cords make it difficult to design a game where you turn 360 degrees. In general, it helps to constrain gameplay because well-executed limitations will ensure a smoother experience overall. When the Leap Motion Controller is mounted on a VR headset, it can see beyond your virtual field of view, but your hands will occasionally fall out of sensor range. Design interactions that won’t be disrupted (or will break and resume gracefully) when the user turns their head or drops their arms.

The open-sourced Leap Motion VR Intro uses interactions designed to work seamlessly when your hands are in view – from flying in space to playing with floating spheres.
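The “break and resume gracefully” idea can be modeled as a small state machine: when the hand drops out of sensor range, the interaction stays alive for a short grace period in case the hand comes right back, and cancels cleanly otherwise. This is a minimal sketch assuming a boolean hand-visibility signal per frame; the half-second grace period is an assumption to tune per interaction.

```python
import time

class HandInteraction:
    """Pause an interaction when the hand leaves sensor range; resume it if the
    hand returns within a short grace period, otherwise cancel cleanly."""
    GRACE_SECONDS = 0.5  # assumed tolerance; tune per interaction

    def __init__(self):
        self.active = False
        self._lost_at = None

    def update(self, hand_visible, now=None):
        now = time.monotonic() if now is None else now
        if hand_visible:
            self._lost_at = None
            self.active = True
        elif self.active:
            if self._lost_at is None:
                self._lost_at = now          # hand just dropped out of range
            elif now - self._lost_at > self.GRACE_SECONDS:
                self.active = False          # cancel; don't leave state dangling
        return self.active
```

Brief occlusions (a head turn, a dropped arm) then read as a pause rather than a jarring reset.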

3. As always, feedback is essential. By using an onscreen hand (rigged or otherwise), your app can provide constant feedback and let users reach into virtual scenes with confidence. Even if some gestures don’t impact the data at all (e.g. a random hand wave), it’s still more satisfying to get some form of feedback, perhaps by moving surrounding objects just a bit – as if your hand created a light breeze.

The rigged hand is available for Unity from our V2 Skeletal Assets. Be sure to check out the full VR example gallery – we’ll have more starter resources for VR development soon!
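The light-breeze feedback could be sketched as a tiny impulse applied to objects near the hand, in the direction the hand is moving, with a linear falloff by distance. This is an illustrative function, not part of any SDK; `radius` and `scale` are hypothetical parameters.

```python
def breeze_impulse(hand_velocity, hand_pos, obj_pos, radius=0.3, scale=0.05):
    """Give nearby objects a tiny nudge in the direction the hand is moving,
    so even 'meaningless' waves produce visible feedback."""
    dist = sum((h - o) ** 2 for h, o in zip(hand_pos, obj_pos)) ** 0.5
    if dist > radius:
        return (0.0, 0.0, 0.0)
    falloff = 1.0 - dist / radius  # strongest right next to the hand
    return tuple(v * scale * falloff for v in hand_velocity)
```

Because the effect scales with hand velocity, a gentle wave produces a gentle stir, which keeps the feedback feeling physical rather than arbitrary.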

4. Think about how to make 3D data accessible. VR gives us the ability to access and reach into vast amounts of 3D data in entirely new ways. But as we found with our early data experiments, clustering data in 3D space results in occlusion and makes it harder to skim. In part, head positioning helps us understand 3D data better because moving the head laterally assists our sense of depth perception, but there’s no substitute for experimentation when it comes to designing new ways to access content in three dimensions.
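One possible experiment along these lines is occlusion-aware fading: find the nodes sitting between the camera and the point the user is focused on, and fade them out so the cluster stops hiding its own interior. The sketch below is one assumed approach (not from the original post), using a perpendicular-distance test against the line of sight; the `threshold` is a hypothetical tuning value.

```python
import numpy as np

def occluders(camera, focus, nodes, threshold=0.1):
    """Return indices of nodes that sit between the camera and the focused
    point, close to the line of sight -- candidates for fading out."""
    camera, focus, nodes = np.asarray(camera), np.asarray(focus), np.asarray(nodes)
    ray = focus - camera
    ray_len = np.linalg.norm(ray)
    ray_dir = ray / ray_len
    rel = nodes - camera
    t = rel @ ray_dir                        # distance along the line of sight
    in_front = (t > 0) & (t < ray_len)       # strictly between camera and focus
    perp = np.linalg.norm(rel - np.outer(t, ray_dir), axis=1)
    return np.where(in_front & (perp < threshold))[0]
```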

5. Design content consumption according to user goals. 3D data from a distance looks beautifully epic, but for it to be actually useful, the content needs to be consumed in various ways, depending on the use case. The user might have multiple goals – so break them down. For example, imagine a 3D graph that maps out how a tweet goes viral, spreading from person to person:

- See the entire data cluster from far away.
- Fly around it to see its shape.
- Note an interesting node, which represents a user account.
- Zoom all the way into that node to see their feed appear.
- Zoom back out a little bit.
- Zoom into another node with a retweet connection to the first node.

Imagine how you would navigate and explore large data sets like this network visualization from Los Alamos National Laboratory.
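The navigation flow above could be modeled as a small state object that tracks the current zoom level and focused node, and can surface related nodes (such as retweet connections) when the user zooms back out. This is a minimal sketch with hypothetical names; a real visualization would layer camera transitions and UI on top of it.

```python
class GraphNavigator:
    """Sketch of the flow above: overview -> node detail -> back out -> related node."""

    def __init__(self, edges):
        self.edges = edges        # e.g. retweet connections between accounts
        self.level = "overview"
        self.focus = None

    def zoom_into(self, node):
        self.level, self.focus = "detail", node   # the node's feed appears

    def zoom_out(self):
        self.level = "overview"   # keep focus so related nodes stay highlighted

    def related(self):
        """Nodes one hop from the current focus (e.g. a retweet connection)."""
        return ({b for a, b in self.edges if a == self.focus}
                | {a for a, b in self.edges if b == self.focus})
```

Breaking the experience into explicit states like this makes each user goal in the flow testable on its own.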

Whether you’re building a virtual gameworld or a universe of data, user flow is a powerful concept to help you build a compelling and enjoyable experience. If you’re unfamiliar with user flows, Signal v. Noise has a cool article to help get you started.

What sorts of experiences are you developing in VR? Any tips for your fellow developers? Let us know in the comments!