Working in Virtual Reality (VR), and even more so in WebVR (browser-based VR), is very exciting, because we are at the edge of a new technology where everything has yet to be invented. VR is a whole new medium, and as at the start of the mobile era a few years ago, we have to decide what the new behaviors will be.

VR User Experience/User Interface (UX/UI) is a big part of it. Some great designers in big companies are working on solutions; I encourage you to read this article from our UX/UI designer at Beloola, which sums up some of that work: “5 lessons I learned from designing the VRUI of Beloola”

On the other hand, there is the big question of the navigation system. The user is now totally immersed in the experience, which is why accessing the classic computer peripherals has become very difficult.

Optitrack motion capture (source)

As a consequence, we must rethink how users control motion and find new controllers. Moreover, navigating in VR is very different from classic navigation in a ‘flat’ 3D environment: we now have to think in 360° and engage the user in all directions. Another major challenge of VR is the motion sickness that can occur while using a Head-Mounted Display (HMD).

Drawing on my experience as a WebVR developer, I’ll try to show you how I perceived and dealt with these challenges.

New controllers in VR: the challenge of putting the best feedback in the end consumer’s hands

A new medium implies new controllers. The number of hardware companies involved — both on the HMD side and on the controller side — is increasing over time. Every meeting or conference brings along newcomers trying to take a step up in terms of ease of use and movement detection.

From minimalist finger controllers to whole-hand input and full motion tracking systems, every system has its own advantages and product target.

Reverend Kyle VR Podcast (source)

But the problem may be that none of the biggest HMD manufacturers has announced any partnership with those companies yet (this article being written 2 days before Oculus Connect 2, I might be proven wrong). Oculus has announced that a Microsoft XBOX gamepad will be delivered with each HMD, while the Oculus Touch will probably be sold separately. The HTC Vive comes with its own motion controllers, and the other HMD manufacturers haven’t been very clear about it so far.

So every VR creator asks the same questions: which controller should I choose for my experience, will users be able to afford it, and why this one instead of another? And even though not every controller fits every experience, developing for each ‘suitable’ controller would be really time-consuming, and the user experience would have to differ between them.

From a limited field of view to a total immersion

A user sitting in front of a computer has a field of view between 37° and 90°. When immersed in VR, the whole sphere becomes available to their eyes, as it would be in real life.

VR is all around us (source)

From this perspective, sitting in front of your computer might become a problem. Looking behind you becomes a repeated gymnastic exercise that you weren’t expecting to do, and one that isn’t very practical either.

The experience has to let the user look around without being restrained by their ‘real’ environment. How many times have we seen people in VR get their cables stuck in their chair while turning around, or hit their forehead on the desk while trying to look down?

People also like to relax and feel comfortable on their chair or couch. We shouldn’t ask users to change position every time they want to try VR in that kind of comfy situation.

The VR experiences we are building have to account for all of those behaviors and try to fit them, instead of trying to change habits. The user is king, after all ;)

Motion sickness in VR: the common enemy

The Holy Grail everyone in VR is (or at least should be) looking for is a motion-sickness-free experience. Motion sickness in VR is induced by 2 main factors:

Image refresh rate: if the refresh rate is lower than what the brain can process, the mismatch creates a discord that makes the user sick.

Disparity between visual and vestibular stimuli: when the user experiences quick, non-continuous motions like running or jumping while their body stays still, the brain receives conflicting stimuli and can’t reconcile them.

The first factor is related to the system used and to how the program computes the data. It’s still very challenging to reach the minimum advised refresh rate (typically 90 fps on desktop HMDs, 60 fps on mobile), and it requires work on both the software and hardware sides.

The second factor is directly related to the subject of this article. Indeed, it is our responsibility as experience makers to provide the most enjoyable one for the user.

I won’t put an image of motion sickness here, you all know what it is ;)

Possible pieces of a solution

Considering all those factors, building a navigation solution for VR is really challenging. Moreover, WebVR has the great ability to be portable across many devices: computers with the Oculus Rift, smartphones with Cardboard and the Samsung Gear VR, and probably more to come.

In developing a similar experience on each of those devices, the controller problem was the first obstacle we ran into. The keyboard and mouse should be removed from the equation right away, both because the user cannot see where their hands are and because using them breaks the immersion. The XBOX gamepad is not easy (if not impossible) to plug into a smartphone. And forcing users to buy a specific controller for your experience is not something I recommend.

So, why not just get rid of all controllers? It might seem crazy, and I’m not saying it should be the case for your whole experience, but it would allow a basic use of it without users being turned away because they don’t have the right controller.

Taking inspiration from the obvious way of handling displacement used by Cloudhead Games with the Vive (called Blink locomotion), and from the UX designs of Josh Carpenter at Mozilla, we came up with a very simple way to move around without using any controller.

The ‘gaze and teleport’ system

The user’s UI is located at their feet, so as not to get in the way of the 3D displays. It is placed at a fixed angle of 30° below the direction the user is facing, and it rotates with the user’s avatar whenever they turn around by more than 60°. This way the user always stays in a comfortable position to access the UI.
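The placement rule above boils down to a small amount of yaw bookkeeping. Here is a minimal sketch of it (the names and the exact update function are illustrative, not Beloola’s actual code): the UI keeps its yaw until the user has turned further than the 60° threshold, at which point it snaps back under their gaze.

```javascript
// Sketch of the 'follow with hysteresis' rule for the foot UI.
const UI_PITCH_OFFSET = -Math.PI / 6;   // 30° below the viewing direction
const REFOLLOW_THRESHOLD = Math.PI / 3; // 60° of yaw before the UI follows

// Wrap an angle into (-PI, PI] so yaw differences compare correctly.
function wrapAngle(a) {
  while (a > Math.PI) a -= 2 * Math.PI;
  while (a <= -Math.PI) a += 2 * Math.PI;
  return a;
}

// Given the head's current yaw and the UI's current yaw, return the UI's
// new yaw: unchanged while the user stays within the threshold, snapped
// back under the user's gaze once they turn further than 60°.
function updateUiYaw(headYaw, uiYaw) {
  const delta = wrapAngle(headYaw - uiYaw);
  return Math.abs(delta) > REFOLLOW_THRESHOLD ? headYaw : uiYaw;
}
```

Calling `updateUiYaw` every frame keeps the panel stable for small head movements while still guaranteeing it is never more than a comfortable turn away.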

The VR UI in Beloola is for now very simple and only allows toggling the navigation. To do so, the user focuses the reticle on the interface button for a second, until the reticle changes into a loading cursor indicating that an action is about to launch. Once the cursor is fully loaded, the icon image changes to show that the action was completed.
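The dwell mechanic described above is sometimes called a “gaze fuse”. A minimal sketch, with hypothetical names (this is not Beloola’s actual implementation): the reticle must stay on the same target for a full second before the action fires, and the elapsed fraction drives the loading-cursor animation.

```javascript
const DWELL_MS = 1000; // one second of sustained gaze before the action fires

class GazeFuse {
  constructor(onComplete) {
    this.onComplete = onComplete; // callback invoked when the fuse completes
    this.target = null;
    this.elapsed = 0;
  }

  // Call every frame with the object currently under the reticle (or null)
  // and the frame time in ms. Returns the cursor fill ratio in [0, 1].
  update(target, dtMs) {
    if (target !== this.target) {
      this.target = target;        // gaze moved: restart the fuse
      this.elapsed = 0;
    } else if (target !== null) {
      this.elapsed += dtMs;
      if (this.elapsed >= DWELL_MS) {
        this.onComplete(target);   // fire once, then reset
        this.target = null;
        this.elapsed = 0;
      }
    }
    return Math.min(this.elapsed / DWELL_MS, 1);
  }
}
```

Note that looking away at any point resets the timer, which is what prevents accidental activations while the user simply looks around.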

When navigation is turned on, the user can look around, and when they look at the ground an arrow appears to show the spot they are looking at, which will be their destination. If they focus on the same location for a second, the loading cursor appears again before the user is teleported there.
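Finding where the arrow should be drawn amounts to intersecting the gaze ray with the ground. A real scene would use the engine’s raycaster against actual geometry; as a self-contained sketch (hypothetical helper, flat ground assumed at y = 0), the math is just a ray-plane intersection:

```javascript
// Intersect the gaze ray with the ground plane (y = 0) to find where the
// teleport arrow should be drawn. `origin` and `direction` are plain
// {x, y, z} vectors; returns null when the user isn't looking down.
function gazeGroundPoint(origin, direction) {
  if (direction.y >= 0) return null;   // looking level or up: no ground hit
  const t = -origin.y / direction.y;   // distance along the ray to y = 0
  return {
    x: origin.x + t * direction.x,
    y: 0,
    z: origin.z + t * direction.z,
  };
}
```

The returned point doubles as the teleport destination once the dwell timer completes.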

To teleport the user while avoiding motion sickness, a black box surrounds the user’s camera, blindfolding them and preventing them from seeing the motion. Once the view is hidden from the scene, we teleport the user to the new location and then remove the blindfold.
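The blindfold sequence can be driven per frame as a tiny state machine. A sketch under assumed names (the fade speed and the state shape are illustrative): the opacity of the black shell ramps up, the camera only jumps once the view is fully hidden, then the shell fades back out.

```javascript
const FADE_SPEED = 4; // opacity units per second, i.e. ~250 ms each way

// Advance the blindfold-teleport state machine by dt seconds.
// state: { phase, opacity, camera, destination }
function updateTeleport(state, dt) {
  if (state.phase === 'fade-out') {
    state.opacity = Math.min(state.opacity + FADE_SPEED * dt, 1);
    if (state.opacity === 1) {
      // Fully blind: safe to move the camera without the user seeing it.
      state.camera.position = state.destination;
      state.phase = 'fade-in';
    }
  } else if (state.phase === 'fade-in') {
    state.opacity = Math.max(state.opacity - FADE_SPEED * dt, 0);
    if (state.opacity === 0) state.phase = 'done';
  }
  return state;
}
```

The key invariant is that the camera position only ever changes on a frame where the opacity is 1, so the user never perceives the displacement itself.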

Concerning the avatar’s orientation, and therefore what the user is looking at, we bound the space key on desktop and the touch event on mobile (available on the Cardboard v2 and Samsung Gear VR) to trigger a rotation of PI/8 radians (22.5°), so the user can turn around without the motion being too violent. This is the only external input we use, and it is necessary for the user to enjoy the experience without moving too much physically. It could eventually become an interface button as well.

Demonstration of how the VR navigation system works. This is an in-game capture with a lower FPS than the game, due to the screen-casting software.

It’s very intuitive once you get how it works. #WIVR September 20th 2015 @ Linden Lab

With the arrival of VR and this new way of living an experience, we, as experience creators, will have to think of new ways to interact. Just as smartphones naturally brought the touch event, I really think the ‘lookat’ event will become one of the main inputs in VR in the near future.

The navigation system we developed is far from perfect, and we still need a lot of user feedback to get it to feel right. But so far it allows us to offer a similar experience on the Oculus Rift, Cardboard, and Gear VR, and it will be easily portable to other HMDs as well.

We might add some controllers on top of it to allow more interactions with the environment. But at least this way our basic layer of interaction is guaranteed without any external help, and that is part of our desire to let as many people as possible access our experience and bring VR to the masses.

Thanks for reading, and go create some awesome experiences ;)

If you have any reaction or question about this article, I’d be happy to discuss it with you in the comments below, or you can ping me on Twitter @thomasbalou!