

The world is becoming crazier and more exciting as virtual reality becomes more real and the lines between our physical world and the virtual world blur. Next year, 2016, will be an important one as companies finally make virtual reality real, with two main players in the race: HTC/Valve and Facebook/Oculus will both ship VR products. Microsoft is partnering with both Valve and Oculus, while Sony is pushing hard not to miss the bandwagon with Morpheus.



User behavior in VR has yet to take shape and will keep evolving over the next five years. The idea of information conveyance is interesting and plays a special role in virtual and augmented reality. We are learning to design cues in the VR world that tell users where to move, where to go, or what to touch. This is still a challenge because we don't think about it much in the physical world: we walk into a room assuming the chairs are movable and the cabinets are not. That sense of reality changes in VR, where anything can happen to everything. It is like living in a dream. Those who have experienced the HTC/Valve Vive were completely fascinated by the experience; the HTC Vive is clearly ahead of the game.



In 2D gaming and in the real world, we have arrows and signage to tell people where they can go. In VR, a user wearing a head-mounted display is actually operating in two realities: the real one, in which they know they are wearing the gear, and the virtual one, which is so immersive (more so with HTC/Vive's room-scale technology, which lets you walk around the room) that it is difficult to find a frame of reference.

The question is: do we want users to feel they are wearing a space suit, or to feel they are acting normally with no gear on at all? The awareness of wearing a suit reminds people they are in a VR world, unless the suit feels like ordinary clothing. Body motion sensing shines when body movements are interpreted by motion sensors as input and turned into commands and controls. Basically, with this technology, you are the controller.
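To make the "you are the controller" idea concrete, here is a minimal sketch of how tracked body motion might be mapped to commands. The thresholds, gesture names, and function signature are all illustrative assumptions, not any vendor's actual API:

```python
# Hypothetical sketch: turning a tracked hand-position delta into a command.
# Coordinates are (x, y) in metres; threshold is an assumed sensitivity.

def interpret_motion(prev_pos, curr_pos, threshold=0.15):
    """Classify a hand movement between two tracked samples as a command."""
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    if dy > threshold:
        return "RAISE"        # hand moved up: e.g. open a menu
    if dx > threshold:
        return "SWIPE_RIGHT"  # hand moved right: e.g. next item
    if dx < -threshold:
        return "SWIPE_LEFT"   # hand moved left: e.g. previous item
    return "NONE"             # movement too small to count as a gesture

print(interpret_motion((0.0, 0.0), (0.0, 0.3)))  # RAISE
```

A real system would of course track full skeletal poses over time rather than a single point, but the core loop is the same: sensor samples in, discrete commands out.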



KOR-FX is making an interesting VR gaming vest that captures acoustic feedback and processes it into haptic feedback. Its software analyzes the audio according to what is happening on screen and what the user is doing, so the sound is felt as well as heard. The company uses the term "acousto-haptics" to describe acoustic feedback that is physically felt.
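KOR-FX's actual processing is proprietary, but the general acoustic-to-haptic idea can be sketched simply: take the amplitude envelope of an audio buffer and map it to vibration intensity. The window size and scaling factor below are assumptions for illustration:

```python
# Illustrative acousto-haptics sketch: derive per-window vibration levels
# from the RMS amplitude of an audio buffer. Samples are floats in -1..1.
import math

def haptic_intensity(samples, window=256):
    """Return vibration levels (0.0-1.0), one per window of audio samples."""
    levels = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        # RMS gives a rough loudness estimate for this slice of audio
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        levels.append(min(1.0, rms * 2.0))  # scale into motor range, clamp
    return levels
```

A shipping product would also filter by frequency band (a bass rumble should feel different from gunfire), but loudness-to-intensity is the essence of the trick.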

Sixense is another company developing a wireless solution for hand input in virtual reality. Its STEM System uses an alternating current (AC) electromagnetic field to accurately track body position and orientation within a radius of a stationary base.



In designing a VR experience, the haptic side of things is perhaps the most difficult to address. Oculus has yet to demonstrate an "official" control mechanism for the Rift, instead letting each developer use the peripherals they find most suitable. I speculate they may end up with visual hand-tracking from a front-mounted camera. So what is the best way to navigate the VR world? Input is still a key challenge: pure look-based movement (turning your head) can be supplemented with analog input (a mouse in your hand, for example), but sometimes the head looks in one direction while the hands point in another, and the two fall out of sync. This can cause problems, as if the system were receiving two very different commands.
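One hedged sketch of handling that head/hand mismatch: measure the angle between the gaze direction and the controller direction, and refuse to issue a movement command when they diverge too far, rather than guessing between two contradictory inputs. The 45° threshold and the "ignore on conflict" policy are assumptions, not any shipping system's behavior:

```python
# Hypothetical conflict check between gaze and hand input directions.
# Directions are 3D unit-ish vectors; divergence is measured in degrees.
import math

def angle_between(v1, v2):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # Clamp to guard against floating-point drift outside acos's domain
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def movement_input(gaze_dir, hand_dir, max_divergence=45.0):
    """Use gaze for movement unless the hand disagrees too strongly."""
    if angle_between(gaze_dir, hand_dir) > max_divergence:
        return None  # ambiguous input: drop the frame rather than guess
    return gaze_dir

print(movement_input((0, 0, 1), (1, 0, 0)))  # 90° apart -> None
```

Other reasonable policies exist (always trust the hand, or blend the two vectors); the point is that the system must pick one deliberately instead of acting on both.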

Expanding information conveyance is still early in the usability learning curve, and we are a few years away from a dominant model. There are still many creative options to explore, and they are not mutually exclusive: a voice-based system guiding the user, for example. Once we start adding social interaction in shared virtual environments, it will explode. Social VR means social presence, and that is a $20 billion business at a minimum.



The evolution of inputs into some form of controller is inevitable. Xbox and PlayStation will need to think about how to migrate their current controllers to the new requirements (not easy), and we can expect more innovation from start-ups, whether controllers such as the Razer Hydra from Sixense or camera-based systems, which are limited by the camera's line of sight. HTC/Valve is also working on some cool solutions, aiming to provide the best, not just the first.

Immersive VR systems that alter your sense of physical reality can give users the illusion of being located inside an unreal virtual environment: a kind of telepresence, the "sense of being there" or "feeling of being there." In the early 1990s this idea was transplanted to virtual reality, where instead of being at a remote physical environment, the participant was in a virtual environment, with a sense of being at the place depicted by the virtual displays.



A Swiss healthcare hybrid, MindMaze, announced its pioneering move into the VR and gaming spaces with MindLeap. Combining headset-mounted neural sensors and motion-capture cameras, the system is designed to "facilitate neuro-powered immersive virtual and augmented reality" on console platforms such as Xbox, PlayStation, and Android. This is still in development, and I am not sure it can be done soon. MindMaze leverages technology it developed for a range of medical applications treating neurological deficits, such as helping amputees control robotic limbs. Neurological control is still a bit of science fiction, and it has a long way to go before it can be mass-commercialized. I would like to dial my HTC M9 with my brain some day.

For now, we will stick with head-mounted displays and handheld controllers. They are enough for us to re-create boundaries and rethink presence. Every time you pick up an HTC Vive, you are about to enter a virtual boundary that didn't exist before. Exciting, and also scary!



