The Future of VR Ergonomics

Software: Immersed Comfort Mode

One of the principles the Immersed team codes by is making software that works for you. The barrier to learning should be minimized, and software can do that by augmenting the user’s experience to be more user-friendly. Just as one of our goals is to remove the need for controllers altogether and instead let users manipulate the virtual world with their bare hands (by running our A.I./computer-vision software on your laptop webcam to detect your body, similar to an Xbox Kinect), the entire user experience in the virtual productivity world needs to be optimized as well. For example, we could follow a user’s historical screen-brightness trends from their Mac to auto-set optimal screen brightness in VR, rather than requiring the user to manually adjust the brightness of each screen. Or, we could reduce necessary neck movement, range, and strain by increasing the user’s field-of-view sensitivity, making their monitors float into view rather than expecting the user to turn their head all the way to see a desired monitor.
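As a rough illustration of the brightness idea, a minimal sketch might keep a per-hour running average of the brightness levels the user actually picks, then suggest that level for new virtual monitors. The class and parameter names here are hypothetical, not part of the Immersed product:

```python
class BrightnessModel:
    """Learn a user's preferred screen brightness per hour of day from
    their manual adjustments, then suggest a level for new VR screens.
    Illustrative sketch only; not the Immersed implementation."""

    def __init__(self, smoothing=0.3, default=0.7):
        self.smoothing = smoothing  # weight given to each new sample
        self.default = default      # fallback brightness (0.0 - 1.0)
        self.by_hour = {}           # hour of day -> smoothed brightness

    def record(self, hour, brightness):
        """Fold one manual adjustment into the per-hour estimate
        using an exponential moving average."""
        prev = self.by_hour.get(hour, brightness)
        self.by_hour[hour] = (1 - self.smoothing) * prev + self.smoothing * brightness

    def suggest(self, hour):
        """Suggested brightness for a screen opened at this hour."""
        return self.by_hour.get(hour, self.default)
```

With history like `record(9, 0.8)` then `record(9, 0.6)`, `suggest(9)` drifts toward the user's recent morning preference (0.74), while hours with no history fall back to the default.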

Hardware: Project Half Dome

The team at Oculus is hard at work fixing the vergence-accommodation conflict with Project Half Dome. In today’s headsets, the screen sits at a single, fixed focal plane, so your eyes never change focus (accommodation) even as they still converge on “closer” virtual objects; that mismatch between convergence and focus strains the eyes and limits how sharp nearby detail can appear. As a solution, the Half Dome team at Oculus will use eye-tracking technology to detect pupillary convergence and then mechanically adjust the screens’ distance from the user’s eyes so the focal plane matches the depth the user is looking at. This will bring human eyes in VR closer to natural movement and functionality.
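The geometry behind detecting convergence is straightforward: if each eye rotates inward by some angle from straight ahead, the two gaze rays cross at a distance determined by that angle and the distance between the eyes (the IPD). A small sketch of that triangulation, purely illustrative and not Oculus code:

```python
import math

def fixation_distance(ipd_m, inward_angle_rad):
    """Estimate the distance (in meters) to the point both eyes converge on.

    Each eye rotates inward by `inward_angle_rad` from parallel gaze;
    the gaze rays then cross at d = (ipd / 2) / tan(angle). A varifocal
    display could use this depth to reposition its focal plane.
    Hypothetical helper for illustration only.
    """
    if inward_angle_rad <= 0:
        return math.inf  # parallel gaze means focus at infinity
    return (ipd_m / 2) / math.tan(inward_angle_rad)
```

For a typical 64 mm IPD, looking at an object 0.5 m away means each eye turns inward by `atan(0.032 / 0.5)`, about 3.7 degrees, and the function recovers the 0.5 m fixation distance from that angle.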