Happy Holidays! This is the last tech update of 2017, and what a year it has been. We have some new documentation for your reading pleasure over the holidays, updates for our integrations, and new information about Rift Core 2.0 and the introduction of Oculus Dash.

New Documentation

Building VR apps can be tough. We've overhauled our Best Practices guide with what we've learned about developing VR apps. This updated guide first introduces the high-level concepts you should keep in mind when designing your VR experience. The guide then dives into some specific ways to implement locomotion, user input, positional tracking, and much more. These best practices are intended to help developers produce content that provides a safe and enjoyable consumer experience on Oculus hardware.

You've been asking for a more detailed look at how to use the Platform SDK features in Unreal Engine. We've published comprehensive updates in the Platform SDK docs with information about how to use every Platform feature. Get started on the Unreal Development Getting Started page.

Unity Integrations

Within Unity, we have deprecated support for the 5.4 and 5.5 release channels, and we strongly recommend that all developers working in the 5.6 release channel use 5.6.4p2. We've added equirectangular support to VR Compositor Layers (mobile only). We've also fixed a shader issue in the Unity Sample Framework in the Avatar SDK that resulted in very long import times.

Rift Core 2.0

Rift Core 2.0 introduces substantial changes to Oculus Home and replaces the Universal Menu with Oculus Dash. We plan to roll it out to Rift users with the 1.22 runtime in early 2018.

Adding Dash support to your application produces a better user experience, and we recommend doing so when possible. All the necessary resources are now available to add support and test it before the public roll-out.

Dash re-implements the Universal Menu as a VR compositor layer. Have a look at the “Introducing Oculus Dash” video in our Welcome to Rift Core 2.0 blog post to get a sense of how it works.

Beginning with runtime 1.22, when a user pauses an application, the runtime will do one of two things instead of rendering the Universal Menu in an empty room:

If the application includes Dash support, the application will pause and the Dash menu UI will be drawn over the paused application.

If the application does not include Dash support, the application will be paused by the runtime and the user will be presented with the Dash menu UI in an empty room, similar to the way the Universal Menu is displayed in earlier runtimes.

When the Dash UI is active, the runtime renders the tracked controllers in the scene so the user can interact with the menu. Your application should pause, mute its audio, and hide any tracked controllers it renders, so the user does not see a duplicate pair of hands.
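In the native SDK, the runtime reports focus changes through `ovr_GetSessionStatus` (the `HasInputFocus` field of `ovrSessionStatus`). The sketch below models only the app-side rule described above; the `AppState` struct and `on_input_focus_changed` function are hypothetical names for illustration, not part of any Oculus API:

```c
#include <stdbool.h>

/* Hypothetical application state affected by Dash focus changes. */
typedef struct {
    bool paused;
    bool audio_muted;
    bool controllers_visible;
} AppState;

/* Apply the Dash focus rule: when the app loses input focus
   (the Dash UI is up), pause the simulation, mute audio, and hide
   any controllers the app renders; when focus returns, resume.
   In a real app, has_input_focus would come from the runtime,
   e.g. ovrSessionStatus.HasInputFocus on the Rift. */
void on_input_focus_changed(AppState *app, bool has_input_focus) {
    app->paused = !has_input_focus;
    app->audio_muted = !has_input_focus;
    app->controllers_visible = has_input_focus;
}
```

An app would poll the session status once per frame and call a handler like this whenever the focus flag changes, rather than duplicating the pause logic at every call site.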

For more information on how to implement Oculus Dash for your application, visit our Unity, Unreal, or Native integration documentation.