Motivation

With the Blender Game Engine (BGE) we would have a Free Software alternative to Unity for virtual reality demos with the Oculus Rift. Existing BGE demos could be ported easily, and BGE makes it easy to create and import assets for a demo.

Status of the Rift in Free Software

Since its release in March, the Oculus Rift has seen rather good adoption in proprietary game engines: Source, UDK3 and, above all, Unity embraced the new VR technology early. Sadly the community was only given Unity and C/C++ as tools, so most current demos are done with Unity. Free Software projects like Blender were rather sceptical about an implementation due to the proprietary licensing of the Oculus SDK. The SDK license demonstrates that open source does not equal free software.

Only a few Oculus demos were available for GNU/Linux, since the official SDK had not been released for it. Because of that, proprietary titles like Team Fortress 2 do not include Rift support on Linux either. The Unity demos are also built only for OS X and Windows, not for Linux.

Including Rift input to your Blender Game Engine Demo

The head tracking comes from OpenHMD, which python-rift uses. If there is no package for your distribution, install it in /usr/local, write a Debian package, etc.

To build python-rift, run “./setup.py build”. Then symlink the resulting .so file into your ~/.blender directory so that Blender’s bundled Python can find it.

Since it is in your Blender’s Python path now, you can initialize the PyRift object in BGE like this.
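A minimal sketch of that initialization, assuming the symlinked module imports as `rift` and that `PyRift` takes no constructor arguments (both are assumptions, not a verified API):

```python
# Attach this to a Python controller that fires once at scene start.
# The module name "rift" is an assumption: it is whatever .so file
# you symlinked into ~/.blender.

def init_rift(cont):
    from rift import PyRift  # imported inside BGE's Python
    # Keep the tracker on the owning object so per-frame logic
    # bricks can reuse it instead of re-creating it.
    cont.owner["rift"] = PyRift()
```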

The rotation is acquired as a quaternion. Note that OpenHMD uses XYZW order, but Blender uses WXYZ. I got headaches not only from figuring that out, but also from the wrong rotation while I had the HMD on.

You can transform your camera like this.
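Roughly like the following sketch, which assumes python-rift exposes a `poll()` method and a `rotation` tuple in XYZW order (inferred names, not a verified API):

```python
def xyzw_to_wxyz(q):
    # Reorder from OpenHMD/python-rift (XYZW) to Blender's
    # mathutils convention (WXYZ).
    x, y, z, w = q
    return (w, x, y, z)

def track_camera(cont):
    # Attach to the camera via an Always sensor with true-level pulse.
    from mathutils import Quaternion  # available inside Blender only
    own = cont.owner
    rift = own["rift"]  # the PyRift object from the init step
    rift.poll()         # assumed: refresh the sensor data
    own.worldOrientation = Quaternion(xyzw_to_wxyz(rift.rotation))
```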

If you have a better way to do it, please tell me. This was a rather quick hack, but it is pretty functional.

Nice…! The camera moves, for me with amazingly low latency. Unreal didn’t have such low latency on Windows.

Rendering for the Oculus Rift in BGE

As we know, the BGE supports various types of stereoscopic rendering. One of them is basically the one required by the Rift: Side-By-Side. The only thing we need to do now is the reverse lens distortion transformation, which we can achieve with a simple fragment shader.
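The warp itself is just per-fragment math. As a sketch, here is the usual barrel-distortion polynomial in plain Python; the real version runs in GLSL, and the default coefficients below are illustrative, not calibrated Rift values:

```python
def warp(uv, center=(0.5, 0.5), k=(1.0, 0.22, 0.24, 0.0)):
    # Push each texture coordinate outward from the lens center by a
    # polynomial in the squared radius. Sampling the rendered scene at
    # the warped coordinate cancels the pincushion distortion that the
    # Rift's lenses would otherwise introduce.
    dx = uv[0] - center[0]
    dy = uv[1] - center[1]
    r2 = dx * dx + dy * dy
    scale = k[0] + k[1] * r2 + k[2] * r2 * r2 + k[3] * r2 * r2 * r2
    return (center[0] + dx * scale, center[1] + dy * scale)
```

The fragment shader does the same per fragment, once per eye, with the lens center offset for each half of the Side-By-Side image.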

Different versions of this fragment shader have appeared on the net. A good explanation of the method can be found on the FireBox page; another version is included in the OpenHMD examples.

Sounds good, huh? Yeah, but it did not work. The fragment shader transformed the Side-By-Side rendering asymmetrically, so that the left eye was smaller than the right.

The interesting thing is that the output of the shader is symmetrical when rendered with other stereo options, including Above-Below, and without stereo. I asked for help in a Blender Stack Exchange post and on the BlenderArtists forum. Moguri from the forum came up with this patch, which fixes the issue. Hooray, Rift support is complete.

As I noticed, other people were trying to achieve this in Blender and had similar problems due to this bug.

If you want Oculus Rift rendering support, try my example blend file and apply Moguri’s patch to Blender.