How To Live Stream Mixed Reality

Introduction

With virtual reality rapidly gaining popularity, it’s becoming a great medium for generating live stream content. However, it can be hard to translate the experience to an external viewer if you’re only capturing the first-person view.

The solution to this problem is Mixed Reality. Most people are familiar with Augmented Reality, which overlays computer graphics over real life. Mixed Reality is essentially the opposite, where you overlay real life over computer graphics.

The end result is a much more interesting point of view of what the player is experiencing in-game. You can see an animated example of this below for Google’s Tilt Brush.

Before we get started, I wanted to note that successfully getting everything working the first time will require a fair bit of tinkering and time. When I first tried mixed reality back in August 2016, there weren’t as many tools available. Luckily the VR and streaming community have released some handy tools since (January 2017 at the time of writing), but the whole process is still quite complicated.

1. Stage Setup

Required Equipment

- Green screen (the larger the better)
- 2x box lights on stands

When setting up the stage for your mixed reality live stream you will need to consider how you want to shoot the subject. The more green screen space you have, the wider the shot you can take, which is essential if you plan on panning the camera. If the camera pans outside of the green screen, it will ruin the immersion for the viewer, as you will no longer see the in-game graphics.

The two box lights are essential when shooting with a green screen, as they greatly reduce shadows and create an even green, which makes the chroma keying (removing the green background from the video) much easier.

*Update* I’ve written a blog post about the best way to setup a green screen for a live stream.

2. Hardware Setup

Required Equipment

- VR-ready PC
- Monitor that supports at least 2K resolution
- Video camera
- Video capture card
- HTC Vive
- *Optional* 3rd Vive controller
- *Optional* If using a 3rd controller, you will need either a USB extension cord or a Steam Controller dongle flashed with custom firmware (additional information here too)

It’s important that you have a machine powerful enough to run VR content at 2K resolution at a constant frame rate. The reason for 2K resolution is that, due to the way mixed reality content is captured, the output resolution is halved in each dimension. With a standard 1080p monitor, the final output resolution would be 960 x 540, which is quite small. With 2K resolution (2560×1440), the final output resolution is 1280 x 720. If you want a 1080p stream, you will need a 4K monitor.
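The arithmetic above can be sketched in a few lines. Because the game window is split into four equal quadrants (explained in the OBS section below), each layer you end up streaming is half the monitor resolution in each dimension:

```python
def output_resolution(monitor_width, monitor_height):
    """Per-quadrant (final stream) resolution: each of the four
    quadrants is half the monitor size in both dimensions."""
    return monitor_width // 2, monitor_height // 2

print(output_resolution(1920, 1080))   # 1080p monitor -> (960, 540)
print(output_resolution(2560, 1440))   # 2K monitor    -> (1280, 720)
print(output_resolution(3840, 2160))   # 4K monitor    -> (1920, 1080)
```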

You’ll also need a video camera and a capture card to get the video signal onto your computer. We’ve used the Magewell HDMI to USB dongle with success. Alternatively you can use a webcam, however the quality of the video will be much lower.

If you wish to pan the camera during the stream, you will need a 3rd controller mounted to the camera. Alternatively you can keep the camera stationary and use a virtual controller, which I’ll talk about below.

3. Software Setup

Required Software

- OBS (Open Broadcaster Software)
- Steam and SteamVR
- Most Unity-based VR games
- Virtual controller driver (if you don’t have a 3rd controller)

If you don’t have a third controller, don’t worry! There’s a handy tool out there which lets you trick Steam VR into thinking you have a third controller. We call this a virtual controller. The only downside to a virtual controller is that the camera must remain stationary. If it moves at all during the stream, it will misalign the layers.

You can download the virtual controller driver here. Additionally you’ll need to follow this guide on how to correctly install the driver.

Once you’ve got the driver installed, and the externalcamera.cfg placed in the root directory of the Unity game you want to play, load up SteamVR. You should now see the 3rd controller being detected (hopefully). Once it is, you can move on to calibrating it.
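For reference, externalcamera.cfg is a plain key=value text file. A minimal sketch is below; the numbers here are placeholders that you will replace during calibration. The x/y/z values are the camera’s position offset from the 3rd controller in metres, rx/ry/rz are rotations in degrees, and fov is the vertical field of view of your camera lens:

```
x=0.05
y=-0.02
z=-0.10
rx=0
ry=0
rz=0
fov=45
near=0.01
far=100
```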

Virtual Controller/Camera Calibration

The purpose of the virtual camera calibration is to align the locations of both the virtual and physical cameras.

Method 1:

The drivers for the virtual controller include a tool called cameraAlign. When I first started streaming mixed reality content, this was the only tool available. It is extremely tedious to set up and requires a lot of trial and error. This step takes the most time to get right.

You’ll also need to calculate the field of view (FOV) for your camera lens. It’s important to note that Unity measures field of view vertically. This tool can help you work out your lens field of view.
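If you already know your lens’s horizontal field of view (or its focal length and sensor size), you can convert to the vertical FOV Unity expects with a little trigonometry. This is a minimal sketch of both conversions:

```python
import math

def vertical_fov(horizontal_fov_deg, width, height):
    """Convert a horizontal FOV to the vertical FOV Unity expects,
    given the capture aspect ratio (e.g. 1920x1080)."""
    h = math.radians(horizontal_fov_deg)
    v = 2 * math.atan(math.tan(h / 2) * (height / width))
    return math.degrees(v)

def vertical_fov_from_lens(focal_length_mm, sensor_height_mm):
    """Vertical FOV from the lens focal length and the sensor's
    physical height (both in millimetres)."""
    return math.degrees(2 * math.atan(sensor_height_mm / (2 * focal_length_mm)))

# A 90-degree horizontal FOV at 16:9 works out to roughly 58.7 degrees vertical.
print(round(vertical_fov(90.0, 1920, 1080), 1))
# A 35mm lens on a full-frame (24mm tall) sensor is roughly 37.9 degrees vertical.
print(round(vertical_fov_from_lens(35.0, 24.0), 1))
```

Whichever value you calculate goes into the fov field of externalcamera.cfg.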

Method 2:

A YouTuber has created a tool called Mixed Reality Configurator, which makes it easier to generate the externalcamera.cfg value used for calibrating the virtual controller. You can download the tool here, and it also links to a video guide on how to use it.

4. OBS Setup

Now that you have a virtual controller set up, when starting a game make sure to hold shift and double-click on the executable in the game’s directory. This will open the Unity preferences screen. Make sure to select full screen and 2K resolution (2560×1440).

What you’ll see is the game screen split up into four quadrants. The top left is ‘foreground’, the top right is the ‘foreground alpha layer’, the bottom left is the ‘background’ and finally the bottom right is the first person view.

The guys over at Northway Games have an extensive guide for setting up OBS for mixed reality streaming, which you can read here.

A few things to note with their guide:

- They built their own in-game method of creating the four quadrant layers. As we’ve already generated the four quadrants, we only need to follow point 2 of their guide.
- Instead of adding a webcam as a source, you’ll need to add the capture card.
- Their guide is based on a 4K monitor, so you will need to base your crops on a 2K monitor.
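Working out the crop values for each quadrant is easy to get wrong, so here is a sketch of the arithmetic, assuming OBS-style crops (pixels removed from each edge of the source) and the 2K game capture described above:

```python
def quadrant_crops(width, height):
    """Crop values (pixels removed from each edge) that isolate each
    quadrant of the four-quadrant mixed reality game capture."""
    half_w, half_h = width // 2, height // 2
    return {
        "foreground":   {"left": 0,      "top": 0,      "right": half_w, "bottom": half_h},
        "alpha":        {"left": half_w, "top": 0,      "right": 0,      "bottom": half_h},
        "background":   {"left": 0,      "top": half_h, "right": half_w, "bottom": 0},
        "first_person": {"left": half_w, "top": half_h, "right": 0,      "bottom": 0},
    }

for name, crop in quadrant_crops(2560, 1440).items():
    print(name, crop)
```

For example, the foreground (top-left quadrant) of a 2560×1440 capture needs 1280 pixels cropped from the right and 720 from the bottom.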

One thing not mentioned in their guide is the use of an alpha layer (top right quadrant) as a mask. You will need to download and install a plugin for OBS in order to do this. Version 17.0.0+ of OBS is required for it to work well. Update: This plugin is no longer available. It has been superseded by a stand-alone mixed reality compositor available here.

The plugin uses the top-right quadrant of the layer it’s applied to as a mask for the top-left quadrant of the same source, so you’ll need to apply it to the ‘Foreground’ layer. To do this, select the “Image Mask/Blend” filter and set it to “Subtraction”. Then select any image; it isn’t actually used by the filter, but is required for it to work.

Once you’ve followed all the above steps you should have a composited scene of all the layers (see screenshot of layer order). At this point you might need to recalibrate your virtual controller to make sure everything lines up.

So now you’re good to go live! If you have any questions about mixed reality, please leave them in the comment section below.

TL;DR

- Make sure you’re a patient human being – mixed reality can be painfully frustrating to set up correctly the first time.
- Film someone in front of a green screen playing a VR game.
- Define the location of the in-game 3rd person camera using a 3rd controller or with special drivers.
- When 3 controllers are detected and a special config file is placed in the game directory, Unity games will automatically output a source split into four quadrants: foreground, transparency layer, background and first person view.
- Composite (combine) these layers with the raw video footage of the player in front of the green screen using OBS – essentially sandwiching the green screen footage between the background and the foreground.
- Live stream the results!

Further Reading: