In Unreal 4, this meant using a SceneCaptureComponent2D rendering into a 2D render target. We also used many of the available post-process settings so that we could closely match the look of our regular in-game cameras. And since the output was a 2D video rather than a stereo VR view, we could leverage fancy effects like depth of field and lens flares that we normally can't use in VR.
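A minimal sketch of this kind of capture setup in UE4 C++ might look like the following. The variable names and specific values are illustrative, not our actual project code, and this is an engine-side fragment rather than a standalone program:

```cpp
// Hedged sketch (UE4 C++): a SceneCaptureComponent2D writing into a 2D
// render target, with post-process overrides so the capture behaves like a
// cinematic camera. Values here are placeholders driven from the controller.
UTextureRenderTarget2D* RenderTarget = NewObject<UTextureRenderTarget2D>(this);
RenderTarget->InitAutoFormat(1920, 1080);            // match the capture resolution

USceneCaptureComponent2D* Capture =
    NewObject<USceneCaptureComponent2D>(this, TEXT("VirtualCamera"));
Capture->TextureTarget = RenderTarget;
Capture->CaptureSource = ESceneCaptureSource::SCS_FinalColorLDR; // post-processed output

// Post-process overrides for effects normally avoided in VR.
FPostProcessSettings& PP = Capture->PostProcessSettings;
PP.bOverride_DepthOfFieldFocalDistance = true;
PP.DepthOfFieldFocalDistance = 350.f;                // cm; mapped to a controller axis
PP.bOverride_LensFlareIntensity = true;
PP.LensFlareIntensity = 1.f;
PP.bOverride_AutoExposureBias = true;
PP.AutoExposureBias = 0.f;                           // global exposure control

Capture->RegisterComponent();
```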

In order to simulate cinematic camera operation, we mapped controls onto the Xbox controller for field of view angle, depth of field focal distance, aperture, and a global exposure setting. We could also control the camera movement smoothing speed. Since humans are inherently jerky and imprecise in their movements, this was a way to simulate a "Steadicam"-style system. How much smoothing we wanted depended on the shot: the closer the shot, the more responsive we needed the camera to be.

We had access to some great hardware, which was not strictly necessary but certainly helped. We mounted an Odyssey 7Q+ monitor/recorder and a Teradek Bolt 1000 wireless video receiver onto a director's mount, along with a Vive controller and an Xbox controller. The weight actually helped the system move like a real camera and feel tactile.

For our next iteration of this system, we plan to swap the director's mount for a shoulder mount, or at least attach something similar to the director's mount onto it. The Xbox controller will also be attached directly, allowing one person full control while operating the virtual camera.

Our capture computer was running a GTX 1080 Ti with 64 GB of RAM and an i7-5960X. We were almost always able to maintain a consistent 60 fps capture rate.

Our captured footage was 1920×1080 at 60 fps.