Storytelling in VR:

They say you shouldn’t hijack the head-tracking data stream of the Oculus Rift; visuals should not be separated from the human vestibular system… but rules were meant to be broken. Why? Because there’s so much more to VR than gaming.

This is not to say games aren’t becoming movies! I found myself strangely immersed in Naughty Dog’s “The Last of Us”, more than in any tent-pole movie I’ve seen in the past few months. Such is the power of CG movies, uncanny valley be damned.

Defining a language for Storytelling in Virtual Reality:

You know how it all began oh so long ago (OK, four years ago), when the language of film-making was being defined and re-written for S3D. Well, it’s time for a re-write again. Immersive 360 film-making is set to explode, geared – at least at the start – for an audience from teens to mid-forties, and telling stories in this medium is quite a different skill-set to master.

Citizen Kane, back in the day, although a 2D film, gave modern film-makers enough clues on how to effectively use the medium of S3D… but no one really had the patience to listen. Lighting, depth of field and yes – even hijacking the head-tracking stream – can work when creating movies on a 360 canvas.

When I started investigating this exciting medium a few months ago, alarm bells would go off whenever I asked on Oculus Rift and game-engine forums about intercepting the head-tracking and orientation info of these devices – but that’s because, so far, only games have been designed for VR. It’s becoming evident that, beyond the gimmicky interactive look-around, voyeuristic possibilities offered by the medium, serious Directors and storytellers will want to retain control of the “frame” if they are to be enticed into creating movies in Virtual Reality.

So what could an immersive 360 Director’s tool-box look like?

Lighting – With the temptation to look around a scene, a Director and VR DoP can use the age-old technique of spot-lighting areas of importance.

360 Positional Sound – Wait until Dolby Atmos gets interested. Chances are an Atmos SDK might already be in the works to create sound-scapes that can aid in directing an audience’s attention.

Depth of Field – The pet peeve of stereoscopic 3D film-making, unless done correctly (bokeh). This technique is worth exploring in an immersive 360 environment to guide audience attention. At least it won’t be the lead-by-the-nose experience it sometimes becomes when abused by inexperienced DPs and Directors on 2D films.

Limiting the Horizontal FoV – There is no rule, per se, that every scene should feature a full wrap-around 360 view for the audience to explore. The horizontal field of view can be restricted for certain shots. This is a creative call, and is what will contribute to the flavor of the overall movie experience being crafted by the film-maker.
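Limiting the horizontal FoV boils down to some simple angle arithmetic. Here is a minimal Python sketch of the idea – clamping the viewer’s yaw to a director-chosen window instead of letting it roam the full 360. The function name and parameters are illustrative, not from any actual SDK:

```python
def clamp_yaw(head_yaw_deg: float, center_deg: float, half_fov_deg: float) -> float:
    """Clamp the viewer's yaw so the rendered view stays inside a
    director-chosen horizontal window, rather than the full 360 wrap-around."""
    # Signed angular offset of the head from the window's center, in (-180, 180].
    offset = (head_yaw_deg - center_deg + 180.0) % 360.0 - 180.0
    # Limit the offset to the allowed half-width on either side.
    offset = max(-half_fov_deg, min(half_fov_deg, offset))
    return (center_deg + offset) % 360.0
```

A shot restricted to a 120-degree window centered straight ahead would pass `center_deg=0.0, half_fov_deg=60.0`; head motion inside the window tracks normally, and motion beyond it simply stops at the edge.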

[youtube id=”-sw5LwW8sFc”]

Advanced Tools for Immersive 360 Storytelling:

The short demo scene above is from MAYA – a mixed-media motion comic I’m working on for the Oculus Rift and other VR devices, including cell-phone VR such as Google’s Cardboard, the Durovis Dive and others that will allow almost any smartphone, Android or iOS, to play back VR movies and experiences.

The demo clip is a straight video grab of a wearer viewing a “page” of the motion comic via the Oculus Rift. It features both scene cuts and subtle interactivity.

Interactivity: In the first scene after the title, the girl stands at the window – that’s what the Director intends the audience to see, and the rest of the room has subdued lighting. That is… unless the wearer turns their head around, which triggers the bed-lamps to increase in intensity.
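The trigger logic behind that lamp gag is straightforward. A minimal Python sketch, assuming yaw angles in degrees (the function name and thresholds are my own, purely illustrative):

```python
def lamp_intensity(head_yaw_deg: float, window_yaw_deg: float = 0.0,
                   trigger_deg: float = 60.0, base: float = 0.2,
                   full: float = 1.0) -> float:
    """Bed-lamp intensity: subdued while the viewer faces the window framing,
    ramping up once their gaze wanders past a trigger angle."""
    # How far the head has turned away from the intended framing (0..180).
    off = abs((head_yaw_deg - window_yaw_deg + 180.0) % 360.0 - 180.0)
    if off <= trigger_deg:
        return base
    # Linear ramp from base to full over the remaining arc to 180 degrees.
    t = min(1.0, (off - trigger_deg) / (180.0 - trigger_deg))
    return base + t * (full - base)
```

In a game engine this would be evaluated every frame against the HMD’s yaw and fed into the lamp’s light component.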

Head-Tracking Hijack: The next scene shows the girl framed on the bed. This cut will happen irrespective of where the wearer of the Rift is looking. Yes, it is a forced cut, and it will put the scene bang center.

The important point to be aware of is this: the same rules as for S3D storytelling apply – mismatched depth splicing should be avoided.

GreenScreening the Crew out:

This idea came to me when I glanced at the image of what I later realized was a paratrooper in the movie; I initially thought they had covered crew or equipment in green for later keying and wire removal. While I have not looked at the actual feasibility of stereoscopically replacing a background plate after removing green-screen-clad crew or equipment, I am confident that it could be possible, even when dealing with de-warping and stitching the 360 image.

Compositing in 360:

Below is an interactive 360 “cubic” panorama, converted from an equirectangular image into the six cube faces that form the panorama. (Click and drag – or, if browsing this page on an Android/iOS device, the gyro will work.)
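The equirectangular-to-cubic conversion starts by deciding which of the six cube faces each view direction lands on – the dominant axis of the direction vector picks the face. A minimal Python sketch of that face-selection step (a simplification of what full conversion tools do):

```python
def cube_face(x: float, y: float, z: float) -> str:
    """Pick which of the six cube faces a view direction lands on - the
    first step when slicing an equirectangular panorama into cubic panels."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "right" if x > 0 else "left"
    if ay >= ax and ay >= az:
        return "top" if y > 0 else "bottom"
    return "front" if z > 0 else "back"
```

A full converter then projects each face pixel’s direction back onto the sphere to sample the source equirectangular image.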

Images of the 6 individual Cubes are here:



Again, there’s every reason to believe a competent NUKE programmer-artist could write the warping matrix to composite elements directly over a spherical or equirectangular sequence of images or video.
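The core of such a warp is the mapping from a 3D direction to longitude/latitude pixel coordinates on the equirectangular plate. A minimal Python sketch of that mapping, assuming +z is “straight ahead” and +y is up (axis conventions vary between packages):

```python
import math

def dir_to_equirect(x: float, y: float, z: float,
                    width: int, height: int) -> tuple:
    """Map a 3D view direction to pixel coordinates on an equirectangular
    (longitude/latitude) image - the warp a compositor needs in order to
    place elements onto a spherical 360 plate."""
    lon = math.atan2(x, z)                               # -pi..pi, 0 = +z
    lat = math.asin(y / math.sqrt(x * x + y * y + z * z))  # -pi/2..pi/2
    u = (lon / (2.0 * math.pi) + 0.5) * width            # 0..width
    v = (0.5 - lat / math.pi) * height                   # 0 = top of image
    return u, v
```

An element placed in the scene would have each of its corner directions pushed through this mapping (plus interpolation in between) to land correctly on the stitched plate.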

Jaunt VR – the people behind “The Mission” VR – are certainly proficient enough in the algorithm and CUDA coding department to pull it off.

(image credit: RoadtoVR.com)

Virtual Reality Film making Gear:

Capture: The current camera system used by Jaunt VR is said to be a rig based on 14 GoPro cameras in a stereoscopic configuration, capturing a wrap-around view of a scene in S3D. Now, I’ll admit I’m sitting on the fence about actual “scanline”-level CMOS sync of GoPros in a stereo config, much less seven pairs of them!… Yet I’ll give the rig, and the experts behind it, the benefit of the doubt. It’s possible that sync drift is non-existent in today’s oscillators/controllers.

Edit: I’ve been told JauntVR aren’t capturing pano stereo video in the traditional manner; instead, every point in space is seen by a minimum of three cameras in the rig, so a virtual stereo rig can be created which then produces a final stereo pair using “computational photography”. This is said to give rounded stereo across the fish-eye field of view and gets rid of what I call “fish-eye shear” at the periphery of view. For narrative film-making, I’m seriously looking at “graduated stereo fall-off” (SFO) as a viable tool in directing audience attention.

Stereo 3D Conversion Houses: 360 film-making might actually put the spotlight on 2D-to-3D conversion – a much-needed service for film-makers and a new business avenue for conversion studios to explore! Compositing in stereo 3D on the stitched 360 image is only a few NUKE nodes away for competent conversion houses.

Software:

[youtube id=”w3kJ9BZz_-8″]

For MAYA – a 360 motion novel – I’ve chosen the excellent Cinema Director from Cinema Suite. It’s the closest one can get to an NLE system for an otherwise scripting-heavy game engine such as Unity. There is no straightforward way to create immersive 360 video in any of the popular video-editing solutions; doing anything creative with Unity or other game engines requires a bit of scripting in JavaScript or C#.

I had to customize the software to a certain extent to get the desired tools I needed to do those cuts for the Oculus Rift.

Most NLE systems were late to catch the stereoscopic 3D train, and the same will probably be the case for 360 VR support. Toolkits will need to be written to control the different aspects of a VR experience. So what would a timeline look like for an immersive 360 VR film? Take a look at a screenshot from MAYA, below.



(click for larger)

Immersive VR eye-wear:

Credit for the device that started the VR revival goes to the Oculus Rift, and its recent acquisition by Facebook for a staggering $2 billion should prove that VR is here to stay. Equally important are the initiatives and products coming out from Sony with its Morpheus eyewear, Samsung’s own venture, and a special nod to long-time VR headwear experts TDVision and their soon-to-be-released ImmersiON VR eyewear.

Smartphones can easily be converted into immersive 360 video playback hardware. The Durovis Dive, and even Google’s “Cardboard”, show how low-cost this could be.

Immersive 360 Film-making: Content

The hardware and software are becoming available at a quicker pace than the talent to produce immersive 360 content. Are audiences ready?

Gamers have always wanted beefier gaming rigs (and laptops), and the same holds true for their screens – with VR devices, it’s like having an IMAX® strapped to their faces. Yes, they are ready, and they’d like to watch their movies the same way. But VR is not limited to gamers. The luxury of an immersive large-screen environment, and the privacy and intimacy it offers, cannot be discounted. A long-haul flight is but one venue that comes to mind where VR eyewear would be in high demand…

Education is another. Already, 360 documentaries – complete with Sir David Attenborough’s voice – are being readied for when these devices go mainstream…

As a film-maker, will your storytelling skills evolve for the next generation of audiences?

**Edit 11th August 2014**

Depth of Field – Bokeh-ing the background:

On the Oculus Rift forum, as well as in the comments section of this article, the question of bokeh came up, so let me clarify with an example:

The concept of using depth of field (a practice which I otherwise do not condone and label as lazy film-making; see the Citizen Kane argument above) can actually be used to great effect if the background is completely bokeh-ed out – i.e. there must be no ambiguity that would make the eyes/brain even attempt to fuse semi-blurred imagery, which would otherwise cause eye strain and headaches in stereoscopic 3D viewing, and even more so in a stereoscopic VR environment.

To show an example of what I mean, below is the same interactive panorama with only the Director’s framing and the ground in focus… exactly as the Director might intend this scene to be viewed. The example below is obviously a quick Photoshop lens-filter job for illustration purposes; a proper artist would spend time crafting a proper bokeh filter to apply to scenes. Use your mouse to simulate head-turning when wearing the Oculus Rift.

Part 2: The Language of narrative storytelling in VR is now online.