For the past couple of years, the annual ACM SIGGRAPH graphics conference has hosted a lineup of some of the best narrative virtual reality experiences of the year in the VR Theater program. As part of this program, a jury selects a set of shows from submissions to present in sequence to attendees of the conference, giving them a taste of the latest in immersive storytelling. Past programs have slowly scaled up, to about 30 “seats” per showing.

This year, we had 54 seats. This is how we did it.

Statistics

First, some stats to give you an idea of the scale we managed to achieve this year, the scope of the program, and its unmatched ability to introduce large groups of people at SIGGRAPH (many of whom had never tried VR) to the experience line-up.

Mainstage

This year, in the Mainstage, we had 5 narrative experiences, and a networked multiplayer “lobby” experience that controlled the play order of the show, for a total of 6 VR experiences.

There were 54 seats in the theater, in a semicircle layout facing the center of the circular space, featuring Alienware PCs sporting Nvidia RTX 2070 graphics cards, brand-new Oculus Rift S headsets, and Bose noise-cancelling headphones. The space also included a circular particle projection, lighting, and custom music and sound design for ambiance when entering and leaving the space.

We hosted a total of 30 showings, each with 50 minutes of content, over the 4.5 days of the conference. Every showing sold out within 30 minutes of ticket sales opening each morning at 8am. That’s 1,350 hours of VR content served to 1,620 attendees over 4.5 days.

Kiosks

VR Theater Kiosks in the Immersive Pavilion at SIGGRAPH 2019

In addition to the Mainstage, we hosted 10 additional curated experiences in 6 separate VR Theater Kiosks.

The Kiosks saw an average of 170 people per day, for another ~765 attendees total (give or take any overlap in attendees).

In total, this year we served 10,485 narrative VR experiences to ~2,385 people in 4.5 days (~12% of total conference attendees), making for the largest location-based narrative VR experience on the planet. In short, at present, VR Theater is the best way to get your narrative VR piece seen by as many people as possible, very quickly.

The Theater Space

The outside of the 2019 VR Theater, humans for scale

The inside, looking from the entrance

The “real world” theater space consisted of the 54 seats set up in a semicircle inside a tall, circular, velvet-curtain-wrapped theater space (a towering presence, easily visible from anywhere in the experience hall or exhibition), with an interior circular particle projection and ambient music. The desks were designed to mostly hide the computers and monitors from attendees so they wouldn’t get distracted by the implementation and could instead get immersed in the world they were about to enter.

You really can’t miss the VR Theater space, wherever you are in the experience/exhibition hall

As attendees entered the space, welcome music and a voice-over played, and student volunteers helped them into their seats.

Entrance sequence

A SIGGRAPH Student Volunteer helping an attendee get their headset on. We had 26 volunteers in the main stage and 6 in the kiosks every showing. This would not be possible without them.

As the experience ended, another track and animation played to thank attendees for coming, and preparations began for the next wave. The showings ran back to back every day.

Scaling Setup

For 2019, VR Theater Chair Maxwell Planck wanted to increase the number of seats by 50%: VR Theater sold out early every morning (the most common complaint from attendees), and we wanted more people to get to see the experiences. He also wanted our “lobby” experience to be networked multiplayer.

Attendees lined up at 8am to get tickets; they would sell out by 8:30am.

This had the potential to truly elevate the VR Theater experience, but it also presented some interesting new challenges. One of those challenges was scale. Virtual reality devices and software are rather complex, with a lot of setup prerequisites and calibration that need to be done. Additionally, a few weeks before the conference, we found out we would be getting new Oculus Rift S devices instead of the previous-generation Rift. They were easier to set up, but also an unknown for our team since they were so new.

Therefore, we needed to automate as much of the experience as possible in order to allow for operational bandwidth for troubleshooting, re-calibration of headsets, and helping attendees who may never have used virtual reality before. In addition, many of our experiences in the line-up were real-time rendered / interactive binaries, each with their own operational modes, adding to the run-through complexity.

All setup of machines was done remotely over the LAN via automation scripts. This included installing the Oculus software, updating Nvidia drivers, updating USB drivers, and installing our experience line-up binaries and our lobby experience binary. Any time a new setup step was needed on one computer, we deployed it to the rest of the machines with automation, so we almost never needed to repeat a setup step by hand. This also made swapping a bad machine for a new one easy: all you needed to do was plug it in and set the computer name to our convention (SEAT21, for example), and the automation software would take care of the rest of the setup and activate it automatically once it was ready.
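Our actual scripts weren’t published, but the core idea can be sketched in a few lines: recognize seat machines by the hostname convention, and compute which idempotent setup steps each one still needs. The step names below are illustrative stand-ins, not our real script names.

```python
import re

# Hypothetical idempotent setup steps, in deployment order.
SETUP_STEPS = [
    "install_oculus_software",
    "update_nvidia_driver",
    "update_usb_drivers",
    "install_experience_binaries",
    "install_lobby_binary",
]

SEAT_NAME = re.compile(r"^SEAT(\d{1,2})$")

def seat_number(hostname: str):
    """Return the seat index encoded in a hostname like 'SEAT21', else None."""
    m = SEAT_NAME.match(hostname)
    return int(m.group(1)) if m else None

def plan_setup(hostname: str, completed: set) -> list:
    """List the steps still to run on a machine, so a freshly swapped-in PC
    gets the full sequence and an already-configured one only gets new steps."""
    if seat_number(hostname) is None:
        return []  # not a theater seat; ignore it
    return [step for step in SETUP_STEPS if step not in completed]
```

With this shape, a brand-new `SEAT21` gets every step, while pushing one new step to the fleet only runs that step on machines that already have the rest.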

The only manual operations were setting up Windows on fresh machines (if we had imaged them with Fog as planned this would have been unnecessary, but that fell through onsite) and logging in to Oculus Home, which we sped up by installing TightVNC servers via automation and logging in from the master computer.

We used this orchestration capability for maintenance tasks as well, such as cleaning up processes left running between showings and shutting down the machines at night. At the end of the conference, we ran a teardown script that removed all of our installations in ~30 seconds, allowing us to start striking right away and finish the full teardown in about 3 hours (minus the space itself and rigging, which were handled by conference contractors).
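The maintenance side of the orchestration amounts to fanning a command out to every seat. A minimal sketch of that fan-out, with illustrative Windows commands standing in for our real (unpublished) scripts:

```python
# Hypothetical maintenance tasks; the commands are illustrative stand-ins.
MAINTENANCE = {
    "cleanup":  "taskkill /F /IM Experience.exe",  # kill leftover processes between showings
    "shutdown": "shutdown /s /t 0",                # power the seats down at night
    "teardown": "rmdir /S /Q C:\\VRTheater",       # end-of-conference strike
}

def commands_for(task: str, seat_numbers) -> list:
    """Pair every seat hostname with the command the orchestrator
    would push to it over the LAN."""
    cmd = MAINTENANCE[task]
    return [(f"SEAT{n}", cmd) for n in seat_numbers]
```

The orchestrator then runs each `(hostname, command)` pair remotely, which is why a single task like `teardown` can clear 54 machines in seconds.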

The Lobby

Last year’s VR Theater featured a “lobby” experience that attendees could sit in until the experiences were kicked off by student volunteers. For this year’s lobby, our chair wanted the experience to be networked, so attendees could see each other in our virtual world while they waited for us to launch them all into the experience line-up at once.

The lobby from an attendee’s VR perspective

We built our VR Theater Lobby in Unreal Engine 4.21.2, and used the brand-new Niagara particle system to design a 1:1 replica of the actual VR Theater space in virtual reality. That way, attendees could feel like they were entering a parallel world in the same environment, before being whisked away into the individual story worlds. We also gave them interactive particle controllers to play with while they waited for the experience to start. Each lobby client placed itself in the correct location relative to the physical world using a computer name convention mapped to the seat number (e.g. SEAT1, SEAT24), which made swapping in a new machine for a seat an easy process.
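The lobby itself lived in Unreal Engine, but the hostname-to-placement idea is simple enough to sketch in Python: parse the seat number out of the computer name and map it onto the semicircle, facing the center. The seat count matches the theater; the radius and the exact spacing math here are illustrative assumptions, not the theater’s real measurements.

```python
import math

def seat_pose(hostname: str, seats: int = 54, radius: float = 6.0):
    """Derive a lobby client's position and facing from its hostname alone.
    The radius (meters) is an illustrative value."""
    n = int(hostname.removeprefix("SEAT"))    # e.g. "SEAT24" -> 24
    theta = math.pi * (n - 1) / (seats - 1)   # spread seats 1..N over a half circle
    x, y = radius * math.cos(theta), radius * math.sin(theta)
    facing = theta + math.pi                  # look toward the center of the space
    return (x, y), facing
```

Because the pose is a pure function of the hostname, a replacement machine named to the convention drops into the right virtual seat with no per-machine configuration.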

Master Control interface on a machine at the center of the theater space

Since the experience was networked, everyone could see each other’s heads and hands as particle balls, and the run-through into the line-up was centrally controlled from a “master control” interface. Using this interface, we could start the experience line-up on all computers at once, monitor the status of each seat, and even see each miniature attendee play around with their particles.

Lobby Master Control Interface

Once everyone was in their headsets, we started the experience on all machines simultaneously from the master control, and the lobby handled the rest, automating the pass-off to each experience as the previous one ended.
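The pass-off logic amounts to playing the line-up back to back: launch one experience binary, wait for it to exit, then launch the next. The real hand-off lived inside the Unreal lobby; the binary paths and the injectable `runner` hook below are illustrative.

```python
import subprocess

def run_line_up(binaries, runner=subprocess.run):
    """Play the line-up back to back: launch each experience binary and
    hand off to the next as soon as the previous one exits."""
    for exe in binaries:
        runner([exe], check=False)  # blocks until the experience ends
```

Injecting `runner` keeps the sequencing testable without real executables; in production it would simply be `subprocess.run`, launched on every seat at once from master control.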