How do you show someone what it's like to be in virtual reality? It's a problem that has bothered Owlchemy Labs, creator of the upcoming HTC Vive title Job Simulator.

They've experimented with green screens and mixed reality trailers, using much of the groundwork laid by Northway Games. The results were pretty amazing.

But that required a lot of external equipment and wasn't much of a solution for the players themselves. So what could they offer every player who wanted to try to capture video or stream themselves inside the game?

The journey to player-controlled VR streaming

It's helpful to see how far they've come. You can watch the game's first teaser below:

"Looking back on it, the footage looked like a pile of garbage," Alex Schwartz, who describes himself as the CEO and Janitor of Owlchemy Labs, told Polygon. "I was just straight-up recording the unwarped eye that was presented to the screen via Steam VR. It was low quality, it was cropped weirdly ... looking back on that footage, I wish I could kill it with fire."

The problem is that human beings experience the world in a very distinct way. We dart our eyes around, and our head and neck move constantly as we look around. It feels smooth to our brains because we're used to filtering out all that movement and jitter, but raw footage from someone in VR is intolerable on a flat screen. Everything moves too fast and is hard to follow.

"You don't dolly, the way you would if you were holding a camera for a movie," Schwartz said. "Your head moves very quickly."

We think we know what real first-person view looks like, but it turns out the version that works in games and films is heavily modified from how we actually experience the world. It's a weird fact, but capturing video from inside virtual reality shows us the limitations of actual first-person views.

"I said holy crap," he remembers. "This has to be a thing."

Schwartz created a sort of virtual, invisible camera that just followed the movement of the player's head while smoothing out the motion. This was purely an experiment that would allow them to cut trailers that looked a bit more professional.
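Schwartz doesn't describe the math behind his invisible camera, but a common way to tame jittery head tracking is exponential smoothing: each frame, the camera moves only a fraction of the way toward the tracked head pose. The sketch below is a minimal, hypothetical illustration of that idea in Python; the smoothing factor and the sample trajectory are assumptions, not Owlchemy's actual implementation.

```python
# Exponential smoothing of a tracked head position for a virtual
# "spectator camera". The 0.1 smoothing factor is an assumed value
# chosen for illustration, not taken from Job Simulator.

def smooth(prev, target, alpha=0.1):
    """Move the camera a fraction (alpha) of the way toward the head pose."""
    return tuple(p + alpha * (t - p) for p, t in zip(prev, target))

# A made-up, jittery head trajectory (x, y, z in meters); the camera
# lags behind it smoothly instead of copying every twitch.
head_positions = [(0.0, 1.6, 0.0), (0.3, 1.7, 0.1),
                  (-0.2, 1.5, 0.0), (0.4, 1.8, 0.2)]
cam = head_positions[0]
for pos in head_positions[1:]:
    cam = smooth(cam, pos)
print(cam)
```

Because the camera only ever closes a fraction of the gap each frame, rapid back-and-forth head motion largely cancels out, which is why the resulting footage reads as a deliberate, watchable shot rather than raw head tracking.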

It was a good solution, but around a month before launch they decided to add the smoothing feature to the game itself, toggled by a button press, so people could stream the game or capture video comfortably. Gaming is a social business these days, and a game that streams well online has a huge advantage. They decided to show not only the player's point of view, but the player themselves.

"So how can we get a view from the corner of the room ... like a security camera-style footage ... but what does that look like?" Schwartz asked. The first time they tested it he saw the amount of human emotion that was conveyed just by watching the player's virtual hands. "I said 'holy crap,'" he remembers. "This has to be a thing."

He modeled the VR headset where the head should be, and he began having someone play-test the experience of playing for an audience. He had someone look at the camera and wave ... only to be told the player didn't know where to look. Where was the camera in virtual space, exactly? They couldn't see it.

"So I just thought I should give the camera a physical manifestation in the game," he recalls. He took it a step further, and modeled a viewfinder for the camera so you can see, inside VR, what the camera sees.

"People then popped into the game, they saw the camera, and they waved at it. You know you're in camera, and in the shot, and it was just perfect," he said. "The next step was the players wanted to throw something at the camera or grab it, so we had to do that. There was no option to not do it. This had to be a thing. That's how it goes."

He implemented the features for spectator mode in the past couple of weeks, shot the video yesterday morning, and then picked up the phone to call us to say he might have solved the problem of spectating VR; I had been complaining about that very problem on social media.

"I think we have a solution," he told me. "So he were are."

This changes everything

In the final implementation, you can turn on spectator mode, see multiple cameras in-game, move them around with your hands to set up interesting shots using each camera's in-game viewfinder, and then stream from in-game, switching between each camera's point of view by squeezing the HTC Vive's controller.

It's not a static system; you can be your own director and editor, in real time. Then you use those cameras to address your audience during live streams or captured events.

Schwartz says he was surprised by how much emotion is conveyed by just the hands and head in this mode.

"You can see her jumping up and down in excitement in the video and, based on the position and rotation of her head and hands, your brain can put together the entire human body that represents that head and those hands with a scary level of accuracy," he said. "We're wired deep down to understand human body language."

This lets you show off exactly what you're doing in the game in a way that makes perfect sense to the viewer, while allowing you a lot of options for how to present yourself and the game. And it's going to ship with the first release of Job Simulator.

Enabling spectator mode does come with a cost — it's a toggle that you have to actively turn on, and it uses about 10 to 15 percent more power to render the scene a second time for the stream or video — but the entire game has been built with optimization in mind. If you have a system that can run the Vive well, you should have the power needed to also stream Job Simulator.

"We've been building mobile games in Unity for the past years," Schwartz said. "We know how to optimize."

Job Simulator is packed in with the HTC Vive, which launches on April 5.