Devin Reimer spent a lot of time just so you could enjoy a hot cup of coffee in Job Simulator.

You can pour a cup, set it on the counter, and watch it steam until it cools to room temperature. Add some sugar or creamer and watch the color turn lighter. Reimer estimates the liquid simulation subsystems in Job Simulator took roughly 500 hours of his own time, and hundreds more from others at Owlchemy Labs.
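The two behaviors described above, a drink cooling toward room temperature and its color lightening as creamer mixes in, can be sketched with very simple math. This is a hypothetical illustration, not Owlchemy's actual code; the cooling constant, colors, and function names are all assumptions.

```python
import math

# Hypothetical sketch (not Owlchemy's implementation): a hot drink that
# cools toward room temperature and lightens as creamer is mixed in.

ROOM_TEMP_C = 21.0

def cooled_temp(temp_c, dt_s, k=0.01):
    """Newton's law of cooling: the temperature decays exponentially
    toward ambient. k is an assumed cooling constant (per second)."""
    return ROOM_TEMP_C + (temp_c - ROOM_TEMP_C) * math.exp(-k * dt_s)

def mixed_color(coffee_rgb, creamer_rgb, creamer_fraction):
    """Linear blend of two RGB colors by the volume fraction of creamer."""
    f = max(0.0, min(1.0, creamer_fraction))
    return tuple((1 - f) * c + f * m for c, m in zip(coffee_rgb, creamer_rgb))

temp = cooled_temp(90.0, dt_s=60.0)                       # one minute later
color = mixed_color((60, 35, 20), (245, 240, 230), 0.25)  # a splash of creamer
```

Real-time engines run updates like these once per frame with a small time step, but the underlying idea, exponential decay toward ambient and a volume-weighted color blend, is the same.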

If you play the VR game by Alex Schwartz, Reimer and their team at Owlchemy Labs, you can boil water or make a fire flare up by pouring wine onto a stove.

The fact that features like these took so much time illustrates not just how challenging cutting-edge VR software design is, but how the medium creates entirely new kinds of problems for creators. When he set out to let people pour liquid from one cup to another in the game, Reimer didn’t realize he’d also have to model temperature and the color of mixing substances.

“Watching playtesters do a thing that they expect to work in real life and then seeing that it doesn’t,” said Schwartz, the co-founder and CEO of Owlchemy. “That’s how our to-do list fills up: when people have an expected thing they want to happen, like they reach out toward something they think is grabbable and it’s not.”

Job Simulator: The 2050 Archives is a VR game coming to move-around, hand-controlled VR platforms: the HTC Vive first, then Oculus Touch for the Rift and PlayStation VR. The third sentence of Owlchemy’s “About Us” page reads:

We believe that interaction and using your hands is what truly makes virtual reality the most incredible place to build unique content that blows players’ minds.

While wearing the soon-to-launch Vive and using the controllers, players become an office worker, gourmet chef, convenience store clerk or automotive mechanic. Each carefully crafted environment is like a toy box with dozens of virtual objects to grab, hold, toss, shake, pour, or push with your hands. In the office worker environment, I reach out and grab a cup with one hand, press a button with the other, and line the cup up underneath the falling, hot, brown liquid. The cup slowly fills with coffee I can “drink” by bringing it to my lips and watching it disappear in front of my eyes.

The game is instantly playful and silly, so it is easy to miss the technical achievement hidden under the surface, and what it says about the future of interactivity design.

A couple of examples: you can throw a cup of coffee (just make sure the hand straps are on your wrists), or mix different kinds of liquids.

While playing Job Simulator you don’t realize it took hundreds of hours to figure out how to make all this work. Even being able to pick up a cup and place it back down on a desk, while showing players uninterrupted visuals at 90 frames per second (roughly 11 milliseconds per frame), is a huge technical achievement for 2016. For Austin, Texas-based Owlchemy, it can lead to even bigger things down the road.

“All our tech is insanely re-usable,” Schwartz wrote in an e-mail. “We plan on leveraging it for all future titles.”

Six years after starting Owlchemy and five years after Reimer joined Schwartz, the two are responsible for one of the first games built from inception for stand-up, move-around, hand-controlled, in-home consumer virtual reality platforms. For Reimer, building Job Simulator was a lesson in human behavior as endless playtesting revealed just how many ways people can break a simulation.

“You just want to meet people’s expectations,” he said.

Beginning and Ending

Owlchemy Labs got wired prototype controllers for development in January 2015, “which blew our minds with [their] accuracy. This accuracy led to us pushing in the direction of going all-in on hand controls,” Schwartz wrote in an email.

The controllers translate natural human finger, hand and arm movement into VR. Owlchemy built the game in Unity, a game development toolset used by 1.1 million people each month. Building a VR game with hand-controller support, though, meant using bleeding-edge versions of software tools and in-development hardware to explore this new medium and build their experience.

In 2015 and 2016, the simple act of “grabbing” a cup in VR, turning around, and placing it on a desk is an enormous technical achievement. There’s no actual table to push back against your cup, so it’s inevitable that your hand will drop through the virtual surface. Figuring out what the cup should do in that moment took an enormous amount of time.

Then there was liquid to figure out. Early builds of Job Simulator produced all kinds of unexpected behavior whenever someone tried pouring a cup of liquid.

“I used to be like uhhhhh, stop doing that!” said Reimer, joking about playtesting sessions that would reveal new tasks for them. “Now I just embrace it. ‘Ok, now we have to build this horrible new thing.’”

Solving that problem was just the beginning. Again and again, people found different things to do with a cup of liquid: putting objects in the cup, holding a book over its mouth to block liquid from getting in, pouring liquid from one cup to another, heating and mixing liquids. Every one of these resulted in unexpected behavior, such as liquid shooting off toward the wall when it should be falling to the floor.
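Pouring from one cup into another hints at why the feature list kept growing: once liquid can move between containers, temperature (and by the same logic, color) has to move with it. The sketch below is a hypothetical illustration of volume-weighted mixing, not Owlchemy's code; the function and its parameters are assumptions.

```python
# Hypothetical sketch (not Owlchemy's code): pouring liquid from one cup
# into another, mixing temperatures by volume along the way.

def pour(src_vol, src_temp, dst_vol, dst_temp, amount):
    """Transfer `amount` of liquid. The destination's new temperature is
    the volume-weighted average of its old contents and the incoming
    liquid. Returns (new source volume, new dest volume, new dest temp)."""
    moved = min(amount, src_vol)          # can't pour more than you have
    new_dst_vol = dst_vol + moved
    if new_dst_vol > 0:
        dst_temp = (dst_vol * dst_temp + moved * src_temp) / new_dst_vol
    return src_vol - moved, new_dst_vol, dst_temp

# Pour 100 ml of 90°C coffee into 100 ml of 20°C milk → a 55°C mix.
src_vol, dst_vol, dst_temp = pour(250.0, 90.0, 100.0, 20.0, 100.0)
```

The same weighted-average pattern extends to color, which is one plausible reading of why letting players pour at all meant Reimer also had to address temperature and the color of mixing substances.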

Then Reimer would set out to solve it.

The resulting game is likely to be seen by a great many people in the coming months, as it is bundled with the HTC Vive and is a fantastic demonstration of move-around, hand-controlled VR.

I’ll keep following Owlchemy Labs closely and look forward to playing all of Job Simulator in my own house. Look for more stories about the company in the coming days and weeks.