NASA is embarking on a 15-year journey to Mars. Virtual Reality will play a key role.

NASA combines NVIDIA GPU technology with Unreal Engine 4, consumer-grade VR, physical mockups and models, wearable technologies, and room-scale tracking to create what the U.S. space agency calls a “Hybrid Reality System,” which provides extremely immersive, realistic training at a lower cost than traditional “analog” field tests.

For Frank Delgado and Matthew Noyes at NASA’s Johnson Space Center Hybrid Reality and Advanced Operational Concepts Lab, the goal is to create a low-cost, scalable platform to enable an “out of this world” experience.

“Hybrid Reality is about incorporating the best elements of physical and virtual realities to get the best of both worlds,” Noyes elaborates. “Right now, GeForce-powered room-scale VR in the HTC Vive transports you, visually, anywhere in the Universe. You really feel like you’re there. Our objective is to improve the experience for your other senses so it feels less like a headset and more like a Holodeck.”

How NASA Pioneered Virtual Reality

NASA’s Ames Research Center pioneered some of the earliest functional VR systems in the mid-1980s.

“Then private industry simplified, polished and added its own innovations,” Noyes explains, comparing the development of VR by private industry to the evolution of commercial space flight. “NASA can now reintegrate the robust tech into our systems.”

The next step, Noyes says, is to create a sense of weightlessness. Johnson Space Center’s Active Response Gravity Offload System (ARGOS) is an excellent candidate.

“ARGOS,” Noyes explains, “is a smart, robotic crane that offloads all or part of your body weight. The result is an experience simulating the reduced gravity environments found on Mars, the Moon, or the International Space Station.”
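
The arithmetic behind partial offloading is straightforward, and a small sketch can make it concrete. This is an illustrative calculation, not NASA code; the gravity values are standard published figures, and the function name is ours.

```python
# Illustrative sketch (not NASA software): the fraction of a subject's
# Earth weight a gravity-offload crane like ARGOS must support so the
# net downward force matches a target gravity level.

EARTH_G = 9.81  # m/s^2

def offload_fraction(target_g: float, earth_g: float = EARTH_G) -> float:
    """Fraction of body weight the crane carries to simulate target_g."""
    if not 0.0 <= target_g <= earth_g:
        raise ValueError("target gravity must be between 0 and Earth gravity")
    return 1.0 - target_g / earth_g

# Lunar gravity (~1.62 m/s^2): crane carries about 83% of body weight.
print(f"Moon: {offload_fraction(1.62):.0%}")
# Mars gravity (~3.71 m/s^2): crane carries about 62% of body weight.
print(f"Mars: {offload_fraction(3.71):.0%}")
# ISS microgravity: crane carries essentially all of body weight.
print(f"ISS:  {offload_fraction(0.0):.0%}")
```

In other words, the closer the simulated environment is to weightlessness, the more of the astronaut’s weight the crane must take on.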

A system like this, Noyes says, allows astronauts to practice navigating around the outside of a spacecraft or along the rocky, uneven surface of other worlds. They can even practice drilling into a rock for a mineral sample on an asteroid or another planet.

Like a Game

To do this, NASA needs to visualize training scenarios in real-time with photorealistic graphics, accurate physics, and “multiplayer” capability, Noyes says.

This is where NVIDIA GPU technologies such as VRWorks, combined with game engines like Unreal Engine 4, come into play. Key technologies like VR SLI and Multi-Res Shading deliver extremely high levels of performance, visual quality, and interactivity.
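
The performance win behind Multi-Res Shading comes from shading the periphery of each eye’s viewport at reduced resolution, since lens distortion compresses those regions anyway. The sketch below illustrates the pixel-count savings with made-up numbers; the tile fractions and scales are hypothetical, not NVIDIA’s actual defaults.

```python
# Hypothetical illustration of Multi-Res Shading's savings: the viewport
# is divided into a center region shaded at full resolution and a
# peripheral region shaded at a reduced scale. All numbers are
# illustrative, not actual VRWorks parameters.

def shaded_pixels(width, height, center_frac=0.5, peripheral_scale=0.5):
    """Pixels actually shaded when only the center is full resolution."""
    center = (width * center_frac) * (height * center_frac)
    peripheral = (width * height - center) * peripheral_scale ** 2
    return center + peripheral

full = 1512 * 1680  # illustrative per-eye render target
reduced = shaded_pixels(1512, 1680)
print(f"pixels shaded: {reduced:.0f} of {full}")
print(f"savings: {1 - reduced / full:.0%}")  # → savings: 56%
```

With these (invented) settings, more than half the shading work per eye disappears, which is the kind of headroom a 90 fps stereo VR target demands.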

It works because a Hybrid Reality training mission can be a lot like a game. “In both cases, you must complete a series of objectives under various constraints, visualize the world around you in 3D and interact with the environment and potentially other people while doing so,” Noyes says.

Inside the International Space Station

NASA has created a series of technology demos to showcase what Hybrid Reality can achieve. NVIDIA’s high-performance GPUs let NASA build hybrid reality experiences that take people out of this world.

The ISS experience includes many modules frequented by astronauts; interesting systems such as the Combined Operational Load-Bearing External Resistance Treadmill (COLBERT), named after comedian Stephen Colbert; and even an out-the-window view from the Cupola. Would-be astronauts can also see the Advanced Resistive Exercise Device (ARED), which allows astronauts to work out by pushing against ambient cabin air pressure with a pair of vacuum pistons.

These environments are more than just images; they’re interactive. NVIDIA PhysX powers the physics interactions users see inside the experience, making animations more lifelike. “We’ve shown this to astronauts who have actually been on the space station, who say the dynamic object behavior in microgravity, especially collision, is extremely realistic,” Noyes says.
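
What makes microgravity dynamics distinctive is simple in principle: with gravity zeroed out, momentum alone drives motion, and collisions redistribute it. PhysX itself is a C++ engine; the following is only a minimal Python sketch of those two ideas, with invented masses and velocities.

```python
# Minimal sketch (not PhysX) of microgravity dynamics: objects drift at
# constant velocity until they collide, and elastic collisions conserve
# momentum. The bodies and numbers below are illustrative.

def step(pos, vel, dt, gravity=0.0):
    """Semi-implicit Euler integration for one body along one axis."""
    vel = vel + gravity * dt
    pos = pos + vel * dt
    return pos, vel

def elastic_collision_1d(m1, v1, m2, v2):
    """Post-collision velocities for a 1-D perfectly elastic collision."""
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return u1, u2

# With gravity zeroed, a released object drifts in a straight line.
pos, vel = step(0.0, 0.5, dt=1.0)  # advances 0.5 m; velocity unchanged

# A 2 kg tool drifting at 0.5 m/s strikes a stationary 1 kg tool.
u1, u2 = elastic_collision_1d(2.0, 0.5, 1.0, 0.0)
# Total momentum (1.0 kg*m/s) is conserved across the collision.
assert abs(2.0 * u1 + 1.0 * u2 - 1.0) < 1e-9
```

A real engine does this in 3-D with contact resolution and restitution, but the underlying conservation laws are what make the behavior “extremely realistic” to astronauts who have felt the real thing.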

Getting Physical

NASA has taken this experience even further by 3D-printing low-cost mockups of tools and control surfaces. NASA then integrates these physical mockups with next-generation graphics engines, working closely with NVIDIA to optimize performance.

For example, NASA has created another Hybrid Reality experience reproducing not just the visuals, but the physical feel of taking a lunar rover for a spin. Rover operators can drive using a joystick that exists in both the real and virtual worlds. The landscape “was generated at the large scale using real NASA heightmap data from the Apollo 14 landing zone,” Noyes says.
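
A heightmap is just a grid of elevation samples; a continuous terrain surface is typically recovered from it by interpolation. The sketch below shows the standard bilinear approach; the tiny grid is made up for illustration, not Apollo 14 data, and this is not NASA’s pipeline.

```python
# Illustrative sketch of heightmap terrain: a grid of elevation samples
# becomes a continuous surface via bilinear interpolation, the basic
# technique engines use to build terrain from elevation data.
# The 2x2 grid below is invented, not real NASA data.

heightmap = [
    [0.0, 1.0],
    [2.0, 3.0],
]

def sample_height(hmap, x, y):
    """Bilinearly interpolated elevation at fractional grid coords (x, y)."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(hmap[0]) - 1)
    y1 = min(y0 + 1, len(hmap) - 1)
    fx, fy = x - x0, y - y0
    top = hmap[y0][x0] * (1 - fx) + hmap[y0][x1] * fx
    bottom = hmap[y1][x0] * (1 - fx) + hmap[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

print(sample_height(heightmap, 0.5, 0.5))  # → 1.5, the mean of all four samples
```

Sampling between grid points this way is what lets a rover roll smoothly over terrain built from discrete elevation measurements.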

Some assets found in this lunar scene include models of Neil Armstrong, Buzz Aldrin, and the Lunar Module from NVIDIA’s own Apollo 11 Maxwell demo.

First Steps of a Long Journey

Technology like this is critical as we begin the long journey to Mars.

NASA’s path forward includes not only continuing scientific advancements aboard the International Space Station, but developing commercial cargo and crew systems for low-Earth orbit and completing the Orion capsule and Space Launch System, which will take humans deeper into the solar system than ever before.

With NVIDIA’s technology, the long journey that ends with humans setting foot on the surface of Mars will begin with astronauts training in these highly realistic, immersive Hybrid Reality environments.