Mixed reality has the potential to transform our lives, but there’s a lot that we don’t know about the technology and its impact on people and society, particularly when it comes to privacy and safety issues.

These technologies rely on constant, real-time analysis of users and the space around them. Unlike a phone or computer, you can’t just tape over the camera and still use them — without spatial data, AR/VR experiences could cause cybersickness or accidentally make users walk into walls. The sensors that are essential in making VR/AR apps work can pose a significant threat to user privacy.

That’s why privacy is an existential question for spatial computing — can we use mixed reality while maintaining privacy and agency? We need to figure this out before these devices and apps become a part of our daily lives.

{Virtual, Augmented, Mixed} Reality

What do we mean when we talk about virtual and augmented reality? In virtual reality, you’re entirely immersed in a digital world. In augmented reality, digital objects appear as if they’re in the physical world. Both AR and VR use similar underlying concepts like spatial computing and 3D elements, so we bundle them under the umbrella term “mixed reality” (XR, or sometimes MR).

Mixed reality experiences require a bunch of sensors (including cameras) to be on for extended periods of time in order to function. Mixed reality headsets are covered in sensors — not only cameras, but also infrared detectors, gaze trackers, accelerometers, and microphones.

XR is already being used for education and emergency first-responder training, and it’s going to continue to grow. Before this technology becomes as common as the smartphone, we need to think about how it can be built for safety and privacy.

Training for active shooter situations and other rare emergencies

There is a reason why flight attendants review safety procedures and schools have fire drills: when you’re prepared for an emergency, you handle it better. But drills can only prepare us for a fraction of the real experience. With XR, we can immerse emergency responders in situations that closely resemble real life. A more lifelike experience can prepare responders for a real emergency, and it can also expose gaps in existing procedures.

We know that VR can affect brain processing and psychology. Some VR applications are already being used to help treat PTSD via exposure therapy. Is it also possible to induce negative effects? This is something we need to study.

Online social spaces

Social VR experiences allow us to connect more directly with friends and family, even if they live across the world. Not only does social VR provide space for stronger connections, but it also allows you to take on a new virtual persona. You can quite literally become someone else, even a cat.

Taking on a different identity in the virtual world is appealing for many reasons, one of them being protection from potential online harassment. Yet social VR combines the worst of both in-person and virtual harassment — abusers are protected by the anonymity of the Internet, while their targets suffer more due to virtual embodiment. Getting trolled on Twitter is already unpleasant; for victims of harassment in VR, embodiment means they may feel the effects even more viscerally.

Image Credit: Mozilla

The way we interact in a virtual space is unique to each of us. While this could be used as an authentication mechanism, it also poses a fingerprinting risk — anything that can be used for authentication must be uniquely identifying. Imagine if you use a VR headset in your job, then go home and enter a social VR space to fundraise as a volunteer for a political campaign. What if your employer can link your work persona to your personal avatar, simply based on your unique motion and interactions with virtual space, and that employer disapproves of that campaign?

This type of fingerprinting allows even more severe privacy violations than current device fingerprinting methods, which use specific information about the hardware and software you’re using. Without privacy, we lack agency and the ability to fully express ourselves.
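To make the fingerprinting risk concrete, here is a minimal sketch of how even coarse motion statistics could re-identify a user across sessions. The feature set, names, and numbers are invented for illustration; real attacks would use far richer signals (gait, head-motion dynamics, controller micro-movements).

```python
# Hypothetical sketch: re-identifying users from coarse motion statistics.
# All features and data below are fabricated for illustration only.
import math

def motion_features(samples):
    """Reduce a list of (head_height_m, controller_speed_m_s) samples
    to a tiny per-user feature vector: mean height and mean speed."""
    n = len(samples)
    mean_height = sum(h for h, _ in samples) / n
    mean_speed = sum(s for _, s in samples) / n
    return (mean_height, mean_speed)

def closest_user(profile, enrolled):
    """Return the enrolled user whose stored feature vector is nearest
    (Euclidean distance) to the observed profile."""
    return min(enrolled, key=lambda name: math.dist(profile, enrolled[name]))

# "Work persona" profiles recorded earlier (fabricated numbers).
enrolled = {
    "alice": (1.62, 0.8),
    "bob": (1.80, 1.4),
}

# An "anonymous" session elsewhere still leaks the same motion signature.
session = [(1.61, 0.7), (1.63, 0.9), (1.62, 0.8)]
print(closest_user(motion_features(session), enrolled))  # -> alice
```

Even this toy two-feature model links the anonymous session back to a named profile, which is why motion data deserves the same protection as any other biometric identifier.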

Education

Education is another emerging application. Students can ride a virtual version of the Magic School Bus, exploring space without ever leaving the classroom. These lessons can lead to increased engagement, especially for children with special needs. However, we don’t know the psychological effects of immersing kids in XR.

Unlike adults, children are still learning how to distinguish between fantasy and reality, and their nervous systems are still developing. Because of this, kids are especially sensitive to potential risks in immersive experiences.

And it’s not just young children that we need to be concerned about. In the outdoor augmented reality game, Alien Contact, older students (aged 11-16 years) asked researchers if aliens had actually crashed at their school and if the researchers were FBI agents (Dunleavy et al., 2009). — Dr. Erica Southgate

How does participating in immersive experiences affect developing brains? Are there differences between the effects on children and adults? Is it possible that an immersive experience can have physical side effects after exiting? These are all questions that we don’t have answers to yet.

With these studies in mind, we need to consider both the effects on children and the privacy implications. Most societies recognize the need to treat children and adults differently when it comes to data collection and processing. XR experiences need to process data in order to work, and headsets generate large amounts of it, regardless of the user’s age. How can we enable educational opportunities while making wise choices about data collection and protecting children’s data?

Looking to the future

We must work together to make mixed reality a safe technology. As with any new technology, there’s huge potential for good, but there are also risks that go beyond the physical and mental impacts of XR immersion.

Mitigating risks is part of the process, but unfortunately, it’s a step that’s sometimes overlooked. Having conversations about the potential risks of mixed reality is already a step towards safety. Developers can consider different scenarios, such as what happens when bad actors use this technology, and then design it with additional privacy and safety measures.

This is perhaps why many prominent technologists and even companies are raising their voices in favor of more mindful technological development. The Center for Humane Technology is an independent non-profit that seeks to drive more humane development centered around the impact on mental health, the breakdown of truth in society, and digital addiction, among other concerns. The founders hail from Google, Mozilla, the CIA, Apple, and Microsoft. These are people who fight for innovation but who all recognize the need for research and responsibility.

Tony Fadell, the founder of Nest Labs and inventor of the iPod, has said, “Did we bring a nuclear bomb with information that can — as we see with fake news — reprogram people? Or did we bring light to people who never had information, who can now be empowered?” Fadell was referring to the smartphone specifically, but more generally to the unknown impact of any new technology that has the power to distort users’ reality. Mixed reality may have an even stronger impact than the smartphone, with potentially more severe consequences.

Instead of writing about our regrets again in another five years, we need to start addressing these issues now. As a concrete step, developers can use APIs that provide already-abstracted data (like the geometry of the room) instead of requesting raw camera access to analyze the area around a user. In short, they can build for privacy first. As mixed reality evolves, there will be other solutions, but to find them, engineers first need to start asking questions of themselves, of their users, and of the true purpose of their technology.
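The "abstracted data instead of raw cameras" idea can be sketched as an API boundary: raw frames stay inside a trusted platform layer, and applications only ever receive derived geometry. The `Plane` type and `detect_planes` function below are hypothetical, not any real platform's API, and the plane detection itself is stubbed out.

```python
# Hypothetical sketch of a "privacy first" API boundary: the app receives
# abstracted room geometry (planes), never raw camera pixels.
from dataclasses import dataclass

@dataclass
class Plane:
    """An abstracted surface: just enough data to place content safely."""
    center: tuple      # (x, y, z) position in metres
    extent: tuple      # (width, depth) in metres
    orientation: str   # "horizontal" or "vertical"

def detect_planes(raw_frames):
    """Runs inside the trusted platform layer. Raw camera frames stay
    here; only derived geometry crosses the API boundary to apps.
    (Actual plane detection is stubbed out for illustration.)"""
    del raw_frames  # discarded: never exposed to the application
    return [Plane(center=(0.0, 0.0, -1.0), extent=(2.0, 1.5),
                  orientation="horizontal")]

# The application only ever sees Plane objects, not camera data:
planes = detect_planes(raw_frames=["<camera frame bytes>"])
floor = [p for p in planes if p.orientation == "horizontal"][0]
print(floor.extent)  # -> (2.0, 1.5)
```

This mirrors the data-minimization pattern real platforms are moving toward, where an app that only needs to know "there is a floor here" never gets to see your bookshelf or your face.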

Diane Hosfelt is the privacy and security lead for Mozilla Mixed Reality.