A decade ago, a small computer with a camera attached to your glasses that let you view an augmented reality was still the stuff of science fiction. By 2013, the term “glasshole” had entered our collective vocabulary.

Augmented reality (commonly abbreviated as AR) is now a real thing. Google Glass brought AR into our collective consciousness, but AR has existed since the day the first smartphone shipped with a camera. Thanks to the negative press that Google’s (NASDAQ: GOOG) AR product has drawn, however, the debate about privacy and ethics in the AR world has reached a new pitch.

While the debate around Google Glass largely stemmed from its potential to create a ‘panopticon’ of sorts as head-mounted cameras become ubiquitous, there is another real, valid privacy concern with AR: the layering of data. A rogue app, for instance, could map people’s Swarm (Foursquare) check-ins in the vicinity and cross-reference them with their Facebook and Twitter profiles. Combine that with a feature to look up a person’s history of check-ins and you’ve got a serious privacy problem.
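To make the data-layering concern concrete, here is a minimal sketch of the idea. All names, services, and records below are invented for illustration, and no real APIs are called; the point is simply how trivially records from separate sources can be joined once they share a common key such as a display name:

```python
# Hypothetical illustration of data layering: all data below is made up,
# and no real service APIs are involved.

# Nearby check-ins, as a location-sharing app might expose them
checkins = [
    {"name": "Alice Example", "venue": "Cafe Luna", "time": "09:14"},
    {"name": "Bob Sample", "venue": "Main St Gym", "time": "09:20"},
]

# Publicly visible social profiles, keyed by the same display name
profiles = {
    "Alice Example": {"twitter": "@alice_ex", "employer": "Acme Corp"},
}

def layer_data(checkins, profiles):
    """Cross-reference check-ins with any profile sharing the same name."""
    layered = []
    for checkin in checkins:
        profile = profiles.get(checkin["name"])
        if profile:
            # A single match turns two innocuous datasets into a dossier:
            # where someone is right now, plus who they are online.
            layered.append({**checkin, **profile})
    return layered

for person in layer_data(checkins, profiles):
    print(person)
```

Each individual dataset looks harmless on its own; it is the join, performed here in a handful of lines with no special access, that creates the privacy problem.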

Consider the image below, from AR Lab, which shows the extreme potential of this collision between AR and privacy:

Now consider the underlying problem with this dystopia. It’s a problem not with the medium, but with the message. This dystopia is only possible because of the amount of data we share publicly, both willingly and unwillingly. Consider the profile you could compile on a person by going through what’s stored about them in various databases, public and private. Now consider the Orwellian hell you’d create by layering that data into an augmented reality system.

The solution to this problem is strong data hygiene laws that prevent data from being misappropriated and used for nefarious purposes. This will require a proactive effort on the part of the companies that store the data, as well as from users themselves. Asking ‘does this need to be shared?’ is the first step. AR should not be feared or rejected; as long as smart privacy policies and practices are in place, it should be embraced.