Augmented reality glasses like Google Glass have never had a stellar reputation among the privacy crowd. But a group of researchers believes that cyborg eyewear could actually offer a privacy upside in the form of a new kind of effortless encrypted communication—one where sensitive data is decrypted not on the screen of a vulnerable computer, but only in the eye of the recipient.

Researchers at the University of North Carolina have developed an experimental system of so-called "visual cryptography" designed to communicate secret messages to the wearer of an augmented reality headset. In the system they created and tested, information is encrypted in what looks like a random field of black-and-white static. But when the recipient's augmented reality glasses superimpose another random-seeming image on their vision, the two images combine to form a readable message.

Courtesy Sarah J. Andrabi, Michael K. Reiter, Cynthia Sturton

That system could, for instance, allow someone to unscramble encrypted text in a way that couldn't be spied on by an over-the-shoulder snoop, since the text is never decrypted on the reader's screen. Or it could be used to overlay a keypad with randomized numbers onto an ATM's display, so that no one watching could learn the bank customer's PIN as they typed it. "When you overlay the secret visual share, only you can see the final message," says UNC researcher Sarah Andrabi, using the technical term "visual share" to refer to each of the two indecipherable images that add up to a message. "That secret is now only for the user’s eye."
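The randomized-keypad idea can be sketched in a few lines. This is a hypothetical illustration, not the researchers' code: the headset privately shows the wearer a shuffled layout, the ATM's physical keys stay blank, and only the bank (which knows the layout) can turn the pressed positions back into digits.

```python
import random

def randomized_keypad():
    """Randomly assign the digits 0-9 to the ten keypad positions.

    The headset would privately display this layout to the wearer;
    an over-the-shoulder observer sees only which blank positions
    are pressed, not which digits they stand for.
    """
    digits = list(range(10))
    random.shuffle(digits)
    return digits  # digits[pos] = digit shown at keypad position pos

def decode_presses(layout, pressed_positions):
    """What the ATM, which knows the layout, reads from the key presses."""
    return [layout[pos] for pos in pressed_positions]
```

Since the layout is reshuffled for every transaction, watching the positions pressed on one visit reveals nothing about the PIN typed on the next.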

In their study, which the researchers presented at the Symposium on Usable Privacy and Security last month, they first tried having people decipher characters written in braille from two overlaid images, one on a computer monitor and the other on a Google Glass headset. While people were generally able to decipher the combined braille character from those two images, the researchers found that Glass's tiny lens was too limiting, and so switched to using an Epson Moverio, another Android-powered augmented reality headset with larger screens for both eyes.


They then showed test subjects a series of images that looked like collections of black and white pixels, and asked them to decipher the images by looking through the Moverio headset, cheating slightly by having people rest their heads in a stabilizing frame. When the headset overlaid a second set of black and white pixelated images, every pixel that didn't match the one beneath it became part of a letter or number, and the subjects were able to reliably read the "secret" characters that emerged. See an example of those two images and the final resulting character below.
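The overlay rule the subjects relied on amounts to an XOR-style visual secret sharing scheme: one share is pure random noise, the second is the first share flipped wherever the secret image is black, so the shares disagree exactly along the hidden character. A minimal Python sketch (the function names and square-grid layout are illustrative, not taken from the paper):

```python
import random

def make_shares(secret, size):
    """Split a binary secret image into two random-looking shares.

    secret: a size-by-size grid of 0/1 pixels (1 = part of the hidden
    character). Each share on its own is indistinguishable from noise;
    the shares differ exactly where the secret pixel is 1.
    """
    share1 = [[random.randint(0, 1) for _ in range(size)] for _ in range(size)]
    # share2 = share1 XOR secret: equal where secret is 0, flipped where 1
    share2 = [[share1[r][c] ^ secret[r][c] for c in range(size)]
              for r in range(size)]
    return share1, share2

def overlay(share1, share2):
    """Recover the secret: a pixel is 'on' wherever the two shares disagree."""
    return [[a ^ b for a, b in zip(row1, row2)]
            for row1, row2 in zip(share1, share2)]
```

In the study the second share lives only on the headset and the combination happens in the wearer's vision, but the underlying math is this pixel-wise comparison.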

That augmented reality encrypted communication system worked, if slowly: Out of 30 test subjects, 26 were able to read every character with 100 percent accuracy, and the other four achieved 80 percent accuracy. But each character took the subjects a median of 8.9 seconds to decipher. That likely keeps their visual crypto scheme a theoretical curiosity rather than a practical tool, at least for now.

The researchers, however, claim that their system could someday solve a more serious issue eternally plaguing encryption: that the computer decrypting a secret message can be hacked or otherwise compromised by eavesdroppers. With their visual cipher system, they argue, only a person's eyes and brain perform that decryption, and the decrypted text never appears on a vulnerable computer. Not even the augmented reality headset itself must be trusted; the headset wearer sees the final secret message, but the augmented reality device only sees an indecipherable component of the image. "Even if the device you’re using is compromised, it still won’t know anything because it's not actually doing the decryption for you," says Andrabi.

The researchers admit, though, that this claim of not needing to trust the headset itself has serious exceptions. First, the augmented reality headset would have to have any front-facing camera disabled or covered. Otherwise it could see everything the user sees, and reassemble the same secret message. And then there's the problem of how the overlaid images stored on the headset, which essentially function as decryption keys, are safeguarded. If the headset is connected to the Internet—or if the images are shared over an insecure communications channel to get to the headset in the first place—then they could be compromised by an eavesdropper and used along with intercepted messages to decrypt the wearer's secrets. "There are a lot of caveats here," says Johns Hopkins computer science professor Matthew Green. "If [eavesdroppers] get both the image you're looking at and the image from your headset, they’ll decrypt the message, and they’ll win."

Still, UNC's Andrabi argues that as augmented reality hardware becomes more mainstream—she points to the potential of Microsoft's much-hyped HoloLens—privacy advocates should look beyond the "glasshole" surveillance problem such devices represent and think about their broader potential to build security into a new kind of user interface. "As much as it presents challenges in privacy," she says, "augmented reality also gives us these new opportunities we can go out and explore."

Read the researchers' full paper below.

Usability of Augmented Reality for Revealing Secret Messages to Users but Not Their Devices