Thad Starner, a professor at the Georgia Institute of Technology, USA, has been in the augmented-reality (AR) game longer than most—in fact, he coined the phrase in 1990. In a Wednesday afternoon session at FiO+LS 2019, Starner shared his views on the social impact of augmented-reality and virtual-reality (VR) technologies.

Talk the talk, walk the walk

Starner stepped on stage sporting Google Glass, a project on which he has served as a technical lead since its inception. He first donned a wearable computer in 1993 and has used one continuously ever since—an expert in the field by any standard.

Starner kicked off the talk with a brief rundown of AR/VR tech and its history, pointing out that hyped-up, high-end demonstrations of this technology rarely coincide with real-world use. To illustrate his point, he cited the example of Pokémon Go—the ubiquitous location-based AR mobile game is relatively simple, but has grossed billions.

Micro-interactions

Starner argues that it’s the AR/VR applications we can integrate into our daily lives that have the most social impact. The key to a successful AR device, he says, is cutting the time between intention and action to two seconds or less; any longer, according to Starner, and people are far less likely to use the device.

This is why the products that make the most money tend to cut down on the time it takes to complete a “micro-interaction.” These are the simplest things you can do with these displays, like telling the time, checking traffic directions, reading an email—basically all the things a cellphone can do.

To underscore this, Starner asked attendees to check the time and then raise their hands; everyone in the room did so in less than two seconds. Then he asked the audience to raise their hands once they had looked up where Mt. Pinatubo is located. Only four people raised their hands—and no one else bothered trying. “The time between your intention and your action,” he said, “inhibits whether or not you use a device.”

Face-to-face

Next, Starner discussed how wearable AR devices can benefit face-to-face interactions. As a high-level manager and a professor, he spends most of his time in spoken conversation. Yet, he said, “we have very little computer support for these interactions.” Why? Because pulling out a phone or laptop mid-conversation is rude—it puts up a barrier between you and the people around you.

Wearable AR devices can assist in these interactions as well. For example, while delivering a lecture to a classroom, Starner uses Glass to prompt his transition sentences and to control his slides. For face-to-face meetings, Starner and his class developed a program called Facecard that provides basic personal information, such as education or employer, for each member of the meeting.

Not only does this save time, it improves the human interaction. For example, doctors equipped with wearable computers can focus on interacting with and listening to their patients while still having immediate access to patient history in their line of sight. In another example, researchers outfitted Google Glass to pair with a user’s hearing aid and caption the user’s conversations.

For the skeptics who think that the AR glasses would be distracting or unnerving during human interaction, Starner cited a study where an interviewer read cues from a screen placed behind a human subject, and the researchers timed how long it took for the subject to realize something was off. Consistently, even after 16 seconds of the interviewer reading from the screen and not looking at the subject, the subject rated the interviewer as engaged in the conversation. “It’s possible to integrate this tech without people noticing,” said Starner.

Improving and saving lives

One interesting impact of this technology is improving accessibility. For example, wearable computers can be used to read the affect on people’s faces, which is helpful for those with autism spectrum disorder who may struggle with reading social cues. “It’s another tool in the toolbox,” Starner said, “not that it’s the right one every time.”

Starner also cited his work with first responders and using wearable computers for immediate, glanceable information in life-or-death situations. For example, a firefighter could say “Glass, show extraction diagram for a Ford Expedition,” and it would show them how to safely extract a person from a burning vehicle.

In a Q&A session at the end of the talk, Starner acknowledged that AR displays could also have negative social impacts. It’s feasible, he said, that the technology could be used to coordinate against people—during a burglary, for example. “Any tech can be used for good or bad; it’s about encouraging good behavior.”

All in all, Starner views the road ahead for wearable AR devices through a rosy lens. “In the future,” he said, “we’re going to see social barriers being broken down because of this.”