The potential uses of a pair of smart glasses are easy to fixate on: simply glance to one side at the tiny screen and check your e-mail, make sure you’re walking in the right direction according to your maps app, see what the weather will be like later in the afternoon. But some realities about the glasses have yet to be addressed—for one, what we all might look like using these things.

We know generally what a human wearing these glasses (or at least the first generation of them) will look like thanks to Google and other emerging smart glasses companies. We know what the glasses can do. But we've neglected to consider how different we might look to the people around us while we watch those little screens do what they do.

During the week of CES, I visited the booth of Vuzix, a company that plans to release smart glasses that interface with your smartphone to let you view the display via a tiny screen clipped to your head. Eventually, the company hopes to run Android apps directly from the headset itself.

The design isn’t as sleek and minimal as that of Google Glass, which still isn’t due to reach customers’ hands for another year. But Vuzix plans to price its version at less than one third of the only price point Google has put forward ($1,500).

While at the booth, I asked a representative to put on the M100 smart glasses and focus on me, like we were having a conversation.

Then I asked him to focus on the screen inside the glasses. The result:

After that initial moment, he went back to looking straight ahead, but for an instant he had attempted to focus on the screen at one side of his face with both eyes. The look in the second picture above is due to “convergence,” where our eyes attempt to keep an object in focus by working together to maintain single binocular vision. A user probably wouldn’t always look like this while glancing over at the tiny screen, but at best it would take some practice to shift focus without apparently going cross-eyed.

When I tried out the headset, the screen was a bit difficult to see (for some reason, the demo units weren’t attached to a headband or any other structure for affixing them to my head, so I had to hold the earpiece up to my ear and pretend). The display seemed to float out in space several inches beyond where the actual glass was—appropriate, since it’s designed to look like a 4-inch screen held at 14 inches.
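For the curious, that “4-inch screen held at 14 inches” equivalence translates into a fairly modest slice of your visual field. A quick back-of-the-envelope check (assuming the quoted 4 inches is the screen’s diagonal):

```python
import math

# Apparent angular size of a flat object of size s viewed head-on
# at distance d: theta = 2 * atan(s / (2 * d))
def visual_angle_deg(size_in, distance_in):
    return math.degrees(2 * math.atan(size_in / (2 * distance_in)))

# Vuzix's stated equivalence: a 4-inch screen held 14 inches away
angle = visual_angle_deg(4.0, 14.0)
print(f"{angle:.1f} degrees")  # about 16 degrees across the diagonal
```

That's roughly the span of your outstretched fist and thumb at arm's length—big enough to read, small enough to explain why the eye has to deliberately converge on it.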

The screen was noticeably low-resolution (WQVGA), and the colors weren’t great. Besides gazing into the virtual abyss of the headset, I also couldn’t do much with the screen. As it stands now, the headset has to interface with an Android handset and mirrors whatever is happening on the handset to the headset’s tiny screen. There are two navigation buttons and a power button embedded in the top of the earpiece, but to get any real navigation or selection done, I’d have to look down at the connected smartphone, which all but negates the value of having a screen set right in front of my eye.



Ultimately, a screen you can’t interact with will be better suited to monitoring than management—for instance, keeping an e-mail app open to flick your eyes over and see if that message you’re waiting for has arrived, or, as we mentioned, following mapping directions. But battery life will cut this short: currently, Vuzix’s smart glasses are pegged at 2 hours of continuous “on” use per charge, or 8 hours of “typical” off-and-on use.

Smart glasses are still an interesting concept, but we have our doubts that they will ever fit into the lifestyle of mainstream consumers. Certainly, at this early stage, a short battery life and a clunky look won’t help these glasses. We have even more doubts about the way they look on a person. As we understand it, smart glasses are meant to be worn all the time, not unlike how Bluetooth headsets never leave the ears of the habitual businessperson.



But Bluetooth headsets worn in that capacity have presented a near-insurmountable social hurdle: bystanders and conversation partners can’t always tell who the wearer is talking to without explicitly asking. Smart glasses may present the same problem once we all learn to focus on them: are you talking to me or reading your Twitter feed? And if your conversation partner’s eyes try to converge on the screen, it’s going to make for an awkward moment during your chat when their eyes, ever so briefly, cross while they check the status of their e-mail or social networks. It’s the equivalent of glancing down at the smartphone in your lap, but so much more visually jarring.

We may be worrying about these issues too early: at around $500, these glasses will be hard to justify even for the most dogged of early adopters, so we’re hardly in for an unfocused-gaze-during-conversation pandemic. But it’s funny to think that we may be on the cusp of developing a new technological response twitch: the flick of the wrong eye toward a tiny head-mounted display.