As rumors swirl around Google's plans to announce head-up display glasses by the end of the year, the company has quietly begun advertising for a designer and engineer responsible for augmented-reality mapping.

The positions include a "special projects" front-end software engineer and a designer for local, mobile and social apps. Both job descriptions list augmented-reality mapping as a top responsibility. The designer position requires the ability to "integrate mobile platforms, augmented reality mapping, geo-location, and real-time interaction."

There's no evidence that these positions have anything to do with the rumored head-up display (HUD) glasses, but the timing is interesting.

Last week, a New York Times article reported that Google is working on a pair of HUD glasses to be released later this year. The glasses would look similar to Oakley Thump sunglasses, and would provide augmented-reality data overlays about landmarks and even passers-by right in front of a user's eyes.

The story provided a provocative glimpse into the future of augmented reality, but raised more questions than it answered. How will the glasses actually work? And can Google convince the public that HUD glasses and augmented reality are more than just niche technologies?

We talked with Blair MacIntyre, director of the Augmented Environments Lab at Georgia Tech, about the rumored glasses, and he got right to the point: "They will have to do a lot for people to consider buying them."

How Might They Work?
--------------------

According to the Times report, information will be displayed on a "small screen that will sit a few inches from someone’s eye." A low-resolution camera will monitor the real world and "overlay information about locations, surrounding buildings, and friends who might be nearby." The glasses will allegedly work with a user's Android device, and will ship with a 3G or 4G data connection (that's right: get ready for yet another data bill).

The description above suggests two divergent (and conflicting) paths to information display. The reference to a "small screen that will sit a few inches from someone’s eye" evokes comparisons to Recon Instruments' MOD Live snow goggles. These have normal lenses, and HUD information is delivered via a small, discrete LCD screen that sits at the bottom of the right-hand lens. To see the screen, the user merely looks down and refocuses his attention. This isn't an "overlay" feature by any means, but it is a potentially safe (if also somewhat conventional) approach to HUD glasses.

The Recon Instruments snow sports goggles have a "micro LCD" screen at the bottom of the lower-right lens that provides a small window of HUD data. Source: Recon Instruments

But there's another, much more sci-fi possibility (and it too was evoked in the Times article): All the augmented-reality data would be displayed directly in the lenses of the glasses. The lenses would allow the user to simultaneously see the world at large as well as graphical overlays. This approach presents a manufacturing challenge: How do you deliver augmented-reality overlays via a lens that must itself remain at least semi-transparent?

MacIntyre says, "The problem with transparency is how to do it in a way that actually works in lots of situations." Nonetheless, he notes that transparency could be accomplished in two different ways.

First, the lenses could take the form of semi-transparent displays that allow users to see directly through them as they would with regular glasses. Integrated with LCD or OLED display technology, the lenses would allow both ambient light and projected light to reach a user's eyes. MacIntyre said such a system bears the characteristics of a two-way mirror.

Because of their integrated display layer, the lenses would have a slight tint to them, and wouldn't be completely transparent. This basic technology can currently be seen in the Samsung "smart window display" concept as well as the Haier Designer Transparent TV.

MacIntyre cautions that this approach has drawbacks: As a HUD user moves from one environment to another, changes in ambient lighting directly affect the visibility of the data overlay. "The problem with the [integrated display] approach is that the blend of outside light and display light is kind of fixed," he says. "For the blending to work, there has to be an optimal amount of light coming in from the world. So if the world is very dark, you're not going to see much of it – but you'll see display. If the world is very bright, you're not going to see much of the display – you're going to see mostly world."

In order to alleviate these problems, Google would need to implement a feature that measures ambient lighting and adjusts the display overlay's brightness to ensure visibility. Augmented-reality glasses that require the user to stay within a narrow range of ambient light are fine for industrial and medical uses, MacIntyre says, but wouldn't excite many consumers.
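To make MacIntyre's point concrete, here is a minimal sketch, in Python, of the kind of ambient-light compensation loop such glasses would need. Everything in it (the sensor reading, the lens transmittance, the contrast target) is an illustrative assumption, not anything Google has described.

```python
# Hypothetical sketch of ambient-light compensation for a
# semi-transparent HUD. All constants are illustrative.

LENS_TRANSMITTANCE = 0.7   # fraction of world light passing the tinted lens
TARGET_CONTRAST = 1.5      # desired ratio of overlay luminance to background
MAX_DISPLAY_NITS = 400.0   # upper limit of the embedded display

def overlay_brightness(ambient_nits: float) -> float:
    """Pick a display luminance that keeps the overlay readable.

    The viewer sees the sum of transmitted world light and emitted
    display light, so for the overlay to stand out its luminance
    must scale with whatever the world contributes behind it.
    """
    background = ambient_nits * LENS_TRANSMITTANCE
    desired = background * TARGET_CONTRAST
    # Clamp to hardware limits: in bright sunlight the display
    # simply can't keep up -- exactly the failure mode MacIntyre
    # describes ("you're going to see mostly world").
    return min(desired, MAX_DISPLAY_NITS)

if __name__ == "__main__":
    for ambient in (10.0, 500.0, 10_000.0):  # dim room, office, sunlight
        print(f"ambient {ambient:>8.0f} nits -> display "
              f"{overlay_brightness(ambient):.0f} nits")
```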

But there's also a second possibility for overlaying graphical data on top of what we observe in the field of view: lasers.

MacIntyre says a virtual retinal display (VRD) could use lasers to draw images directly onto the retina of the eye. In fact, MicroVision is currently working on this technology for consumer wearable devices. Because the graphical overlays would be visible "in-eye," so to speak, there wouldn't be any concern about changes in ambient light conditions.

Then again, convincing consumers that it's safe to shoot lasers into their eyes might prove the trickier challenge.

Regardless, MacIntyre believes that Google could bring some type of visual overlay glasses to market in the rumored $250 to $600 price range if it's willing to eat the cost of R&D, and subsidize the price of the glasses. Google is already rumored to be working on optics technology, and will continue that research in its new secret lab.

Will They Be Safe?
------------------

"It doesn't just matter where your eyes are. It matters where your brain is focused," says Adam Grazzaley, Associate Professor of Neurology, Physiology and Psychiatry at UCSF. Grazzaley studies neural mechanisms of memory and attention, and finds it concerning that users might wear HUD glasses while attempting other tasks – like, say, walking.

"Our ability to engage in goal-directed behavior is very sensitive to interference from our environment," Grazzaley told Wired. Moreover, said Grazzaley, users should be concerned about layering ever more complex stimuli and tasks on top of activities they're already engaged in. Walking and especially driving demand a lot of focused attention – if only because the results are so serious when accidents occur.

Gazzaley noted that even if Google were to forgo placing HUD displays directly inside the lenses – instead opting for a discrete display a la the Recon Instruments approach – there could still be problems. "Just because it's not right in your field of view doesn't mean it can't have a distraction effect," he said.

Pranav Mistry, an MIT Media Lab researcher and one of the inventors of the SixthSense wearable computing system, pointed to a fundamental flaw in augmented reality: "The human eye cannot focus the same on two levels," he told Wired. "Having something overlaid in your eye – over the top of your eye – you cannot focus on the background at the same time. For example, if you want to augment something on an object that is far away from you, your eye has to keep changing focus."
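Mistry's objection is easy to quantify: the focus (accommodation) the eye must supply is the reciprocal of the viewing distance in meters, so a screen a few inches away and a distant landmark sit at wildly different focal demands. The numbers below are a back-of-the-envelope illustration only; real HUD optics would presumably use collimating lenses to push the virtual image farther out, but the two focal planes still can't be held in focus at once.

```python
# Back-of-the-envelope: the eye's focus demand, in diopters,
# is the reciprocal of viewing distance in meters.

def focus_demand_diopters(distance_m: float) -> float:
    """Accommodation needed to focus at a given distance."""
    return 1.0 / distance_m

screen = focus_demand_diopters(0.08)    # a display ~3 inches from the eye
landmark = focus_demand_diopters(50.0)  # a building down the street

print(f"screen: {screen:.1f} D, landmark: {landmark:.2f} D")
# screen: 12.5 D, landmark: 0.02 D -- the eye cannot hold both
# focus states at once, which is Mistry's point.
```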

Will They Actually Sell?
------------------------

Putting aside technical feasibility and even public safety, Google's rumored glasses still face a daunting challenge: commercial viability. Are these HUD glasses something consumers even want? According to Forrester analyst Sarah Rotman Epps, wearable devices are one of the top five computing form factors to watch. She believes that the first generation of Google's HUD glasses wouldn't sell in any significant volume, but would still get developers and consumers thinking about wearable products.

"Google wants to spur innovation," she said, "and humanity will work out the essential issues over the long run."