If you want to build better head-mounted displays for augmented reality (AR), Moore’s law won’t hold you back—but the law of etendue will, a member of Microsoft’s AR research team told attendees on the last day of Frontiers in Optics+Laser Science 2019.

Bernard Kress, partner optical architect for the computer-software company, assured his audience that the informal “law” that has governed the roadmap of silicon-chip development for more than half a century will not keep optical AR hardware from becoming smaller and lighter over time. And while the basic physics of light puts limits on the optical design, researchers can still shrink their AR, virtual-reality (VR) and mixed-reality (MR) systems by using hybrid optical lenses and by designing ever more complex optical pipelines.

Early in the 2010s, Kress was the principal optical architect of Google Glass, the tiny head-mounted display that almost resembled spectacles. In 2015 he jumped to Microsoft, where he has been working on the first two generations of the company’s HoloLens mixed-reality smartglasses.

From Moore to etendue

Moore’s law, of course, is the 1960s-era prediction by Intel Corp. co-founder Gordon Moore that the number of transistors on a silicon chip doubles every 18 to 24 months. If that law held true in optics—in other words, if the size of optical components were halved every 18 months without any loss of functionality—a lens that was 10 mm wide in 2008 would have shrunk to perhaps 50 μm by today. That isn’t in the cards.
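The back-of-the-envelope scaling behind that estimate is easy to check; the 18-month halving period and the 10-mm, 2008 starting point come from the paragraph above, and the exact endpoint depends on the halving period assumed:

```python
# Sketch of the Moore's-law-style shrinkage estimate from the text:
# a 10-mm-wide lens in 2008, halved in size every 18 months.
years = 2019 - 2008                 # from 2008 to the year of the talk
halvings = years / 1.5              # one halving per 18 months
width_um = 10_000 / 2 ** halvings   # 10 mm expressed in micrometers
print(f"{width_um:.0f} um")         # on the order of the ~50-um figure in the text
```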

Rather, optical scientists must abide by the law of etendue, which states that the product of the beam diameter and the beam angle is a constant. In an analogy to the second law of thermodynamics, etendue can never be diminished, only maintained or increased. For wearable displays, the product of the display size and the numerical aperture of the optical engine (the lenses, reflectors, beam splitters and the like) equals the size of the eyebox times the sine of the field-of-view angle. (The eyebox is the 3-D space in which the user’s eyes can receive the AR image.)
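As a rough numerical illustration of that trade-off (the eyebox width, field of view and display size below are assumptions chosen for illustration, not the specifications of any actual headset):

```python
import math

# Etendue constraint as stated in the text (one-dimensional form):
#   display_size * NA = eyebox_size * sin(field-of-view angle)
# All numbers here are assumed, illustrative values.
eyebox_mm = 10.0                                  # desired eyebox width
fov_deg = 40.0                                    # desired field of view
budget_mm = eyebox_mm * math.sin(math.radians(fov_deg))

display_mm = 12.7                                 # assumed half-inch microdisplay
na_needed = budget_mm / display_mm
print(f"display * NA must reach {budget_mm:.1f} mm, so NA >= {na_needed:.2f}")
```

Shrinking the display or its optical engine without giving up eyebox size or field of view would violate this conservation law—which is why more complex optics, rather than simple miniaturization, drive the design roadmap.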

Creating a wearable headset

According to Kress, AR engineers must weigh three overarching factors: the wearer’s physical comfort, the wearer’s visual comfort across close-up and far-away imagery, and a sense of multisensory immersion.

Kress took his audience on a whirlwind tour of the optical components, developed over the past half-decade, that go into a wearable AR headset. In their quest to slim down and speed up the hardware, researchers have experimented with convex and Fresnel lenses, LED and liquid crystal on silicon (LCOS) displays, and flat and curved combiner optics (the latter called “birdbaths”). The AR and VR industry has not standardized on any one type of architecture or technology, and startups are jumping in and out of the marketplace. Indeed, a few of the companies whose products Kress listed when he began to compile his FiO+LS presentation went out of business before he finished the slide deck.

The second and most recent incarnation of the HoloLens, released earlier this year, resembles a sturdy plastic headband, with a goggles-type display up front and a rear-mounted electronics package about the size of a smartphone. The design distributes the mass of the AR system and brings it closer to the center of gravity of the wearer’s head, which, according to Kress, is a major factor in gaining user acceptance. (No one wants to spend a full day wearing gear that has all its weight on one side of the head and creates literal pain in the neck.)

Don’t expect to see the next version of the HoloLens at your holiday party. Kress said that Microsoft is developing the third generation exclusively for defense and enterprise customers, who may tolerate somewhat larger and heavier hardware as part of their jobs. Microsoft forecasts no great consumer demand for the still-cumbersome gear, but that may change in a few years, he added.