In a ritual I’ve undertaken at least a thousand times, I lift my head to consult an airport display and determine which gate my plane will depart from. Normally, that involves skimming through a sprawling list of flights to places I’m not going. This time, however, all I see is information meant just for me:

Hello Harry

Flight DL42 to SEA boards in 33 min

Gate C11, 16 min walk

Proceed to Checkpoint 2

Stranger still, a leather-jacketed guy standing next to me is looking at the same display at the same time—and all he sees is his own travel information:

Hello Albert

Flight DL11 to ATL boards in 47 min

Gate C26, 25 min walk

Proceed to Checkpoint 4

Okay, confession time: I’m not at an airport. Instead, I’m visiting the office of Misapplied Sciences, a Redmond, Washington, startup located in a dinky strip mall whose other tenants include a teppanyaki joint and a children’s hair salon. Albert is not another traveler but rather the company’s cofounder and CEO, Albert Ng. We’ve been play-acting our way through a demo of the company’s display, which can show different things to different people at one time—no special glasses, smartphone-camera trickery, or other intermediary technology required. The company calls it parallel reality.

The simulated airport terminal is only one of the scenarios that Ng and his cofounder Dave Thompson show off for me in their headquarters. They also set up a mock store with a Pikachu doll, a Katy Perry CD, a James Bond DVD, and other goods, all in front of one screen. When I glance up at it, I see video related to whichever item I’m standing near. In a makeshift movie theater, I watch The Sound of Music with closed captions in English on a display above the movie screen, while Ng sits one seat over and sees Chinese captions on the same display. And I flick a wand to control colored lights on Seattle’s Space Needle (or, for the sake of the demo, a large poster of it). At one point, just to definitively prove that their screen can show multiple images at once, Ng and Thompson push a grid of mirrors up in front of it. Even though they’re all reflecting the same screen, each shows an animated sequence based on the flag or map of a different country.

Judged purely as a technological magic trick, parallel reality is one of the most freakishly unexpected feats I’ve seen in years. Misapplied Sciences has been quietly working on it for half a decade, largely in stealth mode; when Ng tells me that one of his startup’s business challenges is the widespread impression that what it’s doing is fantasy, I believe him. “Most people, even if they believe that this is possible, think it’s 10 to 20 years away,” he says.

The potential applications for the technology—from outdoor advertising to traffic signs to theme-park entertainment—are many. But if all goes according to plan, the first consumers who will see it in action will be travelers at the Detroit Metropolitan Airport. Starting in the middle of this year, Delta Air Lines plans to offer parallel-reality signage, located just past TSA, that can simultaneously show almost 100 customers unique information on their flights, once they’ve scanned their boarding passes. Available in English, Spanish, Japanese, Korean, and other languages, it will be a slicked-up, real-world deployment of the demo I got in Redmond.

Even before parallel reality lands in Detroit, it will make an early appearance this week at CES in Las Vegas. Delta’s booth features a preview of the airport-info application along with other exhibits about the technology. Delta CEO Ed Bastian and Misapplied’s Ng are showing off parallel reality as part of Delta’s keynote at the gadget exhibition on Tuesday morning. Also part of Delta’s CES showing: improvements to its Fly Delta app (such as the built-in ability to summon a Lyft to take you to the airport), a partnership to test exoskeletons for Delta employees whose jobs involve heavy lifting, and the use of machine learning to help minimize schedule disruptions during bad weather.

The fact that Delta is at CES to talk about parallel reality and other technological initiatives is a statement in itself. Judged by a bunch of typical metrics, the airline is thriving: it’s the most profitable U.S. carrier, with the best on-time record. The company is optimistic enough about its future that it’s investing billions in ambitious projects such as an all-new terminal at New York’s LaGuardia.

But Delta says that it’s after something loftier than mere timeliness and financial success. “We’re looking to create a truly trusted and loved consumer brand in our customer’s mind, something that the airlines historically had not been known for,” says Bastian, who joined Delta in 1998 and became its CEO in 2016. “Travel is something that, unfortunately, in our country we have come to endure rather than to look forward to. Our goal at Delta is to make certain that travel is something that you look forward to as much as the destination you’re traveling to. And we think technology is going to be a key enabler of that.”

Parallel reality will have to evolve further to play a crucial role in that transformation. But if the Detroit installation is anything like the sneak peek I got, some minds will be blown along the way.

Birth of a brainstorm

The parallel-reality story begins not at Misapplied Sciences but at a different, slightly larger company that’s also based in Redmond: Microsoft. At a January 2014 hackathon, a researcher named Paul Dietz came up with an idea to synchronize crowds in stadiums via a smartphone app that gave individual spectators cues to stand up, sit down, or hold up a card. The idea was to “use people as pixels,” he says, by turning the entire audience into a giant, human-powered animated display. It worked. “But the participants complained that they were so busy looking at their phones, they couldn’t enjoy the effect,” Dietz remembers.

That led him to wonder if there was a more elegant way to signal individuals in a crowd, such as beaming different colors to different people. As part of this investigation, he set up a pocket projector in an atrium and projected stripes of red and green. “The projector was very dim,” he says. “But when I looked into it from across the atrium, it was this beautiful, bright, saturated green light. Then I moved over a few inches into a red stripe, and then it looked like an intense red light.” Based on this discovery, Dietz concluded that it might be possible to create displays that precisely aimed differing images at people depending on their position.

Later in 2014, that epiphany gave birth to Misapplied Sciences, which he cofounded with Ng—who’d been his Microsoft intern while studying high-performance computing at Stanford—and Thompson, whom Dietz had met when both were creating theme-park experiences at Walt Disney Imagineering. (Dietz became Misapplied’s chairman and CTO, but left the company last year.)

Meanwhile, as Misapplied Sciences was busy developing parallel reality, Delta was founding the Hangar, a corporate innovation center located at Georgia Tech’s Tech Square in midtown Atlanta. It launched in 2016, along with another in-house tech group, Delta Flight Products, which focuses on developing custom systems such as the airline’s wireless in-flight entertainment system.

Delta charged the Hangar with finding inventive ways to address customers’ biggest pain points, and research showed that some of the gnarliest ones occurred not in the air but on the ground. “The airport can be a very stressful environment,” explains Nicole Jones, the Hangar’s global innovation leader. “It’s unfamiliar. There may be language barriers.” (Not every traveler is equipped with a smartphone to check flight details, and even some of those who are also consult airport monitors for information—or at least I frequently do.)

As part of its effort to improve its airport experience, Delta was on the lookout for startups with technologies that might help. Around a year and a half ago, Jones says, “we heard about this company in stealth mode. And we had to see it for ourselves.” That company was Misapplied Sciences. And once Delta witnessed parallel reality, it was so smitten that it not only hatched a partnership for the Detroit display project but also participated in Misapplied’s $8 million Series A funding round in 2019.

The technology has come a long way since Dietz’s pocket-projector experiment, but the basic principle—directing different colors in different directions—remains the same. With garden-variety screens, the whole idea is to create a consistent picture, and the wider the viewing angle, the better. By contrast, with Misapplied’s displays, “at one time, a single pixel can emit green light towards you,” says Ng. “Whereas simultaneously that same pixel can emit red light to the person next to you.” In one version of the tech, it can control the display in 18,000 directions; in another, meant for large-scale outdoor signage, it can control it in a million.

The company has engineered display modules that can be arranged, Lego-like, in different configurations that allow for signage of varying sizes and shapes. A Windows PC performs the heavy computational lifting, and there’s software that lets a user assign different images to different viewing positions by pointing and clicking. As displays reach the market, Ng says that the price will “rival that of advanced LED video walls.” Not cheap, maybe, but also not impossibly stratospheric.

For all its science-fiction feel, parallel reality does have its gotchas, at least in its current incarnation. In the demos I saw, the pixels were blocky, with a noticeable amount of space around them—plus black bezels around the modules that make up a sign—giving the displays a look reminiscent of a sporting-arena electronic sign from a few generations back. They’re also capable of generating only 256 colors, so photos and videos aren’t exactly hyperrealistic. Perhaps the biggest wrinkle is that you need to stand at least 15 feet back for the parallel-reality effect to work. (Venture too close, and you see one mishmashed image.)

The version of the airport screen debuting this week at CES is an upgrade over the one I got in Redmond—in color, with much crisper typography—and Ng says that there’s plenty of runway for further refinement. For instance, 24-bit color—capable of rendering 16 million hues—is in the works. “With each subsequent generation, we’re significantly improving the visual quality,” he says. “Things like pixel density, brightness, color reproduction, the bezel thickness.”

The privacy factor

Regardless of how good Misapplied’s displays get, their ability to direct different images to multiple positions in physical space doesn’t enable Delta’s travel-information experience by itself. The other part of the equation is figuring out which traveler is standing where, so people see their own flight details. Delta is accomplishing that with a bit of AI software and some ceiling-mounted cameras. When you scan your boarding pass, you get associated with your flight info—not through facial recognition, but simply as a discrete blob in the cameras’ view. As you roam near the parallel-reality display, the software keeps tabs on your location, so that the signage can point your information at your precise spot.

Delta is taking pains to alleviate any privacy concerns relating to this system. “It’s all going to be housed on Delta systems and Delta software, and it’s always going to be opt-in,” says Robbie Schaefer, general manager of Delta’s airport customer experience. The software won’t store anything once a customer moves on, and the display won’t show any highly sensitive information. (It’s possible to steal a peek at other people’s displays, but only by invading their personal space—which is what I did to Ng, at his invitation, to see for myself.)
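To make the mechanism concrete, here is a minimal, purely illustrative sketch of the core idea as the article describes it: each pixel quantizes its emission directions into discrete bins (18,000 in one version of the tech), and a controller assigns a color per bin based on where the tracking system says each viewer is standing. Every name, number, and formula below is an assumption for illustration, not Misapplied’s or Delta’s actual software.

```python
import math

NUM_BINS = 18_000  # direction bins per pixel, the figure cited in the article


def direction_bin(pixel_xy, viewer_xyz, num_bins=NUM_BINS):
    """Map the pixel->viewer ray to a discrete direction bin.

    Hypothetical model: bins laid out on an azimuth x elevation grid
    covering the hemisphere in front of the display.
    """
    dx = viewer_xyz[0] - pixel_xy[0]
    dy = viewer_xyz[1] - pixel_xy[1]
    dz = viewer_xyz[2]                  # distance out from the display plane
    azimuth = math.atan2(dx, dz)        # horizontal angle to the viewer
    elevation = math.atan2(dy, dz)      # vertical angle to the viewer
    side = int(math.isqrt(num_bins))    # e.g. a ~134 x 134 grid of bins
    a = min(side - 1, int((azimuth / math.pi + 0.5) * side))
    e = min(side - 1, int((elevation / math.pi + 0.5) * side))
    return e * side + a


def colors_for_pixel(pixel_xy, viewers):
    """viewers: list of (xyz_position, rgb_color) pairs from the tracker.

    Returns {direction_bin: color} -- one pixel simultaneously aiming a
    different color at each tracked viewer.
    """
    return {direction_bin(pixel_xy, pos): rgb for pos, rgb in viewers}


# Two travelers at different spots in front of the same pixel:
viewers = [
    ((-2.0, 0.0, 5.0), (0, 255, 0)),   # one viewer sees green
    ((3.0, 0.0, 6.0), (255, 0, 0)),    # the other sees red
]
bins = colors_for_pixel((0.0, 0.0), viewers)
assert len(bins) == 2  # the two viewers fall in different direction bins
```

In this toy model, the tracking system only has to supply a three-dimensional location per viewer (echoing Ng’s point that the display is agnostic to how that location is obtained), and the display hardware does the rest by steering the right color into the right bin.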

The other demos I witnessed at Misapplied’s office involved less tracking of individuals and handling of their personal data. In the retail-store scenario, for instance, all that mattered was which product I was standing in front of. And in the captioning one, the display only needed to know what language to display for each seat, which involved audience members using a smartphone app to scan a QR code on their seat and then select a language.

Still, it’s reasonable to wonder about yet-to-come uses of parallel reality and any privacy pitfalls they might unleash. On Misapplied’s website, the list of potential applications includes advertising “targeted to each viewer’s needs, interests, behavior, and surroundings.” Even if you’re not an utter cynic, that might lead to thoughts of Minority Report and Blade Runner 2049—two movies that depict dystopian worlds with creepy ads in public spaces buttonholing individual passers-by.

Though Ng didn’t disclose any specifics about theoretical deployments beyond Delta’s installation when I met with him, he did say that Misapplied’s future “is about giving a better experience to every single person who is engaged with parallel reality. Privacy considerations are part of that. And we’re designing all of our experiences and the technology with that in mind.” He also stresses that what the company has created is a display technology, not a new means of monitoring people: “The system is agnostic to the tracking technology—all we need to know is a three-dimensional location that we want to send a piece of content to.”

For now, there should be plenty of learnings just from Delta’s deployment of parallel reality at one spot in a single air terminal. Personally, I’m curious how flawlessly it’ll work in the chaotic environment of a major airport—with lots of random people milling about—rather than the controlled environment of Misapplied headquarters. Ng expects to be surprised by how travelers respond to it, too. “We know we need to do that testing in a real-world environment, in order to draw the right conclusions,” he says. “Most of the time, when people see the demos [at Misapplied’s office], the reaction is to the technology itself as opposed to the experience.”

Delta, which is labeling its Detroit rollout as a “beta experience,” is eager to get going. “We want to understand how customers respond to the technology,” says Jones. “We want to get feedback on how it’s enhancing the experience. And then from there, together, we’re going to learn and evolve and scale this across the environment.” And the airline does say that it’s serious about making further use of parallel reality as the technology matures. “It’s almost limitless, the product map,” says COO Gil West. “Curb to gate, for things like wayfinding, boarding, upgrades, service recovery if your flight’s delayed or canceled.”

Whether this early exuberance pans out over the long haul remains to be seen. But as part of its CES hoopla, Delta is releasing a future-vision video that features a more fully evolved version of Misapplied’s display. (It also includes something that, at first blush, feels far more fanciful: a security-screening process you can complete by walking briskly through some stylish giant hoops, no lining up or patting down required.)

Future possibilities aside, once Delta’s beta experience is up and running, it should be worth at least a few minutes for anyone who travels through the airline’s Detroit terminal and wants to see something genuinely new. Ng, however, remains insistent that the ultimate goal isn’t to wow anybody. Like many a potential breakthrough before it, parallel reality will matter most if we start to take it for granted. “It’s not about people appreciating the technology,” he says. “It’s about people going through these venues and getting a seamless experience. And when we can get to that point, where the technology blends away, that means we made it. That means that we were successful.”