The first sighting of the dancing hot dog in the wild was in June. By the Fourth of July, it had made its way around the world, breakdancing at bars and barbecues, at weddings and bar mitzvahs. It turned otherwise banal videos of the grocery store into cinematic masterpieces starring the hot dog, surmounting the refrigerated Oscar Mayers like a pile of carnage.

Overall, the dancing hot dog—one of Snapchat's World Lenses, which superimpose digital 3-D objects over real-life surroundings—has sprung to life more than 2 billion times on the platform. And over time, it's taught millions of people to stitch together the digital world with the physical one.

The AR-first future is not quite here yet. But when it arrives, it will upend the way we see the world.

The companies behind the push into augmented reality have promised us headsets and glasses—camera-enabled tools to give us cooking tutorials, help us assemble our Ikea furniture, and tell us ancillary information about everything we see. But that future hasn't quite arrived. For now, the first glimpses of augmented reality exist mostly on our phones, in a handful of games and apps, and in places like Snapchat. And for millions of people outside of Silicon Valley who aren't waiting with bated breath for Magic Leap headsets, Snapchat is quietly teaching them how to love augmented reality.

Lens Weight

From the very beginning, Snapchat has considered itself a camera company. Not a messaging app. Not a space for exchanging bizarro selfies. Not even a social media platform. It has at times stretched the boundaries of what that means, building in a Discover tab for interactive news stories and adding in real-time mapping features. But fundamentally, Snapchat has always been camera-first: You open the app, and you're looking through the lens.

The platform's most iconic camera feature came in 2015, when Snapchat introduced Lenses: the selfie-enhancing tool that adds a digital overlay on your face. Using the front-facing camera in the app, Snapchat's facial mapping technology could register your face, render a 3-D model, and drape the digital image over it, transforming you into a dog, or a zombie, or just a better-looking version of yourself. Not everyone who uses Snapchat has bought into features like Maps, Discover, and certainly not Spectacles—but Lenses soon became synonymous with Snapchat itself. In many ways, those filters taught people how to selfie—look! you can Face Swap with Mount Rushmore!—as much as they gave people their first glimpse of augmented reality.

Then, in April, Snapchat launched something new: World Lenses. They worked just like the original Lenses, except that they used the rear-facing camera to transform everyday scenes into magical, gamelike experiences: You could animate your picnic with a floating rainbow, or watch a reindeer dance on your coffee table. Looking through the Snapchat camera, you could watch cartoonish 3-D objects coexist with the kids or the dog, like an episode of Blue's Clues.

Snapchat didn't invent the wheel here; developers have been building the mixed reality future for years. But for the most part, that future has looked at best mysterious, even dubious, to people outside of the tech space. Just look at the popular failure of Google Glass. "The idea that people had a computer on their face, which had a camera in it—there was this massive cultural social backlash against it," says John Hanke, the CEO of AR software company Niantic. "I think it was in part because people really didn't understand what augmented reality was going to do for them."

Last year, when Niantic introduced Pokémon Go, Hanke says he saw a major shift. The technology wasn't perfect, but because people could use their phones—not a clunky headset that cost hundreds of dollars, but the device that's always in your pocket—they warmed up to AR a lot more quickly. And because Go is a game, not an application that's meant to scoop up data and information with its camera, people were much more willing to try it.