Snapchat is an app known for its 10-second ephemeral videos, but Bobby Murphy, one of its original founders and CTO of parent company Snap, takes the long view. On stage at the Fast Company Innovation Festival, Murphy described thinking 10 to 20 years into Snap's future. With features such as Landmarkers, which uses your phone's rear-facing camera to apply lenses to world landmarks (you could cover New York's Flatiron Building with pizza, for instance), and an increased focus on practical augmented reality (think of using your camera to scan products for more information), Snapchat is looking to become the lens through which we see the world.

“If you think about the way that experiences are designed today, they’re in a 2D space. When you design an AR experience,” Murphy said, “you’re thinking in a way that removes the concept of the screen altogether.”

Snapchat has plenty of AR-based lenses today, but as Murphy describes it, the company aims to make Snapchat’s use of the technology much broader and more practical—beyond just fun selfie lenses. To do that, Snap needs to evolve what Murphy calls its “lens ecosystem.”

For those not immediately familiar with how Snapchat functions, a "lens" is the term the company uses for the filters that can be layered over live video to create an augmented experience. Until now, this has mostly been done with the front-facing camera, applying playful filters to selfies and videos of people's faces. Sure, you look cute now, but what if you had floppy dog ears? As most millennials and Gen-Zers will know, there's a lens for that.

But as the company looks to what’s next, the Snapchat lens-design teams and AR teams are closely collaborating to explore broader use cases and functionalities that go beyond creative expression, and to redesign the app itself to create space for those operations. One example is the introduction of “utility” lenses, which use Snapchat’s “Scan” functionality to offer useful features to users as they look at the world around them through their phones. For instance, a feature called “Photomath” can scan a math problem and offer a solution. Another lens introduced last year lets users scan physical objects to purchase them on Amazon. These lenses are designed as opportunities for AR to act as a personal assistant.

The app’s main lens carousel is currently geared toward play and personal photos, “but it’s not conducive to opening a utility lens,” says Murphy. He envisions positioning Scan as a home for functionality such as product search, Photomath, and AR-based experiences—an answer to a world where we compute in 3D. Meanwhile, the lens-explore interface will become a hub of creative lens types and experiences.

Building out this new hub for utility lenses won't be without its challenges. Take Snapchat's Landmarkers lens, which uses AR to recognize landmarks and lets users alter them onscreen. Creating a world-facing lens requires a data set that matches a person's actual viewpoint of a landmark, so photos pulled from, say, a Google search wouldn't be specific enough to build from. Instead, the Landmarkers feature has so far been built from thousands of public snaps. And unlike human faces, which computer vision can treat as broadly similar, the variety of architectural and structural forms a camera captures in the real world is endless.