By Lilly Smith

Snapchat is an app known for its 10-second ephemeral videos, but Bobby Murphy, one of its original founders and current CTO of the app's parent company, Snap, takes the long view. On stage at the Fast Company Innovation Festival, Murphy described thinking 10 to 20 years into Snap's future. With features such as Landmarkers, which uses your phone's rear-facing camera to apply lenses to world landmarks (you could cover New York's Flatiron Building with pizza, for instance), and an increased focus on practical augmented reality (think of using your camera to scan products for more information), Snapchat is looking to become the lens through which we see the world.

“If you think about the way that experiences are designed today, they’re in a 2D space. When you design an AR experience,” Murphy said, “you’re thinking in a way that removes the concept of the screen altogether.”

Snapchat has plenty of AR-based lenses today, but as Murphy describes it, the company aims to make Snapchat’s use of the technology much broader and more practical—beyond just fun selfie lenses. To do that, Snap needs to evolve what Murphy calls its “lens ecosystem.”

For those not immediately familiar with how Snapchat functions, a “lens” is the term the company uses for the filters that can be layered over a live video to create an augmented experience. Until now, this has mostly been done with the front-facing camera, applying playful filters to selfies and videos of people’s faces. Sure, you look cute now, but what if you had floppy dog ears? As most millennials and Gen Zers will know, there’s a lens for that.

But as the company looks to what’s next, the Snapchat lens-design teams and AR teams are closely collaborating to explore broader use cases and functionalities that go beyond creative expression, and to redesign the app itself to create space for those operations. One example is the introduction of “utility” lenses, which use Snapchat’s “Scan” functionality to offer useful features to users as they look at the world around them through their phones. For instance, a feature called “Photomath” can scan a math problem and offer a solution. Another lens introduced last year lets users scan physical objects to purchase them on Amazon. These lenses are designed as opportunities for AR to act as a personal assistant.

The app’s main lens carousel is currently geared toward play and personal photos, “but it’s not conducive to opening a utility lens,” says Murphy. He envisions positioning Scan as the home for functionality such as product search, Photomath, and other AR-based experiences: an answer to a world where we compute in 3D. Meanwhile, the lens-explore interface will become a hub for creative lens types and experiences.

Building out this new hub for utility lenses won’t be without its challenges. Take Snapchat’s Landmarkers lens, which uses AR to recognize landmarks and lets users alter them onscreen. Creating a world-facing lens requires a data set that matches a person’s real-world viewpoint of a landmark, so photos pulled from, say, a Google search wouldn’t be specific enough to build from. Instead, the Landmarkers feature has been built from thousands of public snaps so far. And unlike human faces, which a computer can easily read as similar to one another, the variety of architectural and structural forms a camera captures in the real world is endless.

But Murphy doesn’t seem to see those challenges as roadblocks; that’s part of the benefit of taking the long view. For instance, he notes that the Landmarkers lens has been around for some time, and while engagement with the feature was low because it was initially hard to find, conversion rates for the feature were high. “Even if something doesn’t immediately resonate, it could be foundational for future uses of AR. It all adds to the collective value.”

It will be important for Snapchat to communicate that new value to its 210 million users (and counting) if the company wants to continue its upward climb and realize its aspiration of becoming an integral part of how we interface with the world. Fast Company senior writer Mark Wilson asked Murphy whether Snap wants to become the OS of reality. His reply? “Maybe.”


Article originally appeared on fastcompany.com