While Snapchat is no stranger to location-based AR scavenger hunts, the app's new world-facing game adds some environmental understanding to the mix.

This week, Snapchat launched a new Snappables game dubbed Scavenger Hunt. However, while most Snappables games use the front-facing camera to insert users and their friends into the game, Scavenger Hunt flips gameplay to the rear camera.

The object of the game is for users to find eight real-world objects — a bottle, car, plate, cup, smartphone, TV, watch, and chair — within their surroundings using their smartphone's camera. As a reward for finding each object, the app personifies it with an animated AR smiley face and arms. Players who find all eight object types unlock a special "Heroic Hunter" selfie Lens.

Images by Tommy Palladino/Next Reality

The game works using real-time object recognition functionality that can identify everyday objects, such as cups, mobile devices, and automobiles. Once the target object is detected in the camera view, the computer vision model estimates the size and position of the object, which enables the app to anchor AR content onto the object.
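Snap hasn't published the implementation details behind this detect-then-anchor flow, but the general shape of the pipeline can be sketched roughly as follows. Everything here — the `Detection` type, the `anchor_for` helper, the target list, and the confidence threshold — is an illustrative assumption, not Snap's actual API.

```python
from dataclasses import dataclass

# Hypothetical detector output: a class label, a confidence score, and a
# normalized bounding box (x, y, width, height) in camera-view coordinates.
@dataclass
class Detection:
    label: str
    confidence: float
    box: tuple  # (x, y, w, h), each in [0, 1]

# The eight object types the game hunts for.
TARGETS = {"bottle", "car", "plate", "cup", "smartphone", "tv", "watch", "chair"}

def anchor_for(detection: Detection, min_confidence: float = 0.6):
    """Return an AR anchor (center point and scale) for a detected game object,
    or None if the detection isn't a target or is too uncertain."""
    if detection.label not in TARGETS or detection.confidence < min_confidence:
        return None
    x, y, w, h = detection.box
    center = (x + w / 2, y + h / 2)  # where to pin the smiley face
    scale = max(w, h)                # size the overlay to the object's extent
    return {"center": center, "scale": scale}

# Example: a cup detected in the middle-lower part of the frame.
print(anchor_for(Detection("cup", 0.9, (0.25, 0.5, 0.5, 0.25))))
```

In a real AR runtime the anchor would also be updated frame-to-frame so the overlay stays attached as the object or camera moves, which is presumably what lets the app track a car while it's in motion.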

In terms of real-world performance, the object recognition can be lightning-quick and highly accurate. During our testing, the app usually identified the target object as soon as it came into view. Perhaps most impressive was the car recognition, which first spotted a parked car behind some bushes and then tracked a moving car as it passed by!

Images by Tommy Palladino/Next Reality

In other cases, such as the cup and phone, it took a bit more work to find the right angle, while a smartwatch with a fairly traditional shape and watch face was never recognized as a watch. However, it was here that I discovered that a mere picture of a watch was enough to satisfy the objective.

Snap's computer vision engineering capabilities are at the root of its augmented reality technology, dating all the way back to the face recognition in its original Lenses tool for the front-facing camera.

(1) I encountered some difficulty in finding the phone while it was docked. (2) But once it was handheld, all was rad. Images by Tommy Palladino/Next Reality

In more recent years, though, Snap's computer vision ambitions have focused on environmental understanding, adding interactive capabilities to images and objects in the real world. In 2018, Snap introduced Visual Search, which allowed users to recognize objects and retrieve product information from Amazon's inventory. Last year, the company renamed the feature "Scan" as part of its redesigned AR Bar, and its image recognition and landmark recognition capabilities have been a hit with advertisers like Nike and HBO.

(1) This smartwatch apparently doesn't look like a watch. (2) This picture of a watch, however, scores the win! (3) I was rewarded with the title of Heroic Hunter. And I didn't even have to go outside. Images by Tommy Palladino/Next Reality

While its front-facing Lenses supply fun for the selfie era, the world-facing capabilities are what set the stage for Snap's eventual foray into smartglasses, which unofficially began with the 3D camera array in Spectacles 3.

With the ability to recognize images, landmarks, and object types under its belt, Snap now has a good foundation for software that can supply environmental understanding for smartglasses that, further down the road, can actually replace smartphones for everyday computing.