Certain stories in the Our Stories section are already using Crowd Surf, like the featured video of Lorde's performance at last weekend's Outside Lands. As advertised, a button in the bottom-right corner lets you flip between different users' perspectives while the audio keeps playing uninterrupted, and it works well. Obviously, coverage will be limited by how many users are snapping the same event (and at what quality), but at a big enough event, plenty of attendees are bound to pull out their phones and start snapping.

Snapchat built its own proprietary machine learning tech to automatically recognize the audio across user snaps and stitch them into Crowd Surf videos, according to Mashable. The feature will be available for select events; we've reached out to Snapchat to ask which ones will get the seamless-video treatment.

While it's unclear how widespread the feature will become, it's yet another dynamic addition to Snapchat's lineup, like custom Stories, that sets it apart from Facebook and Instagram. And the company clearly wants to keep competitors from copying its advances: Last month, it acquired a team that specializes in protecting code from reverse engineering.