Apple’s latest operating system may still be working its way through developer betas, but we’ve already seen plenty of the projects developers are building for the company’s ARKit augmented reality platform for iOS.

The set of developer tools handles the heavy lifting of motion tracking and scene mapping, freeing creators to focus their energies on what works best when mixing the physical and digital worlds.

For now, devs are just beginning to experiment with the new medium, and it’s clear there’s a lot more to come from Apple’s early foray into augmented reality. Click through some of the early highlights collected on the Twitter account @MadeWithARKit.

While a good deal of these resemble demos already shown on devices like Microsoft’s HoloLens, the promise that these capabilities will soon reach millions of iOS devices gives developers a much larger audience (and incentive) to begin experimenting. Add in the rumors that Apple’s next-generation iPhones will feature depth-sensing, AR-focused camera modules, and things get even more interesting for the platform.

It’s worth noting that most of these apps are still in the gimmicky “wow, how neat!” phase. It’s the same as when you downloaded the Zippo lighter app for your iPhone and showed it to all your friends. How the use cases evolve beyond these novelties will define where augmented reality goes on Apple’s mobile platforms. What is clear is that Apple has built a technically sophisticated system for phone-based SLAM (simultaneous localization and mapping), and that its ecosystem may have a much easier time courting developers than AR platforms from Snap, Facebook and Google.