Mobile AR

AR apps are super fun to play with, and they're quite fun to build too. You may have heard of ARKit and ARCore, the SDKs Apple and Google provide for creating AR experiences on iOS and Android devices. These SDKs are opening up new horizons for developers by exposing high-level APIs that make tasks like plane detection (horizontal and vertical), 6-DOF motion tracking, face tracking, and 3D rendering a lot more manageable. And since these SDKs are advancing rapidly alongside mobile computing power, both platforms will keep rolling out more advanced features for developers to integrate right away.

Many AR experiences can be enhanced by using known features of the user's environment to trigger the appearance of virtual content, instead of letting it float around the user. ARKit gained the ability to detect 2D images in iOS 11.3, and continuous image tracking in iOS 12. We provide these images as assets in our app and use them as references during tracking. When ARKit detects one of these images in the real world, we can use that event to anchor 3D (or 2D) content to the AR world.
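To make this concrete, here's a minimal sketch of what that setup looks like in code. It assumes an `ARSCNView` outlet and a reference-image group named "AR Resources" in the app's asset catalog (both names are placeholders you'd adapt to your own project), and it overlays a simple plane instead of real 3D content:

```swift
import ARKit
import SceneKit

class ImageTrackingViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // Load the reference images from the asset catalog group
        // (assumed here to be named "AR Resources").
        guard let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }

        // ARImageTrackingConfiguration (iOS 12+) tracks images continuously;
        // on iOS 11.3 you'd instead set `detectionImages` on an
        // ARWorldTrackingConfiguration.
        let configuration = ARImageTrackingConfiguration()
        configuration.trackingImages = referenceImages
        configuration.maximumNumberOfTrackedImages = 1
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds an anchor; for image anchors, this is the
    // "image detected" event where we attach our virtual content.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let size = imageAnchor.referenceImage.physicalSize

        // A translucent plane matching the image's physical size, standing in
        // for whatever 3D content you want to anchor.
        let plane = SCNPlane(width: size.width, height: size.height)
        plane.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.5)
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2  // SCNPlane is vertical by default
        node.addChildNode(planeNode)
    }
}
```

Note that ARKit needs the physical size of each reference image to estimate distance accurately, which you set per image in the asset catalog.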

Today, I'll be guiding you through creating your own AR app. We'll use ARKit's image tracking feature to look for a 2D image in the real world, and anchor 3D content to it. I'm using a postcard of an elephant, but feel free to use any 2D image you want. I encourage you to print the image out, but you could technically track the image on your screen as well.

Here's what your final result should look like (you'll be using your own reference image for tracking of course). This beautiful postcard is courtesy of Brad Wilson.