I coded a conceptual augmented reality app for iPhone called sneakAR (it’s really just “sneaker”). It’s the culmination of my newfound obsession with AR. I’ve been making iOS apps on the side for a couple of years now, but it was ARKit that really opened my eyes to the potential of augmented reality. I got the idea from an article in a technology magazine about practical uses for AR. The crux of the app’s concept is this: what if there were a way for users to visually inspect sneakers, and even interact with them (albeit simplistically) in AR, before making their purchase?

This is how the app works. ARKit first needs a brief moment to detect “feature points” in the environment. Once it’s ready, users tap a surface to place an AR sneaker on it. They can then resize the shoe and select other sneakers via the scrollbar, or cycle through a shoe’s colorways by tapping the shoe itself (provided it comes in other colors). With a virtual shoe planted on a surface, users can compare it with other shoes, see whether it works with their wardrobe, and so on. Once they’ve found a shoe they’d like to buy, tapping and holding the sneaker takes them to an online store for purchase.
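The tap-to-place step above boils down to a hit test against ARKit’s detected planes. Here’s a minimal sketch of that flow — the `sceneView` outlet and the preloaded `sneakerNode` are hypothetical names, not the app’s actual code:

```swift
import UIKit
import ARKit

class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!   // hypothetical outlet
    var sneakerNode: SCNNode!             // preloaded sneaker model (hypothetical)

    override func viewDidLoad() {
        super.viewDidLoad()
        // Detect horizontal surfaces so the sneaker has somewhere to land.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        sceneView.session.run(config)

        let tap = UITapGestureRecognizer(target: self, action: #selector(placeSneaker(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc func placeSneaker(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Hit-test against detected planes; the first result is the nearest surface.
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }
        let t = hit.worldTransform
        sneakerNode.position = SCNVector3(t.columns.3.x, t.columns.3.y, t.columns.3.z)
        sceneView.scene.rootNode.addChildNode(sneakerNode)
    }
}
```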

How to use sneakAR

This obviously wouldn’t work without realistic-looking assets. For the sneakers themselves, I either found 3D models online or used a technique called photogrammetry on my own collection. I avoided perfectly modelled shoes from asset stores and opted for as worn-in a look as possible. After lots of trial and error (the process is imprecise and finicky) I got some realistic-looking sneakers rendered in 3D. I fixed the sneakers’ meshes in Blender, cleaned up or made new textures in Photoshop, and imported the shoes into Xcode. I coded the app itself purely in Swift, but I would love to try remaking it in Unity in the future in order to make an Android counterpart.

Testing sneakAR beside the shoe I captured using photogrammetry. As the technology gets better, I believe the comparison will become indistinguishable.

Here’s sneakAR compared to the physical sneakers

The next part is a brief technical report so feel free to skip it.

-Textures are loaded from a Firebase server. Essentially, you can have an unlimited number of textures because you’re not limited by your phone’s storage, and colorways are automatically updated as they’re released. This is super useful for shoes that are constantly getting new colorways (e.g. Air Jordan 1s, Yeezy Boost 350s). Also, if I split textures by component (shoelaces, soles, etc.), there’s potential for users to customize their own footwear.
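Fetching a colorway from Firebase and swapping it onto the model is a short round trip through Firebase Storage and SceneKit. A sketch, assuming the Firebase iOS SDK is set up and the `textures/` storage path is hypothetical:

```swift
import UIKit
import SceneKit
import FirebaseStorage

// Download a colorway texture and apply it as the sneaker's diffuse map.
// The storage path and parameter names are illustrative assumptions.
func applyColorway(named colorway: String, to sneakerNode: SCNNode) {
    let ref = Storage.storage().reference(withPath: "textures/\(colorway).png")
    ref.getData(maxSize: 5 * 1024 * 1024) { data, error in
        guard let data = data, error == nil,
              let image = UIImage(data: data) else { return }
        // SceneKit material updates should happen on the main thread.
        DispatchQueue.main.async {
            sneakerNode.geometry?.firstMaterial?.diffuse.contents = image
        }
    }
}
```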

-Models, however, are stored within the app, which makes the app heavier in size. The ideal is to “stream” 3D models to the user’s phone the way Netflix streams content: seamlessly, without the user thinking about it. As an experiment I successfully had the app download temporarily hosted models from a server, but found that some models took too long to download (even with a visual indicator, it was a bad user experience). I have to find a way to reduce the app’s size without hindering UX — reducing model complexity, downloading models in the background, and so on.
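The download experiment described above can be sketched with `URLSession` plus SceneKit’s throwing scene initializer. The URL is a placeholder for the temporary hosting; nothing here is the app’s actual code:

```swift
import Foundation
import SceneKit

// Fetch a .scn model in the background, then hand back its root node.
func downloadModel(from url: URL, completion: @escaping (SCNNode?) -> Void) {
    URLSession.shared.downloadTask(with: url) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else {
            completion(nil)
            return
        }
        // Move the file to a path ending in .scn so SceneKit recognizes the format.
        let dest = FileManager.default.temporaryDirectory
            .appendingPathComponent(url.lastPathComponent)
        try? FileManager.default.removeItem(at: dest)
        do {
            try FileManager.default.moveItem(at: tempURL, to: dest)
            let scene = try SCNScene(url: dest, options: nil)
            completion(scene.rootNode)
        } catch {
            completion(nil)
        }
    }.resume()
}
```

Pairing this with a placeholder model shown immediately (while the real one downloads in the background) is one way to hide the latency mentioned above.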

-SceneKit, which ARKit uses for rendering, only accepts .dae (Collada) and .scn files for 3D models, so every model had to be converted before importing into Xcode. For the model-downloading experiment I had to convert them all to .scn files.
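One way to batch the .dae-to-.scn conversion is a small macOS script: SceneKit can load a Collada file and re-serialize it with `SCNScene.write(to:options:delegate:progressHandler:)`. A sketch with hypothetical file paths:

```swift
import Foundation
import SceneKit

// One-off conversion (run on macOS during asset prep):
// load the Collada file, then re-serialize it as .scn.
let daeURL = URL(fileURLWithPath: "sneaker.dae")   // hypothetical input
let scnURL = URL(fileURLWithPath: "sneaker.scn")   // hypothetical output

if let scene = try? SCNScene(url: daeURL, options: nil) {
    let ok = scene.write(to: scnURL, options: nil, delegate: nil, progressHandler: nil)
    print(ok ? "Converted \(daeURL.lastPathComponent)" : "Export failed")
}
```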

-I might incorporate a way to rotate, transform and even flip shoes with gestures. Currently there’s no way I know of in ARKit to mirror 3D models programmatically, and even then the ruse would instantly be over when users see a backwards logo (the alternative is to scan the other shoe, but doing that for every pair would double the app’s size).
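The rotation part of that gesture idea is straightforward: map a one-finger horizontal drag to rotation about the shoe’s vertical axis. A sketch with hypothetical names:

```swift
import UIKit
import SceneKit

// Rotate a placed sneaker node with a pan gesture.
// The class and property names are illustrative, not the app's real code.
class SneakerRotator {
    let node: SCNNode
    private var startAngle: Float = 0

    init(node: SCNNode) { self.node = node }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        if gesture.state == .began {
            startAngle = node.eulerAngles.y
        }
        // One point of horizontal drag = one degree of rotation.
        let dx = Float(gesture.translation(in: gesture.view).x)
        node.eulerAngles.y = startAngle + dx * .pi / 180
    }
}
```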

-I’m a big analytics fan. It could be useful gathering demographic information like most purchased size, most popular colorway/model, etc. for shoe brands. The extra element of interactivity with this kind of app certainly lends itself to gathering useful data.

-In the future there would be a dedicated API handling everything; the app itself would essentially just be a vessel for holding user preferences (e.g. foot size, favorite colors) and a UI for interacting with a server — no shoe models or textures stored in the app to make it as lightweight as possible.

-Next step is to train a neural network to segment feet so that users can “wear” the sneakers!

That’s all for my very first Medium story! If you have any cool ideas (AR or not) or just want to talk shop, you can find me on Twitter or Instagram @rtql8d (“articulated” 😉) or email me at jan.b.crisologo@gmail.com.