Hey guys, it’s me, António again. 👋 Last time we interacted I was writing about how we used an external sensor to turn iPhones into breathing rate monitors for Brythm. This time however, I’m going to tell you all about the amazing world of Augmented Reality (or AR for short).

This sudden thematic shift didn’t come out of nowhere but rather from Apple’s WWDC 2018 event, where they showcased many projects they are working on, how far they’ve come, and how there is still so much to improve upon. Among the many sessions, one touched on their latest AR endeavor: ARKit 2. This in turn piqued my interest in the subject and, since I had some free time on my hands and wanted a new pet project, well… here we are 😅 so without further ado, let’s dive right in.

Definition of AR

What is Augmented Reality after all? Plenty of people mistake it for Virtual Reality but I hope that by the end of this post you’ll be able to spot their differences as clearly as night 🌙 🌃 and day ☀️ 🏙.

As you might have noticed, your physical body and mind exist in a mix of space and time which you perceive as Reality, and nowadays you can even “exist” inside a computer-generated mix of space and time known as Virtual Reality. However, in between these two extremes there can exist other types of reality where “normal” Reality and Virtual Reality intertwine to varying degrees, one of which is Augmented Reality.

Types of reality. No, really…

As we can see, Augmented Reality as a concept implies Reality with a bit of Virtual Reality added for extra flavour, but not so much as to overwhelm it. To put it another way, the emphasis should be on bringing virtual elements into the physical world rather than the other way around. This all seems fine as a concept, but it brings us to the following question (or any less esoteric version of it):

What properties and rules should an Augmented Reality experience abide by?

As it turns out, there are three key components to this:

1) it must combine virtual and real information, with the real world as the primary place of action;

2) it must be interactive with real-time updates;

3) it must have virtual information registered in 3D space, in the physical environment.

In essence, while moving in a physical environment, the user’s actions (movement/interaction) have direct consequences on the rendered virtual elements, and all of this happens in real-time. If you pay close attention, these rules do not prescribe a specific output device or interaction medium, meaning that, in theory, AR is not limited to displays and visual stimuli. 🤔 However, audio, haptic, olfactory and gustatory AR are more difficult to achieve.