Background

Recently, Perficient Digital Labs has been busy working with emerging technologies like augmented reality. As part of those efforts, we began experimenting with the new ARKit features available in iOS 11 and 12. We ended up with a loose proof of concept that identified Chicago bus signs using ARKit’s image recognition feature and loaded a mock interface displaying the associated bus information.

Proof of concept: detecting a bus sign with ARKit’s image recognition
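The detection flow described above can be sketched with ARKit’s image recognition API. This is a minimal illustration, not our production code: it assumes the sign photos live in an asset catalog reference-image group named "BusSigns" (a hypothetical name), and it simply logs a detection where a real app would present the bus information UI.

```swift
import ARKit

class SignDetectionViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Load the reference images of the bus signs we want to recognize.
        // "BusSigns" is an assumed asset catalog group name for this sketch.
        guard let signImages = ARReferenceImage.referenceImages(
            inGroupNamed: "BusSigns", bundle: nil) else { return }

        // World tracking with image detection (available since ARKit 1.5 / iOS 11.3).
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = signImages
        sceneView.session.run(configuration)
    }

    // Called when ARKit anchors a recognized reference image in the scene.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        // The anchor tells us which sign was matched; a real app would load
        // the associated bus information interface at this position.
        print("Detected sign: \(imageAnchor.referenceImage.name ?? "unknown")")
    }
}
```

Because each reference image carries a name, the same delegate callback can route different signs to different bus stop data.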

The proof of concept was strong, and the technology worked extremely well, even in poor conditions. We could have stopped there, but the use case of providing real-time, contextual information for public signage was too compelling to abandon. We knew we could provide real value to users by interfacing with the CTA API. We needed to take it further…

Let's make an AR bus stop app

The concept was there, but what would a full-blown AR-powered bus tracking app look like? What would the user flow be? What traditional navigation features could be improved by augmented reality? Which features are crucial to a first release? Thankfully, we had an entire team of designers backing us up on this one.

Starting from the ground up, our design team iterated over a variety of features, flows, and designs. Focused on the Chicago CTA bus system, the new CTA-AR app would offer the features found in traditional bus tracking applications while weaving in the new AR flow and motion-based improvements.