New Augmented Reality App Makes the Largest Swiss Train Station Fit for the Future

A week ago the Swiss Federal Railways released an Augmented Reality App that gives us a preview of how we will navigate train stations in the future. The best part? That future starts now.

The first thing I noticed when I arrived at the Zürich Hauptbahnhof this morning was how cold it felt. It felt like a freezing minus 10 degrees, which might or might not be a slight exaggeration on my side.

I pulled out my Android phone and installed the SBB AR app from the Play Store. Since I was invited to a workbench talk with the devs of this Augmented Reality App, I wanted to get some hands-on experience first. If you happen to be in Zürich and have a recent Android model with ARCore, you can try out this app for yourself. I want to point out that the app is still an early preview version and, as Product Owner Aline Dietrich would later tell me, just the very first step on a journey that will take several years.

First Impressions

My first impression of this new AR experience was: nothing. Interactions in Augmented Reality happen on physical objects, and there was no such interaction point in my proximity. So I was left with a screen showing my camera feed and a message to scan my surroundings. I did so by walking around with my phone held up, and after a few steps my screen turned black with a note telling me that "for safety reasons AR only works when standing still". While this may sound like a terrible user experience, it goes to show the difficulties the team is facing with this new technology. In a busy train station, with thousands of other travelers to bump into, stairs to tumble down, benches to trip over and, worst of all, platform edges to fall off and get yourself killed, safety is crucial.

A couple of meters later I found my first interactable object, pointed my phone towards it and got live public transport data displayed right on my screen. That. Is. Impressive. Just think of how many systems have to work in tune to make this possible!

How it works

Tracking (and navigating) in the open using GPS and other methods has come a long way and is pretty standard today. But tracking inside a building is another beast. There can be little to no GPS reception, crippling the accuracy of the phone's built-in tracking to more than 20 meters in any direction. And then there are four levels in this particular station, meaning the tracking has to work vertically too. In short, relying on the phone's tracking to pinpoint exactly on which platform or in front of which display a user stands is currently impossible. Beacons would be a possible technical solution, but installing beacons in every single train station in Switzerland is not feasible yet.
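To make that 20-meter uncertainty concrete, here is a tiny sketch of why coarse positioning can't single out a platform. Everything here (the layout, spacing, and numbers) is a made-up illustration, not real station data:

```python
def candidate_platforms(est_x, accuracy_m, platform_centers, platform_width_m=10.0):
    """Return every platform whose area overlaps the position uncertainty.

    est_x: estimated position along the hall in meters (hypothetical 1D model).
    All numbers are illustrative assumptions, not measurements from Zürich HB.
    """
    return sorted(p for p, x in platform_centers.items()
                  if abs(x - est_x) <= accuracy_m + platform_width_m / 2)

# Hypothetical layout: platforms 3-18, spaced 10 m apart across the hall.
centers = {n: n * 10.0 for n in range(3, 19)}

# With a +/- 20 m position estimate, several platforms remain plausible:
print(candidate_platforms(est_x=80.0, accuracy_m=20.0, platform_centers=centers))
# → [6, 7, 8, 9, 10]
```

Five candidate platforms from one position fix is exactly the ambiguity the paragraph above describes, and why the team had to look beyond device tracking.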

Instead, the SBB team used machine learning to train an image recognition algorithm on important objects that occur in a train station, for example the timetable displays. This solution is way more robust and could potentially be rolled out everywhere with reasonable effort (although they currently stick to their test market in Zürich). From the recognized image they then extract the platform number. With that number and the approximate position of the phone, they can fetch the live public transport data for the exact platform at that particular train station (or tram or bus stop). And finally that data gets displayed in Augmented Reality on my device, using who knows how many sensors to place that virtual pane below a physical object.
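The steps above (recognize a display, extract the platform number, then look up live data for that platform) can be sketched roughly like this. The text format, the regex, and the lookup function are my own assumptions for illustration; the app's actual pipeline is not public:

```python
import re
from typing import Optional

def extract_platform(recognized_text: str) -> Optional[str]:
    """Pull a platform number out of text recognized on a timetable display.

    The 'Gleis <n>' / 'Platform <n>' format is an assumption for illustration.
    """
    match = re.search(r"(?:Gleis|Platform)\s*(\d+)", recognized_text)
    return match.group(1) if match else None

def departures_for(station: str, platform: str, timetable: dict) -> list:
    """Stand-in for a call to a live public transport API."""
    return [d for d in timetable.get(station, []) if d["platform"] == platform]

# Hypothetical recognition result and timetable data:
recognized = "Gleis 7   IC 1   Genève-Aéroport"
timetable = {"Zürich HB": [
    {"platform": "7", "train": "IC 1", "departure": "09:02"},
    {"platform": "8", "train": "S3",   "departure": "09:05"},
]}

platform = extract_platform(recognized)                 # → "7"
print(departures_for("Zürich HB", platform, timetable))
```

The interesting design point is that the recognized object, not the phone's position, carries the precise piece of information (the platform number); the coarse position only needs to be good enough to identify the station.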

I’m pretty up-to-date on these technologies so I knew what to expect, but it still amazes me over and over again to see how far Augmented Reality has come in recent years. And my anticipation rises every time I can play around with an actual new use case knowing that I can be part of the first generation of UX designers that are able to discover this beautiful, overwhelming, challenging new dimension.

UX Chances and Challenges

Speaking of UX challenges, the team had to face several. Some are very basic: today, we are used to interacting with our screens by tapping on artificial objects. Mixed Reality, on the other hand, wants to steer users towards interacting with actual objects in the real world. This is a change in user behavior that will take a while, but will eventually take place. In these early days, on-boarding new users is a challenge.

Another issue they faced is a common one on the phone: screen real estate is pretty small, and too much data is overwhelming. Especially with the amount of genuinely relevant data they could have displayed, it took several iterations to find a level of detail that was concise enough to be usable while still maintaining a useful information density.

It’s not easy building an augmented reality app these days. There are many technical challenges to solve, and not many UX patterns have emerged yet. But that is what makes it so fun! Being a UX designer myself, I’ve been in the same place with our HoloLens mixed reality apps. Discovering all these nitty-gritty details, ditching ideas and finding new ones is one of the most fun and interesting things I’ve ever worked on.

And tomorrow?

While the team didn’t provide us with a detailed roadmap for the future (which I asked for but didn’t expect to get 🕵️), they made clear that pure AR use cases aren’t their primary goal. What they are aiming for is to integrate AR naturally into use cases where it makes sense and enhances the user experience.

Two examples that came to my mind are:

For someone with a mild to medium visual impairment (they can still recognize their surroundings but can’t read the screens), it would be tremendously valuable to point their phone at a screen and have a screen reader read out its contents.

Or if I’ve entered my desired train route in the default app and have to change trains at a station I don’t know, I could switch from 2D to AR to get quick visual guidance to where my connecting train is.

They told me with a smile that they have a couple of interesting ideas in their backlog. Today the limiting factors are both the available technology and the underlying platforms and services needed to build those ideas. But it’s easy to see how advances in technology will fundamentally change both our interactions and the products we’ll see. Look at AR on our phones today: the augmentation still seems a little detached from reality.

Detached Augmented / Reality

But let’s assume we finally get good-looking and functional glasses by mid-July 2020 (just an utterly wild guess, but if it turns out I was right, remember where you read it first! :] ).

These two worlds suddenly start to fuse together and you can see how limiting the phone’s display really is.