Building Context-Aware Notifications

Why we think location—as well as motion detection—matters when it comes to sending great news notifications, and how we used Apple’s CoreMotion framework to bring them to life.

Image by Dave 77459 used under CC BY

How relevant or useful a news notification is to someone depends a lot on the content — are you interested in this political outcome, game result or breaking local news? Its usefulness also depends on the action it lets you take — can you read an article, share an infographic, watch a live stream or set a reminder? But beyond the content and utility of an alert, we thought another factor was being overlooked in making a news notification relevant or useful: the user’s current location.

Why ‘current location’ matters to notifications

One way to think about how someone’s location matters when receiving an alert is to consider these two examples: You’re driving and you get a notification about playing the next level of Fruit Ninja, or you’re walking in the park on a Sunday afternoon and you get a notification about a cheap gas station nearby. These notifications probably aren’t relevant or useful to the person receiving them. We wondered if a well-timed, location-aware news notification could be a better experience, and that’s why we built HERE for Local Journalism, an app that sends local news notifications as people approach the place a story has been written about.
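The post doesn’t detail how the triggering works under the hood, but the core check is simple: fire when the user’s position comes within some radius of the place a story was written about. Below is a framework-free sketch of that proximity test in Swift. On iOS this would more likely be delegated to Core Location’s region monitoring, and the 500-meter radius, the `Story` type and the function names are our assumptions for illustration, not the app’s confirmed behavior.

```swift
import Foundation

// A story tagged with the place it was written about.
struct Story {
    let headline: String
    let latitude: Double   // degrees
    let longitude: Double  // degrees
}

// Great-circle distance in meters between two coordinates (haversine formula).
func distanceMeters(lat1: Double, lon1: Double,
                    lat2: Double, lon2: Double) -> Double {
    let r = 6_371_000.0 // mean Earth radius in meters
    let dLat = (lat2 - lat1) * .pi / 180
    let dLon = (lon2 - lon1) * .pi / 180
    let a = sin(dLat / 2) * sin(dLat / 2)
          + cos(lat1 * .pi / 180) * cos(lat2 * .pi / 180)
          * sin(dLon / 2) * sin(dLon / 2)
    return 2 * r * atan2(sqrt(a), sqrt(1 - a))
}

// Should this story alert fire from the user's current position?
// The 500 m default radius is an assumed value.
func shouldNotify(userLat: Double, userLon: Double, story: Story,
                  radiusMeters: Double = 500) -> Bool {
    distanceMeters(lat1: userLat, lon1: userLon,
                   lat2: story.latitude, lon2: story.longitude) <= radiusMeters
}
```

In a production app you would register each story’s coordinate as a monitored region rather than polling, so the system wakes the app only when a boundary is crossed.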

Why current location alone wasn’t enough

However, after building our first prototype, we realized that triggering notifications on location data alone could make for a bad experience. For example, one of our internal beta testers reported receiving “maybe ten” notifications while in a cab going up a city street in Philadelphia. She couldn’t read the stories about the places she was passing by, and it wasn’t the ideal experience.

Keying in on that feedback, our UX designer Faye Teng helped us think through other situations someone might be in when they received a location-aware alert: How would the app work if they’re biking or driving? What happens if people are in a rush? What does the app do if someone is on public transportation, or in a city versus a suburb?

These questions forced us to think about whether people should receive alerts while driving, but we still weren’t sure what to do. We considered sending the alerts silently, or not sending them at all, to reduce distraction and increase relevance. Either way, we knew our application would need to be able to tell if someone was moving so it could react accordingly.

Was there a way to solve this problem?

Apple’s CoreMotion framework gives developers access to the data generated by a device’s various sensors: accelerometer, gyroscope, pedometer, magnetometer, and barometer. This is the data we would need to predict a user’s movement, so we started researching how to implement the framework. What we didn’t know at the time was exactly how much information the framework would give us about someone’s activity, or how tricky it would be to implement it in a way that made sense.

Luckily, with the introduction of the energy-efficient M7 motion coprocessor and the CMMotionActivityManager class in 2013, Apple made it considerably easier to predict what an app user is currently doing. Here are the categories provided by CMMotionActivityManager:

Walking
Automotive
Stationary
Cycling
Unknown
Running

Each prediction also comes with a confidence rating of low, medium or high, and the framework can sometimes report a user in two states at once, e.g. “Automotive, high confidence” alongside “Stationary, low confidence”.
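On device, CMMotionActivityManager delivers CMMotionActivity objects whose boolean flags and confidence level would drive this kind of gating. Since those types exist only on Apple platforms, the sketch below models the decision with plain Swift enums instead. The thresholds, and the choice to silence (rather than drop) automotive alerts while suppressing cycling ones, are our illustration of the trade-off discussed above, not the app’s confirmed behavior.

```swift
// Plain-Swift stand-ins for CMMotionActivity's boolean flags and
// CMMotionActivityConfidence, so the gating logic can be tested anywhere.
enum Activity { case walking, running, automotive, cycling, stationary, unknown }
enum Confidence { case low, medium, high }

enum NotificationMode { case normal, silent, suppressed }

// Decide how (or whether) to deliver a location-triggered story alert.
func deliveryMode(for activity: Activity, confidence: Confidence) -> NotificationMode {
    // Treat low-confidence predictions as "don't know" and deliver normally.
    guard confidence != .low else { return .normal }
    switch activity {
    case .automotive: return .silent      // reading is unsafe; queue quietly
    case .cycling:    return .suppressed  // no safe way to interact at all
    case .walking, .running, .stationary, .unknown:
        return .normal
    }
}
```

In the real app this function would be fed from `CMMotionActivityManager.startActivityUpdates(to:withHandler:)`, mapping each incoming `CMMotionActivity` onto these enums before the notification is scheduled.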