Damian Mehers has his morning routine down. Most days, the senior software engineer for Evernote gets up, opens the app, searches for the list of items his children need for the day (books, gym clothes, lunch, for example), and then sends the kiddos on their way. It’s a fairly streamlined process, though if you ask Mehers, it could stand some improvement. “It would be nice if Evernote just popped that information up for me in an unobtrusive way,” he says.

What Mehers means is, it would be nice if Evernote knew enough about him and his habits to anticipate what he wanted before he had to ask for it. In this scenario, instead of waiting for him to search out the list himself, the app would notice he's at home, take stock of the time, and make an educated guess about which note he might like to see.

Mehers' vision isn't far from being a reality. As Evernote moves into the wearables market, the app's developers are placing more importance on gleaning meaning from its millions of data points to serve up useful, contextualized information. The company just released its app for Android Wear, and its big goal is to build a handy personal assistant that relieves you of mundane tasks like rifling through your pile of digital notes for information.

To do this, though, the app has to learn much more about who we are, what we like, and what we do. As Mehers puts it, and as WIRED has said before: with wearables, context is key.


Beyond the Phone

For now, Evernote for Android Wear is mostly an extension of the phone's capabilities. The app downloads directly to your smartwatch and pulls in data from your phone. It can handle simple tasks like taking notes, searching for notes, and displaying checklists. There's a handy feature that pushes notes recently viewed on your phone to your smartwatch: if you were looking at a shopping list on your phone, within five minutes that same note would pop up on your watch (the team is working to get that transition time down to 30 seconds). "There's a good chance that note will be relevant," explains Mehers.

The app is also able to add context to recurring meetings. When you have a calendar event scheduled in 10 minutes, Evernote will search your notes to see if you took or modified a note the last time that event occurred. If so, you'll get a little reminder of what you took down. Similarly, the app will recall whether you've taken a note at a specific location. So say you're visiting Paris and happen to walk past the bakery you jotted down during your trip research: Evernote will send you a notification.
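Both triggers boil down to simple matching rules: compare an upcoming calendar event against notes edited during its last occurrence, and compare the user's current coordinates against notes with saved locations. Here is a minimal sketch of that idea in Python — all names, the distance threshold, and the `Note` fields are illustrative assumptions, not Evernote's actual code:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt
from typing import Optional, Tuple

@dataclass
class Note:
    title: str
    event_id: Optional[str] = None                   # recurring calendar event it was taken during
    location: Optional[Tuple[float, float]] = None   # (lat, lon) where it was written

def haversine_km(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def notes_for_event(notes, upcoming_event_id):
    """Recall notes taken the last time this recurring event occurred."""
    return [n for n in notes if n.event_id == upcoming_event_id]

def notes_near(notes, here, radius_km=0.2):
    """Recall notes jotted down within radius_km of the user's location."""
    return [n for n in notes
            if n.location is not None and haversine_km(n.location, here) <= radius_km]

# Example: a bakery noted during trip research, plus a recurring-meeting note.
notes = [
    Note("Bakery to try", location=(48.8566, 2.3522)),   # central Paris
    Note("Weekly standup agenda", event_id="standup"),
]
```

Walking within a couple hundred meters of the saved bakery location would surface that note, and an upcoming "standup" event would surface the agenda taken at its last occurrence.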

These are thoughtful, if simple, functionalities that have the express purpose of picking up the slack where our human minds fail. "It's OK not to remember everything," says Zeesha Currimbhoy, vice president of Evernote's augmented intelligence team. "We want to say, let us do the work for you, let us figure out what you need to know, when you need to know it, and surface that info for you at that time."


Living in Fear of Clippy

Evernote seems particularly well positioned to take advantage of what wearables have to offer. Like Google, the app is a repository of information. You store events, meetings and random thoughts on everything from the sandwich shop you just visited to your brilliant business idea.

Even with all that data, it's a pretty big leap for an app to make assumptions about what's relevant to each of its 100 million users. "I live in terror of us becoming Clippy," says Mehers, referring to Microsoft Office's eager, off-base personal assistant. Knowing what's important and when it's important is harder than you might think, particularly because Evernote doesn't treat its data as one sweeping set of information. "We could very well say, let's take all the data and put it into one space, but it's more of a privacy concern," says Currimbhoy. "We think of it in terms of 100 million much smaller data problems."

Photo: Ariel Zambelich/WIRED

It's Currimbhoy's job to figure out how to make sense of such disparate sets of data. She leads a team of eight engineers and designers who are constantly working to find algorithms that work both for a user with 10 notes and for one with a thousand. "From a technology standpoint, figuring out semantics in data and balancing that with context is definitely where we think we're going to be heading," she says. "That's what's really going to change things for wearables."

For now, Evernote, like many apps new to smartwatches, is in learning mode. Every action a user takes is a chance for the app to improve itself. So say a notification about an upcoming meeting is pushed to your watch. Uninterested, you swipe it away. Next week at the same time, it pops up again, and you swipe it away once more. "Evernote should be able to say, hey, this user is not interested, I won't show it to them again," says Currimbhoy. Getting to that point will take some trial and error. There might be a lot of swiping before apps finally land on the right algorithmic elixir to propel smartwatches beyond early-adopter playthings. As Currimbhoy puts it: "Being able to take feedback and scale your experience accordingly is very important—we want the user to feel like, 'Ok, this is an app that truly understands me.'"
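The feedback loop Currimbhoy describes can be sketched as a simple counter: track how often a user dismisses a particular recurring reminder, and stop showing it once dismissals cross a threshold. A minimal illustration in Python — the two-dismissal cutoff and all names are assumptions for the sketch, not Evernote's actual logic:

```python
from collections import defaultdict

class NotificationFilter:
    """Suppress a recurring reminder after the user keeps dismissing it.

    The default two-dismissal threshold is an illustrative guess.
    """
    def __init__(self, max_dismissals: int = 2):
        self.max_dismissals = max_dismissals
        self.dismissals = defaultdict(int)   # reminder id -> times swiped away

    def record_dismissal(self, reminder_id: str) -> None:
        self.dismissals[reminder_id] += 1

    def should_show(self, reminder_id: str) -> bool:
        return self.dismissals[reminder_id] < self.max_dismissals

# The scenario above: the meeting reminder is swiped away two weeks in a row,
# after which the app stops pushing it.
f = NotificationFilter()
f.record_dismissal("weekly-meeting")
f.record_dismissal("weekly-meeting")
```

A real system would weigh many more signals than raw dismissal counts, but the principle — each interaction updates a per-user model of what to surface — is the same.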