It’s fair to say that when Amazon introduced the first Echo speaker in the fall of 2014, most people weren’t quite sure what to make of it. In the intervening years, Echo and the broader universe of Alexa-powered devices have transitioned from curiosity to ubiquity. But while you can find Alexa in just about everything—including, yes, a microwave—the real progress Amazon’s voice assistant made in 2018 came less from breadth than from depth.

That’s not to say it hasn’t made gains of scale. Amazon’s voice assistant has doubled the number of countries where it’s available, for starters, learning how to speak French and Spanish along the way. More than 28,000 smart home devices work with Alexa now, six times as many as at the beginning of the year. And more than 100 distinct products have Alexa built in. If you’re looking for some sort of tipping point, consider that, as of last month, you can buy an Alexa-compatible Big Mouth Billy Bass.

It’s how Alexa evolves under the hood, though, that has defined this year—and how it will continue to inch toward its full potential in those to come. Alexa has gotten smarter, in ways so subtle you might not yet have even noticed.

Machine Head

Because many voice assistant improvements aim to reduce friction, they’re almost invisible by design. Over the past year, Alexa has learned how to carry over context from one query to the next, and to register follow-up questions without having to repeat the wake word. You can ask Alexa to do more than one thing in the same request, and summon a skill—Alexa’s version of apps—without having to know its exact name.
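The context carryover described above can be pictured with a toy sketch. This is not Amazon's implementation — it's a minimal, assumed model in which a follow-up query inherits whatever slots it doesn't restate:

```python
# Minimal sketch of contextual slot carryover (illustrative, not Alexa's code).

def resolve(query, context):
    """Fill in missing slots in a follow-up query from prior context."""
    resolved = dict(context)   # start from what the assistant already knows
    resolved.update(query)     # the new query overrides or adds slots
    return resolved

# First request: "What's the weather in Seattle tomorrow?"
context = resolve({"intent": "weather", "city": "Seattle", "day": "tomorrow"}, {})

# Follow-up: "How about Portland?" -- only the city is spoken;
# the intent and day carry over from context.
followup = resolve({"city": "Portland"}, context)
print(followup)  # {'intent': 'weather', 'city': 'Portland', 'day': 'tomorrow'}
```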

Those may sound like small tweaks, but cumulatively they represent major progress toward a more conversational voice assistant, one that solves problems rather than introducing new frustrations. You can talk to Alexa in a far more natural way than you could a year ago, with a reasonable expectation that it will understand what you’re saying.

Those gains have come, unsurprisingly, through the continued introduction and refinement of machine learning techniques. So-called active learning, in which the system identifies areas in which it needs help from a human expert, has helped substantially cut down on Alexa’s error rates. “That’s fed into every part of our pipeline, including speech recognition and natural language understanding,” says Rohit Prasad, vice president and chief scientist of Amazon Alexa. “That makes all of our machine learning models look better.”
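One common form of active learning is uncertainty sampling, sketched below under the assumption that it resembles what Prasad describes: the system flags the utterances its classifier is least sure about, and only those go to a human annotator. The confidence scores here are stand-ins for real model outputs.

```python
# Hedged sketch of active learning via uncertainty sampling (illustrative).

def select_for_annotation(utterances, confidences, threshold=0.6):
    """Return utterances whose top intent confidence falls below threshold."""
    return [u for u, c in zip(utterances, confidences) if c < threshold]

utterances  = ["play jazz", "turn on the, uh, thing", "set a timer"]
confidences = [0.97, 0.41, 0.92]   # hypothetical classifier confidences

needs_label = select_for_annotation(utterances, confidences)
print(needs_label)  # ['turn on the, uh, thing']
```

Human labels for just those low-confidence cases then feed back into training, which is how the approach cuts error rates without annotating everything.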

More recently, Amazon introduced what’s known as transfer learning to Alexa. Prasad gives the example of trying to build a recipe skill from scratch—which anyone can do, thanks to Amazon’s recently introduced skills “blueprints.” Developers could potentially harness everything Alexa knows about restaurants, say, or grocery items to help cut down on the grunt work they’d otherwise face. “Essentially, with deep learning we’re able to model a large number of domains and transfer that learning to a new domain or skill,” Prasad says.
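The idea Prasad describes can be sketched in miniature: a new skill reuses representations learned on existing domains and only trains a thin task-specific layer of its own. Everything here — the encoder, the vectors, the class names — is a hypothetical stand-in, not Amazon's architecture:

```python
# Toy sketch of transfer learning (all names and numbers are illustrative).

class PretrainedEncoder:
    """Stands in for a large model trained on existing domains."""
    VOCAB = {"pasta": [1.0, 0.0], "order": [0.0, 1.0]}

    def encode(self, word):
        # Known words reuse learned features; unknown ones get a neutral vector.
        return self.VOCAB.get(word, [0.5, 0.5])

class RecipeSkill:
    """New skill: reuses the frozen encoder, trains only its own small head."""
    def __init__(self, encoder):
        self.encoder = encoder    # transferred, not retrained
        self.head = [0.8, -0.2]   # the only task-specific weights

    def score(self, word):
        vec = self.encoder.encode(word)
        return sum(w * x for w, x in zip(self.head, vec))

skill = RecipeSkill(PretrainedEncoder())
print(skill.score("pasta"))  # 0.8
```

Because the encoder's knowledge comes for free, the new skill needs far less data and training of its own — which is the grunt work the transfer is meant to save.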

The benefits of these machine learning improvements manifest across all aspects of Alexa, but the simplest argument for their impact is that the system has seen a 25 percent reduction in its error rate over the last year. That’s a significant number of headaches Echo owners no longer have to deal with.

And more advances are incoming. Just this month, Alexa launched self-learning, which lets the system automatically make corrections based on context clues. Prasad again provides an example: Say you ask your Echo to “play XM Chill,” and the request fails because Alexa doesn’t catalog the station that way. If you follow up by saying “play Sirius channel 53” and continue listening, Alexa will learn that XM Chill and Sirius channel 53 are the same, all on its own. “That’s a big deal for AI systems,” says Prasad. “This is where it’s learning from implicit feedback.”
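Prasad's XM Chill example can be reduced to a simple sketch: a failed request, followed by a successful one the user stays with, becomes a learned alias. This is a deliberately simplified assumption about the mechanism, not Amazon's actual system:

```python
# Simplified sketch of learning a query rewrite from implicit feedback.

class RewriteLearner:
    def __init__(self):
        self.aliases = {}   # failed phrase -> phrase that worked

    def observe(self, failed, succeeded, kept_listening):
        # Only treat the pair as a correction if the user stayed engaged --
        # that continued listening is the implicit feedback signal.
        if kept_listening:
            self.aliases[failed] = succeeded

    def rewrite(self, query):
        return self.aliases.get(query, query)

learner = RewriteLearner()
learner.observe("play XM Chill", "play Sirius channel 53", kept_listening=True)
print(learner.rewrite("play XM Chill"))  # play Sirius channel 53
```

No human ever labels the pair; the correction is inferred entirely from the user's behavior, which is what makes it self-learning.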

The next frontier, though, gets a little trickier. Amazon wants Alexa to get smarter, obviously, and better at anticipating your needs at any given time. But it also wants Alexa to understand not just what you’re saying, but how you say it.