A number of news outlets have published stories in recent days expressing privacy concerns about Amazon's Alexa device.

But Alexa isn't really eavesdropping on users, argues VoiceBrew's Katherine Prescott.

Here is what is really going on.

Earlier this week, Bloomberg published an article “Amazon Workers Are Listening to What You Tell Alexa.” A pile-on from other news outlets followed. “Thousands of Amazon employees hear what you say to Alexa” and “Amazon workers eavesdrop on your talks with Alexa” are two of the more sensational titles that emerged.

But what’s missing is the other side of the argument: why this isn’t actually such a big deal, and why it isn’t particularly surprising that Amazon uses human input to help improve Alexa.

Here are the key points I believe need to be part of the discussion:

#1. Alexa is always listening for her wake word, but she is not always recording what you are saying

Many people skimming through these articles (or just reading the titles) may be left with the impression that Alexa is constantly recording or randomly eavesdropping on them. This is simply not true.

First off, Alexa only records after hearing the wake word “Alexa” (unless you changed it to a different wake word). In case you’re wondering why Alexa needs to record your requests at all, it’s because Alexa sends those recordings to Amazon’s cloud for processing in order to give you a response.

Second, you’ll know when Alexa is recording because the blue ring on your Echo will light up. Alexa was designed to show you when she is recording.

And if you inadvertently wake Alexa up (this happens all the time) and don’t notice the blue ring, Alexa will probably chime in with some irrelevant piece of information anyway. So at least you’ll know she was recording and processing what you were just saying. (And you can always easily delete your Alexa history of voice recordings.)

#2. It makes sense to use humans to help train virtual assistants to better understand humans

To quote Ben Thompson in this morning’s Stratechery, “How else can a speech-to-text algorithm, which is at the core of Alexa’s service, improve if not by leveraging entities that have organic highly developed speech-to-text capabilities, that is to say, humans?”

That’s why Google and Apple also use human helpers to train their assistants, Google Assistant and Siri.

Workers listen to recordings and annotate them as part of Amazon’s efforts to help Alexa better understand people’s requests. There is nothing nefarious here.

#3. Many of these articles say that Amazon workers are listening to what you tell Alexa, but Amazon workers annotating recordings don’t know who said them

Importantly, Amazon workers don’t need to know who made a recording in order to do their jobs -- which involve checking whether Alexa’s interpretation of a recording lines up with what the person actually said. It doesn’t matter who said it.

These Amazon workers don’t even have access to information that would allow them to match a recording to a specific person. This key piece of information from Amazon gets buried midway through the Bloomberg article (emphasis my own):

“We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system,” an Amazon spokesperson said. “Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”

#4. People do not need to worry they might say something sensitive or embarrassing in their home and that an Amazon worker will match that statement back to them

This is the scenario that people worry about, but it’s extremely unlikely. That’s because the conversations that Amazon workers transcribe and annotate cannot be matched back to individual users by those workers.

Here are all the things that would have to happen in order for this scenario to actually play out:

1. Alexa mistakenly picks up something you said but didn’t intend her to hear (which means you probably said something that sounded like “Alexa” and woke her up)

2. The statement that was mistakenly recorded was sensitive or embarrassing

3. The recording ends up as one of the very few selected to be annotated by Amazon workers (according to Amazon, only “an extremely small sample of Alexa voice recordings” are annotated)

4. The Amazon worker annotating the recording decides to spend time trying to figure out whose voice it is (even this seems like a pretty unlikely thing to happen)

And even then, the worker does not have access to that information, so it’s a dead end.

From my standpoint as an Alexa user, there is basically zero risk that I might say something private that is captured by Alexa and then matched back to me.

#5. If you want, you can easily update your privacy settings to make sure that no Amazon worker ever hears your voice recordings to help improve Alexa

It will take you less than 30 seconds, and here’s how to do it:

1. Open the Alexa app
2. Tap the hamburger (three horizontal lines) icon in the upper left corner
3. Tap Settings
4. Tap Alexa Account
5. Tap Alexa Privacy
6. Tap Manage How Your Data Improves Alexa
7. Set the toggles to the off position

You can even delete your Alexa history of voice recordings permanently in a few easy steps -- even going all the way back to when you first got Alexa.

Here’s a guide to the 5 Alexa privacy tips that all Alexa users should know. Most people don’t realize how easy Amazon makes it to take full control over the level of privacy you want as an Alexa user.

Katherine Prescott is the Founder of VoiceBrew, a free weekly email that helps you get the most out of Alexa.