The contents of the report aren't entirely surprising. Back in January, it was revealed that Amazon-owned Ring gave a large number of employees access to users' video feeds so they could manually identify people and vehicles. The data those workers gather is used to improve the system's ability to identify cars and visitors on its own. Like Facebook, which outsources its traumatizing moderation tasks to other countries, Amazon has people transcribing audio in Costa Rica, India and Romania. The project also has workers based in Boston, however.

According to the workers Bloomberg talked to, they sometimes hear recordings that contain sensitive information or that were clearly captured in error. Two of those workers from Romania said they had to listen to what could've been a sexual assault. They were apparently told that they couldn't do anything about it, because it's not Amazon's job to interfere.

Amazon has admitted to the publication that it's employing human workers to annotate Alexa voice recordings. A spokesperson defended the company's practices, however, telling Bloomberg that the e-commerce giant only listens to "an extremely small sample" and that its employees do not have access to identifying information:

"We take the security and privacy of our customers' personal information seriously. We only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone. We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it."

Update 04/12/19 7:33PM ET: Amazon has clarified to us that the recordings it reviews were captured only after the wake word was detected. A spokesperson said: