Smart devices have changed the face of privacy forever. Saying you believed someone was listening to you at all times used to be cause for concern; now that everyone has a phone in their pocket, a computer on the desk, and an Amazon Echo or Google Home in the living room, it’s simply the truth.

Although there are safeguards against your private conversations getting into the hands of unwanted parties, you can’t say for sure that the words you speak out loud are truly private. And there’s more proof of that every day.

This week, Dutch publication VRT NWS shared a report detailing Google’s practice of hiring independent contractors to listen to and transcribe Google Assistant audio recordings.

The scale of this story is just as interesting as the practice itself. VRT NWS says employees of one of Google’s subcontractors were given more than 1,000 clips recorded by Google Assistant to transcribe. At least 153 of those clips were “conversations that should never have been recorded,” because the wake phrase “OK Google” was never spoken.

Although Google disconnects each audio clip from any identifiable information, VRT NWS found that on more than one occasion it was possible to piece together the source of a clip using context clues scattered throughout the recording.

As for why Google is sending audio recordings to subcontractors, the employees explain that their transcriptions are used to improve Google Assistant’s ability to understand and respond to commands.

The transcribers are said to write down “every cough and audible comma,” as well as attempt to identify whether a man, a woman, or a child is the one speaking. Google uses these transcriptions to make Google Assistant smarter.

Google is (somewhat ironically) less than pleased with this breach of privacy and shared the following statement with VRT NWS, expressing its displeasure with the leak and explaining its methods:

We work with language experts around the world to improve speech technology by making transcripts from a small number of audio clips. This work is crucial for the development of technology that makes products such as the Google Assistant possible. Language experts judge only about 0.2 percent of all audio clips that are not linked to personally identifiable information. We have recently learned that one of these language experts may have violated our data security policy by leaking Dutch-language audio clips. We are actively investigating this and when we find a breach of our policy, we will take action quickly, up to and including the termination of our agreement with the partner.

The fact that this story could be written at all speaks volumes about how easily our data can end up in the wrong hands.

If the audio recorded by Google was secure and protected, a Dutch news site wouldn’t have gained access to a treasure trove of what many would consider extremely sensitive data. Unfortunately, until these companies change their practices, it’s going to be the reality we live in.