Amazon accidentally sent 1,700 recordings of someone speaking to Alexa to the wrong person, according to a German magazine.

The magazine said the recordings contained a wealth of personal information and that it was easily able to identify the person whose data was leaked.

The episode underscores that Amazon stores audio files when you speak to Alexa.

Imagine you have Amazon Alexa-enabled speakers all over your house. Perhaps you have one Echo in your living room and an Amazon Fire Stick connected to your TV. Maybe you talk to Alexa to set alarms, control your smart home, and play music in the shower.

Then one day, over 1,700 recordings of you speaking to Alexa are sent to a random person — and you don't even know about it until a magazine gets in touch.

That's exactly what happened to one person in Germany, according to a report. The magazine c't reported on Thursday that a man requested his personal data from Amazon and was shocked to discover 1,700 audio files of someone he didn't know talking to Alexa.

These files even included audio recordings of the person in the shower, according to the report.

He provided those recordings to the German magazine, which was able to get in touch with the person who owned the house full of Amazon Alexa devices. The magazine said the audio files, in German, revealed a lot of information about that person, including where he lived, his first and last name, who his partner was, and his taste in music.

It turns out that Amazon had not contacted him about the data breach, but as his story was about to become public, Amazon gave him new Echo devices and a Prime membership, according to the report.

The story underscores that Amazon does record and store your voice when you speak to Alexa. You can check what you've said to Alexa at Amazon.com/alexaprivacy and delete portions or the entirety of the stored audio files.

Amazon says it needs to store these recordings to improve its voice-recognition systems, but people who frequently speak to their smart speaker should think twice before telling Alexa any secrets.

"This was an unfortunate case of human error and an isolated incident," an Amazon representative told Business Insider. "We have resolved the issue with the two customers involved and have taken steps to further improve our processes. We were also in touch on a precautionary basis with the relevant regulatory authorities."

Amazon also confirmed that it had apologized to the two customers and had been in touch with regulatory authorities, including the data-protection regulators responsible for enforcing the European Union's General Data Protection Regulation.