
One minute, you're relaxing at home discussing the merits of hardwood floors. The next, a contact is calling you and telling you you're being hacked.

That's how a family in Portland, Oregon, is describing their experience with Amazon's Alexa after the popular voice assistant reportedly sent audio of a private conversation to one of their contacts -- all without them knowing it.

The family had filled their home with Alexa devices to control their lights, HVAC and security system using voice commands. Then, they say, one of the father's work contacts called to let them know he'd received an Alexa call broadcasting audio of a private conversation about flooring -- a call the family says it never asked Alexa to make.

Now, they tell a Seattle news station that their Alexa devices are left permanently unplugged.


"I felt invaded," said Danielle, who didn't want her last name used. "A total privacy invasion. Immediately I said, 'I'm never plugging that device in again, because I can't trust it.'"

Danielle says that the family got in touch with Amazon's engineering department, which confirmed that the audio had indeed been broadcast unintentionally.

"They said, 'Our engineers went through your logs, and they saw exactly what you told us, they saw exactly what you said happened, and we're sorry,'" Danielle told KIRO 7.

"He apologized like 15 times in a matter of 30 minutes and he said, 'We really appreciate you bringing this to our attention, this is something we need to fix!'"

Danielle says that Amazon was unable to pinpoint exactly what had caused the unintentional broadcast, but the company now tells CNET that Alexa mistakenly heard the wake word, then mistakenly heard a command to call someone, complete with a third misheard confirmation.

"Echo woke up due to a word in background conversation sounding like 'Alexa,'" a spokesperson tells CNET. "Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud, 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right.'

"As unlikely as this string of events is, we are evaluating options to make this case even less likely."
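Amazon's account describes four recognition errors that all had to occur in sequence: a false wake word, a misheard "send message" request, a misheard contact name and a misheard "right." As a back-of-the-envelope illustration of why a chain like that is rare, the probabilities of independent misfires multiply. The rates below are made up for the sketch; Amazon has not published real figures.

```python
# Hypothetical false-accept rates for each stage of the misheard chain.
# These numbers are illustrative assumptions, not Amazon data.
wake_word = 0.01      # background speech falsely detected as "Alexa"
send_message = 0.01   # conversation heard as a "send message" request
contact_match = 0.05  # speech matched to a name in the contact list
confirmation = 0.10   # background speech heard as "right"

# If the stages misfire independently, the chain probability is the product.
chain = wake_word * send_message * contact_match * confirmation
print(f"Chance of the full chain per wake event: {chain:.1e}")
```

Under these assumed rates, the full sequence happens roughly five times in ten million wake events -- rare, but not impossible across tens of millions of devices listening around the clock.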


By Amazon's own admission, that's an awful lot of misheard commands, which paints this as something of an elaborate "Alexa butt-dial." Whether that's enough to dial down concerns about the privacy risks of always-listening devices is another question -- more details on the effort to make Alexa better at recognizing and dismissing false positives would certainly be a good start.

Editor's note, 2:15 p.m. PT: Updated to include comment from Amazon.

Correction, 1:40 p.m. PT: This story initially misstated the state where the family lives. They live in Oregon.
