Eavesdropping is a top consumer concern with smart speakers, according to a PCMag survey, and for good reason: time and again, it turns out that voice-activated digital assistants like Alexa, Siri, and Google Assistant are not only recording users' conversations but letting other people listen.

The latest instance involves Apple's Siri. Following a report in The Guardian last week that Apple hired contractors to listen to recorded Siri conversations for accuracy and quality, the tech giant has temporarily suspended its Siri grading program. Apple also said users will be able to opt out of the program going forward.

Using human reviewers to help improve voice assistant software isn't a new practice; Amazon has thousands of employees reviewing voice recordings to "improve customer experience," and Google does the same. The problem is, by their very nature, these workers invade user privacy. The report in The Guardian states that Apple contractors regularly overheard confidential or sensitive conversations, including users' medical information, drug deals taking place, and people having sex.

In a statement to The Verge, an Apple spokesperson said, "We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."

The anonymous Apple contractor who blew the whistle on the practice told The Guardian that these breaches of customer privacy often stem from accidental activations, in which Siri starts recording without the user having intentionally said the "Hey Siri" wake word. The grading program, conducted by outside contracting firms whose workers are bound by confidentiality agreements, is designed to reduce these accidental triggers.

The whistleblower said these recordings are linked not only to app data, but contact details and location as well. In a statement responding to the allegations, Apple confirmed that a "small portion" of Siri requests are reviewed, but said these recordings are not tied to Apple IDs.

While Apple said it will pause the program, it did not specify for how long, nor whether it would change its storage practices. The company currently says it stores recordings on its servers with personal data attached for six months, and may hold anonymized recordings for more than two years.

Apple touts itself as the privacy-focused tech company, but its pledge to finally offer an opt-out in the wake of the report shows how far behind the device maker is on this front. Amazon and Google already offer privacy opt-outs for customers who don't want the companies listening to their smart speaker queries.

Siri isn't the only voice assistant under privacy scrutiny this week. In Germany, the Hamburg Data Protection Authority ordered Google to stop human review of Google Assistant recordings under GDPR regulation following a data leak last month. Google has responded that manual voice review is critical to the development of the technology.
