Apple is globally suspending the use of Siri voice recordings after reports emerged that contractors hired by the company listen to them to improve the voice assistant.

TechCrunch, which broke the news, said Apple plans to issue a software update to seek users’ consent before letting them participate in the product improvement program.

“We are committed to delivering a great Siri experience while protecting user privacy,” the company told the outlet. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

In a similar development, Google said it will voluntarily stop listening to and transcribing Google Assistant recordings for three months in the EU, according to German data regulators.

“The use of automatic speech assistants from providers such as Google, Apple and Amazon is proving to be highly risky for the privacy of those affected,” the German data protection authority said. “This applies not only to people who run a speech assistant, but to all those who come into contact with it, for example if they live in a household in which devices such as Google Assistant are installed and used.”

Last week, The Guardian reported that Siri voice snippets containing “medical information, drug deals, and recordings of couples having sex” were being heard by contractors working for the company around the world.

The objective, as in the case of Amazon and Google, is to listen to the recordings and grade the assistant’s responses, including checking whether Siri was invoked by mistake and whether its response was appropriate. The snippets themselves are not associated with your Apple ID.

Although Apple said less than one percent of daily requests are analyzed to improve Siri and dictation, the contractor told The Guardian “they were motivated to go public about their job because of their fears that such information could be misused.”

The Cupertino-based tech giant currently provides no way to know which of your Siri recordings may have been saved for review by employees. This is something users should have explicit control over, but instead it’s buried deep inside ambiguous, lengthy, and obtuse legalese.

Apple’s own terms of service state that pseudonymized Siri requests may be used for quality control, but stop short of explicitly stating that the work is undertaken by humans.

What’s more, the company’s response that it’ll provide an opt-out for the grading process effectively means your voice snippets — albeit anonymized — could still wind up on its servers to improve Siri.

Complicating the matter further is the lack of comprehensive user controls over the kinds of data Apple collects. Unlike Google, Amazon, Microsoft, and Facebook — each of which has its own privacy and activity logs for reviewing and managing your information — Apple doesn’t offer a one-stop shop for your privacy needs.

On the other hand, the iPhone maker is far from the only company to employ human oversight of its voice assistants. Audio recording requests made to Amazon Alexa and Google Assistant are also reviewed in a similar fashion.

In mid-July, Google acknowledged that voice snippets from the Assistant, leaked to the Belgian news outlet VRT News, revealed sensitive information such as medical conditions and customer addresses.

Collecting voice recordings to improve speech recognition algorithms is an industry-wide practice: the technology is still very much evolving, and a human element, however creepy, is essential to train the software to understand people better and to develop new features, such as product recommendations based on users’ interactions.

But seeking informed consent from users goes a long way towards addressing privacy concerns, not to mention ensuring compliance with the EU’s GDPR requirements.

The suspensions come amid increased regulatory scrutiny of big tech’s business models, forcing them to be more transparent about their data collection, processing, and sharing practices.

If anything, the recent controversies surrounding voice assistants merely scratch the surface of how a world full of always-on microphones can be a privacy nightmare.

Apple has long positioned itself as a paragon of privacy in hopes of differentiating itself from its data-hungry rivals. But the company needs to send a clear signal about its privacy guarantees, and instill accountability and trust as it increasingly shifts to services for revenue and growth.

Update on Aug 3, 2019 8:00 AM IST: After Apple and Google, Amazon is now following suit by giving Alexa users an explicit opt-out setting — called “Help Improve Amazon Services and Develop New Features” — that makes it clear “your voice recordings may be used to develop new features and manually reviewed to help improve our services.” The change was first reported by Bloomberg.
