In the latest twist to a data-security scandal, Google has sparked concerns by outsourcing the review of recorded exchanges between users and its digital assistant, a practice the company says helps it gather data on various accents and dialects for more effective speech recognition. EU regulators, however, argue that many of the recordings were made unbeknownst to those affected.

German authorities have forced Google to temporarily suspend its practice of manually reviewing the audio recordings used to test Google Assistant, according to a statement from the Hamburg Commissioner for Data Protection and Freedom of Information, which has opened an investigation into the tech giant’s handling of those files.

The cessation of manual reviews, which applies across the European Union and arguably worldwide (though a Google spokesperson has not confirmed the latter), comes as a preventive measure to protect citizens in the event that the three-month investigation finds Google to have violated the General Data Protection Regulation, commonly known as the GDPR:

"The use of automatic speech assistants from providers such as Google, Apple and Amazon is proving to be highly risky for the privacy of those affected," Thursday's report from the Hamburg Commissioner read. "[The ban] is intended to provisionally protect the rights of privacy of data subjects for the time being."

Google halts Assistant recording transcription in the EU for at least three months https://t.co/Z0Y7X3Yy6d — Android Police (@AndroidPolice) August 1, 2019

In a statement to The Verge, Google confirmed that it has already halted what it refers to as “language reviews” conducted by outsourced staff, and that it is working with German authorities on policies that would help customers understand how their data may be used.

In a bid to defend its review of audio materials, Google asserted that the practice helps make voice recognition systems more inclusive of various accents and dialects, while assuring users that it does not associate audio clips with specific user accounts during the process and “only perform[s] reviews from around 0.2% of all clips”.

Earlier this month, Google sparked a storm of criticism after a Belgian news outlet reported that thousands of personal recordings, many captured without customers’ awareness, had been made by Google Assistant running on Google Home devices and smartphones.

A Google spokesperson told Business Insider at the time that these instances of "false accepts" — the Assistant switching on without a distinct user command like "Hey Google" — were rare, saying that the company has multiple protections in place to prevent them. The spokesperson, however, would not comment on the high number of "false accepts" cited in the media report.

The leaked recordings were provided by a Google employee who exposed the company’s handling of customer files that are now protected by the GDPR, which effectively replaced the EU’s 1995 Data Protection Directive.

The GDPR came into force in May 2018, stipulating heavy fines for companies (up to 4 percent of their annual global turnover) that fail to adequately safeguard customers’ data.