With the GDPR’s deadline now almost upon us, one of the most talked-about provisions has been the ‘Right to Erasure’ contained within Article 17.

Significantly expanding the ‘Right to be Forgotten’ doctrine established in the Google Spain case, Article 17 allows data subjects (i.e. you and me) to submit takedown requests to any organisation that collects and controls information on them.

There are a number of grounds on which people may seek to have data deleted, covering a broad variety of circumstances. These include situations where the data is no longer necessary for the purposes for which it was collected; where it was unlawfully processed; and where the subject withdraws their consent, among others. The right is not unlimited: exceptions apply where the collection and processing of the data is necessary for the exercise of the right to freedom of expression; where there is a specific legal obligation to retain the information; for reasons of public interest; and so on.

Issues with Article 17

Despite some initial reservations, the GDPR (and Article 17 in particular) has generally been lauded as a victory for European citizens, who will gain far more control over what information companies hold on them than they have ever previously had. This is especially true given the Regulation’s arguably extra-territorial applicability, under which any organisation that handles European data will be expected to comply.

However, there are a few specific issues arising from the construction of Article 17 that bear some further scrutiny. Rather than analyse the philosophical criticisms of the Right to Erasure, below I briefly look at some of the practical considerations that data controllers will need to address when they receive such a Request for Erasure:

1. Verification of the data subject;
2. Abuse, and a lack of formal requirements for removal requests; and
3. Article 85: freedom of expression.

Verification of the Data Subject

Before giving effect to an Article 17 request, the controller must use all ‘reasonable measures’ to verify the identity of the requesting party. It is perhaps obvious that an organisation should not delete the accounts or other data of somebody without first checking that the person making the request is authorised to do so. However, this leaves open a number of questions about what this kind of verification will look like. In other words, what steps will be considered ‘reasonable’ under the terms of the law? Will courts begin to see arguments over online platforms’ account recovery procedures, on the basis that a denial of access amounts to a denial of the fundamental right to privacy under the GDPR? What identifiers will a data subject be able, or expected, to provide in order to discover their associated data? For example, while it might be easy to request information relating to your e-mail address, what about other identifiers such as IP addresses or names? These are questions that do not have clear answers, and they will inevitably lead to an uneven application of the law, dependent on the situation.

Abuse, and a Lack of Formal Procedural Requirements for Erasure Requests

It should be self-evident at this stage that any statutory removal mechanism will be open to abuse by parties determined to have content removed from the Internet, and in that regard Article 17 is no different. However, there is a common misconception that the Right to Erasure gives people the right to stop any mention of them online, especially speech that is critical of them or that they disagree with. This is not the case: Article 17 is not crafted as a dispute resolution mechanism for defamation claims (that would be the E-Commerce Directive). These facts don’t stop people from citing the GDPR incorrectly, though, and content removal demands can quickly become difficult to deal with as a result.

The problem is compounded by the fact that there are no formal procedural requirements for an Article 17 request to be valid, unlike the notice and takedown procedure of the DMCA, or even the ECD. Requests do not have to mention the GDPR, or even the Right to Erasure specifically, and perhaps even more surprisingly, they do not have to be made in writing: verbal requests are acceptable.

While the lack of specific notice requirements is clearly intended to give the maximum amount of protection to data subjects (the absence of a writing requirement was apparently designed to allow people to easily ask call centres to remove their data over the phone), it seems to ignore the problems that accompany such an approach. The general public’s lack of clarity around what exactly the Right to Erasure includes, along with the lack of procedural checks and balances, means that it will be increasingly difficult for organisations to identify and give effect to legitimate notices. This is especially true for online platforms that already receive a high volume of reports. While many of these are nonsense or spam, they will require far greater scrutiny in order to ensure that they aren’t actually badly worded Article 17 requests that might lead to liability.

If we look at the statistics on other notice and takedown processes, such as the DMCA’s (see the WordPress.com transparency report, for example), we can see that the levels of incomplete or abusive notices received are high. The implementation of even basic formal requirements would provide some minimum level of quality control over the requests, and would allow organisations to efficiently categorise and give effect to legitimate Article 17 requests, rather than facing the prospect of having to consider every report received through the lens of the GDPR.
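To make this concrete, a controller’s intake triage under such hypothetical formal requirements might look something like the sketch below. To be clear, the GDPR prescribes no such form: the field names and the enumerated grounds are my own illustrative assumptions, loosely keyed to Article 17(1), showing how even minimal required fields would let an organisation separate actionable requests from noise.

```python
from dataclasses import dataclass

# Illustrative only: a hypothetical minimal set of grounds a structured
# intake form could ask the requester to select, loosely keyed to Art. 17(1).
VALID_GROUNDS = {
    "no_longer_necessary",   # cf. Art. 17(1)(a)
    "consent_withdrawn",     # cf. Art. 17(1)(b)
    "unlawful_processing",   # cf. Art. 17(1)(d)
}

@dataclass
class ErasureRequest:
    requester_contact: str   # how the controller can respond and verify identity
    identifiers: list        # e.g. account e-mail address, username
    ground: str              # the claimed basis for erasure

def is_actionable(req: ErasureRequest) -> bool:
    """Minimum quality control: does the request contain enough
    information for verification and assessment to even begin?"""
    return (bool(req.requester_contact)
            and bool(req.identifiers)
            and req.ground in VALID_GROUNDS)
```

A request citing a recognised ground with contact details and identifiers would pass this filter; a bare demand to “remove everything about me” over a defamation grievance would not, and could be routed to the appropriate (non-Article-17) channel instead.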

Article 85: Freedom of expression

As mentioned earlier, a controller is not obliged to remove data where its continued retention is ‘necessary for reasons of freedom of expression and information’. The obvious question then becomes how this exception should be interpreted, and we find some guidance in Article 85 of the GDPR. Unfortunately, however, it doesn’t say all that much:

‘Member States shall by law reconcile the right to the protection of personal data pursuant to this Regulation with the right to freedom of expression and information, including processing for journalistic purposes and the purposes of academic, artistic or literary expression.’

This appears to leave the task of striking that balance to individual Member States. Whilst this isn’t unusual in European legislation, it means that the standard will vary depending on where the organisation is based and/or where the data subject resides. At the time of writing, it isn’t clear how different Member States will address this reconciliation. Despite freedom of expression’s status as a fundamental right in European law, the GDPR affords it scant consideration, and thus weak protection, preferring to defer to national law. That simply isn’t good enough; far stronger statements and guarantees should have been provided.

Over-Compliance

Unfortunately, the amount of extra work required to analyse and deal with these requests as a result of the law’s construction, along with the high financial penalties detailed in Article 83, means that many organisations will likely simply resort to removing data, even where there is no lawful basis for the request or requirement for them to do so.

We may fairly confidently speculate that many data controllers will respond by taking a conservative approach to the GDPR’s requirements, and will thus be less likely to push back on potentially dubious requests. Insistent complainants may find that they are able to have speech silenced without any legitimate legal basis, simply out of fear or misunderstanding on the part of third-party organisations.

With a well publicised and generally misunderstood right to removal, lack of procedural requirements, and a reliance on intermediaries to protect our rights to freedom of expression, we may find ourselves with more control over our own data, but with far less control over how we impart and receive information online.

—

Header image by ‘portal gda’ on Flickr. Used under a CC BY-NC-SA 2.0 licence.