











The recent ruling in favor of a businessman who wanted Google to delist search results about his past crime is just the latest skirmish between two different value systems on privacy.

The UK’s ruling rests on the concept that, after an offender has served his punishment and shown true remorse, a conviction should become ‘spent’ after a certain span of time: the offender becomes a regular citizen who, for all intents and purposes, was never convicted at all. This makes it easier to find and maintain lawful employment and removes an incentive to return to crime.

Google’s argument is based on public interest: the ability to look up information about past behavior is pertinent to future employers and relationships. Both have valid points. It is a rights-versus-rights argument: privacy vs free speech, freedom from harassment vs the right of others to know.

The Rulings

The recent lawsuit filed against Google actually involved two businessmen who, after Google denied their petitions, sued to have search results about previous legal cases delisted. The first had been convicted 10 years earlier of “conspiring to intercept communications” and had served six months in jail; he pleaded guilty to the charge. Judge Mark Warby ruled that because this individual had reformed and the crime was less serious, the information was no longer relevant.

The other businessman had been convicted of a fraud involving money or information, for which he had served four years in prison. Judge Warby ruled in favor of Google: the conviction remains relevant information, and search results that link his name to his crime should remain listed.

The Court observed that:

… it may be misleading to label the right asserted by these claimants as the “right to be forgotten”. They are not asking to “be forgotten”. The first aspect of their claims asserts a right not to be remembered inaccurately. Otherwise, they are asking for accurate information about them to be “forgotten” in the narrow sense of being removed from the search results returned by an ISE in response to a search on the claimant’s name. No doubt a successful claim against Google would be applied to and by other ISEs. But it does not follow that the information at issue would have to be removed from the public record … And a successful delisting request or order in respect of a specified URL will not prevent Google returning search results containing that URL; it only means that the URL must not be returned in response to a search on the claimant’s name.
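In other words, delisting is name-scoped: the URL is suppressed only for queries containing the claimant’s name, and remains discoverable through any other search. A toy sketch of that behavior (hypothetical function and data, not Google’s actual system) might look like:

```python
# Toy model of name-scoped delisting (hypothetical; not Google's real system).
# A delisting order pairs a claimant's name with a URL. The URL is suppressed
# only when the query contains that name, and stays visible for other queries.

delistings = {("john doe", "https://example.com/old-case")}  # hypothetical data

def search(query, results):
    """Filter raw results according to name-scoped delisting orders."""
    q = query.lower()
    blocked = {url for (name, url) in delistings if name in q}
    return [url for url in results if url not in blocked]

results = ["https://example.com/old-case", "https://example.com/news"]
print(search("john doe conviction", results))  # old-case URL is suppressed
print(search("example old case", results))     # same URL is still returned
```

This illustrates the Court’s point: the page is not removed from the public record, only hidden from one particular path to it.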

“We are pleased that the Court recognised our efforts in this area, and we will respect the judgements they have made in this case,” was Google’s response to the rulings.

While it can be argued that this particular case had a neutral result, with one petition granted and one denied, it is most significant for the precedent it sets. The same ruling against Google could be applied against other search engines, and it will set the tone for future petitions.

What is The Right to Be Forgotten?

In May 2014, the Court of Justice of the European Union established a RTBF (Right to Be Forgotten). It allows Europeans to request that search engines delist links present in search results containing an individual’s name, if the individual’s right to privacy outweighs public interest in those results. The delisted information must be “inaccurate, inadequate, irrelevant or no longer relevant, or excessive in relation to those purposes and in the light of the time that has elapsed.” The ruling requires that search engine operators conduct this balancing test and arrive at a verdict.

–Three Years of the Right to be Forgotten white paper by Google

The particular precedent for this case was the May 2014 EU Court judgement in Google Spain v Agencia Española de Protección de Datos (AEPD), which decided that individuals do have the right to request that search engines remove links to webpages when the individual’s name is used as the search term. Google does not have a journalistic exemption under the General Data Protection Regulation, whose Article 21 (the Right to Object) provides:

The data subject shall have the right to object, on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her which is based on point (e) or (f) of Article 6(1), including profiling based on those provisions. The controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims.

The Right to Object is meant to be universal across the EU, as it defines protections against the use of personal data for marketing and profiling. Contrary to expectations, most petitions are not requests to erase mentions of previous crimes: references to crime and professional misconduct comprise only 6.1% and 5.6% of requests respectively (1/21/2016 – 04/26/2018 data).

By comparison, requests to remove non-sensitive personal and professional information comprise 5.5% and 18.5% respectively. Another 4.5% on the Google transparency report appear to concern the removal of sensitive personal information, and a further 7.6% are requests to remove self-authored works.

As can be seen on the link, 88% of all requests come from private individuals. Delisting is not just about reputation management.

While removing personal histories might feel disingenuous, one has to remember that it has been less than two decades since social media exploded into the global consciousness. Prior to widespread mainstream use of the Internet, people had a general expectation of privacy. People have the right not to be exposed to global scrutiny without their consent. People have a right to feel safe from being tracked or harassed by strangers. People have a right to object to false use of their identities. People have a right not to let the stupid things they said or did while teenagers, or otherwise impaired, haunt them later in their lives. People have a right not to have their personal information used for marketing purposes.

To sum it up, people have a ‘right to be forgotten’: a right not to have their personal information collected, remembered, and made use of by entities without their awareness or permission.

Individuals cannot force governments or journalistic outlets to erase their content, but because Google is a data controller, they can petition it to make that content harder to find. For legal and investigative purposes, a background search can still be accomplished with other services designed for such activities.

The extremely broad net that Google and other search engines cast over the visible Web means that they are the primary tool for enabling this right. The requester must state reasons why the information should be delisted. The process is not automatic for Google, however, and an argument can be made that this imposes an excessive obligation to treat each petition on a case-by-case basis.

What’s so bad about this?

Losing the ‘landmark’ Right to Be Forgotten case in the UK sets the precedent that if Google refuses to delist the information, individuals may appeal to their local privacy court. This may have unfortunate consequences if it becomes a common recourse.

The most obvious, of course, is that the appeals process moves the delisting judgment to the courts, where Google and the petitioner will have to present their arguments. Every court case makes the process more burdensome, more expensive, and slower for everyone involved.

The Streisand Effect may also come into play: the attention drawn by attempting to hide information can make it more public instead.

This sets up a precedent of local government having primacy over what may be listed under the public interest. Sure, it might not sound so bad when it’s done by a court looking out for general public welfare. But the fact that Google can be ordered to delist by governments, even though its servers are located internationally, means that less benevolent, less democratic governments can exercise more complete information control to the detriment of their own citizens.

Perhaps the most necessary but worst consequence is that the sheer load of petitions received, and the need to evaluate each of them on a case-by-case basis, means that automation might be the only way out for Google. Unfortunately, as proven by many other systems for automatic verdicts (such as YouTube’s whole mess of a copyright takedown system), software is… dumb. As much as automation will make sending and processing petitions faster, it will very probably also make mistakes faster and more widespread, causing extra complaints and headaches for Google and all related websites.

The right to free speech includes the right to let the public know true information that serves their interests. There are those who fear this ruling may be a serious blow to the freedom of information that enables the Internet to be a platform for advocacy of justice and liberty. On the other hand, the Internet has a proven record of people being abused and harassed over lapses in their privacy.

This is not an easy question to solve. There are few conflicts as destructive as when good intentions come head to head. The best we can hope for is that Google and the EU reach a compromise that could serve as a useful template for other countries and their people’s expectations of safer online interactions.