In May, the European Union’s top court made the controversial ruling that search engines are responsible for upholding a so-called “right to be forgotten,” compelling Google, Bing, Yahoo, and others to cease indexing and displaying links to web pages that are “inadequate, irrelevant or no longer relevant” to a person making a complaint. The ruling is not globally enforceable, of course; it applies only within the court’s jurisdiction.

Today, the Wikimedia Foundation, the organization behind Wikipedia, reported that its pages were among those being removed from Google’s indexes, and let it be known that it was not pleased. In a blog post, its legal team wrote (with emphasis mine):

As of July 18, Google has received more than 91,000 removal requests involving more than 328,000 links; of these, more than 50% of the URLs processed have been removed. More than fifty of these links were to content on Wikipedia.

That’s only the beginning of the problem: the only reason the Wikimedia folks even know about these removals is that Google tells them, of its own volition.

Search engines have no legal obligation to send such notices. Indeed, their ability to continue to do so may be in jeopardy. Since search engines are not required to provide affected sites with notice, other search engines may have removed additional links from their results without our knowledge. This lack of transparent policies and procedures is only one of the many flaws in the European decision.

Since search engines are under no obligation to let anyone know what they’re not showing users, users have no way of knowing what they’re missing, or even that there’s anything to miss. That’s the point of the new rule, really: to “erase the memory” of the Internet in order to uphold some twisted notion of “fairness.”

Wikimedia’s executive director Lila Tretikov, in a separate post today, explained the stakes:

[T]he European court abandoned its responsibility to protect one of the most important and universal rights: the right to seek, receive, and impart information. As a consequence, accurate search results are vanishing in Europe with no public explanation, no real proof, no judicial review, and no appeals process. The result is an internet riddled with memory holes—places where inconvenient information simply disappears.

A few days ago, I wrote about the concept of fairness versus compassion in software, an idea of Ben Brooks’. The gist was that “fairness” means making decisions with the lowest common denominator in mind, so as to cover every possible use case, while “compassion” means developing products and solutions case by case, each fulfilling a limited set of needs very well, at the expense of other needs, which other products serve. Fairness gets you Microsoft Word, full of features for every possible scenario but also byzantine and bloated; compassion gets you OmmWriter, simplified to a small set of tools that serve a small set of users extremely well.

It seems to me that the European Court was trying to be fair. There is a chance that a number of people may indeed be legitimate victims of content on the Web that is truly “inadequate, irrelevant or no longer relevant,” and that this content is genuinely harmful to them, to no greater purpose. And that sucks. But it wasn’t enough to simply act on the complaint in question (in this case, a Spanish man who wanted an auction listing from the 90s removed), so in all fairness, the Court decided that a blanket rule for such removals had to apply across the board, for every EU citizen, and for every search engine doing business there. To limit the scope of the case to one man wouldn’t be, to their minds, fair.

But that attempt at fairness has sacrificed compassion, compassion for the human beings who are now denied access to information once freely available, and who now have no way of knowing what it is they’re being denied. Opaque internal tribunals make the decisions on all of these cases, and as has been reported, there are at least tens of thousands of them, and likely far more. Compassion would have had individual cases of serious merit addressed, but fairness has harmed, potentially, everyone in the EU.

And the fact that it’s hitting Wikipedia of all sites so hard makes it all the more salient. Wikipedia, like the Web itself, has grown organically to become the very historical memory of the Internet. Like a real person’s memory, it is flawed and prone to gross human error, but it also has the benefit of human ingenuity and imagination, and we rely on it for better or worse. Google and other search engines are largely how we find everything on the Web, including Wikipedia. It’s like cutting off the neural connections to memories stored in your brain: they’re there, but you can’t access them, because one of those memories is of something someone else would rather you forgot.

And think: couldn’t the right to be forgotten apply to physical media? Should we prune print encyclopedias and start rummaging through libraries with pairs of scissors, hunting for information deemed “inadequate, irrelevant or no longer relevant”?

Perhaps Wikimedia’s public stance will help generate some movement against the “right to be forgotten.” It’s one thing to wave people off of stupid things one might have done at an auction 20 years ago. But blocking access to the memory of the Web might finally make people anxious.

UPDATE: In the comments, Tim Farley offers some important advice on this issue for anyone who publishes content on the Web: