Europe's highest court issued a controversial ruling Thursday that could have staggering implications worldwide. The Court of Justice of the European Union held that Facebook and other social platforms are not only obligated to proactively identify unlawful content but also to block it worldwide if a single country's authorities demand it.

The ruling (PDF) stems from a case that began in Austria three years ago. A Facebook user posted comments about an Austrian politician, Eva Glawischnig-Piesczek, that Austrian courts found to be illegally defamatory. Glawischnig-Piesczek in 2016 wrote to Facebook Ireland, the company's EU headquarters, asking the company to delete the comments and limit access to them globally. Facebook refused, Glawischnig-Piesczek sued, and the results of the years of legal wrangling are out today.

A service is not liable for information it's hosting "if it has no knowledge of its illegal nature or if it acts expeditiously to remove or disable access" to the illegal content as soon as it becomes aware of it, the court said; the United States operates under a similar standard. The EU's directive on electronic commerce also "prohibits any requirement for the host provider," meaning a company such as Facebook, "to monitor generally information which it stores or to seek actively facts or circumstances indicating illegal activity," the court said.

But that directive does not preclude an EU member nation from ordering a service to remove or block access to content that is identical or equivalent to content that has been deemed unlawful in the past, the court ruled. Nations can require, the court said, the use of automated technologies and filters to make it happen. Crucially, the directive also does not prohibit EU member nations from requiring platforms to remove or block access to such information worldwide, "within the framework of the relevant international law."

The globe-spanning ruling takes the opposite approach to the one the court adopted less than two weeks ago in a different case, when it held that the so-called right to be forgotten under EU law does not require Google to make certain information inaccessible to the world beyond European borders.

Billions and billions...

Facebook has about 2.4 billion users worldwide, in all but three countries; about 385 million of them are in Europe. The ruling gives outsized power to the EU's 28 member states to set the terms of access and limit content for the other 2 billion Facebook users in the world, Facebook said.

The ruling "raises critical questions around freedom of expression and the role that Internet companies should play in monitoring, interpreting, and removing speech that might be illegal in any particular country," Facebook said in a written statement. "It undermines the longstanding principle that one country does not have the right to impose its laws on speech on another country."

The devil will also be in the details, Facebook noted. "In order to get this right, national courts will have to set out very clear definitions on what 'identical' and 'equivalent' means in practice. We hope the courts take a proportionate and measured approach, to avoid having a chilling effect on freedom of expression."

The ruling casts a fairly broad net over "equivalent" content. It finds that illegality does not necessarily refer to a specific phrase or sentence but rather "information conveying a message the content of which remains essentially unchanged." So content "essentially conveying the same message" but "worded slightly differently, because of the words used or their combination," as compared to the content that was initially deemed illegal, would have to come down.

“Filters can’t understand context”

Daphne Keller of Stanford's Center for Internet and Society published a white paper (PDF) in September analyzing the potential outcomes from such a ruling.

Aside from the glaring issue that speech and defamation laws vary wildly from one nation to another, the ruling "will indirectly but seriously affect Internet users' rights, including rights to privacy and freedom of expression and information," Keller wrote. As opponents noted, she added, "filters can't understand context. That means if text, images, or videos violate the law in one situation, filters will likely also block the same material in lawful uses like parody, journalism, or scholarship.

"Laws that let courts in one country reach across borders to take down expression protected in another, or laws that lead tech companies to erect digital borders, have consequences for everything from foreign relations to competition and trade," Keller observed.