The EU is not only cracking down on copyright-infringing content; there's a strong focus on terrorist material too. The EU Commission recently proposed new regulations that would require hosting platforms to remove terrorist content within one hour, or face consequences. This week member states gave the plan a green light, a plan that goes well beyond Article 13.

The ‘upload filters’ topic has been widely debated in the European Parliament this year.

While most attention has been focused on copyright-infringing material and Article 13, another filtering discussion has been going on at the same time.

This summer the EU Commission pushed forward a plan to require content hosting platforms including Google, Twitter, and Facebook to swiftly remove terrorist content when a national authority points it out to them.

The proposed regulation was accepted by the EU member states at a Council meeting earlier this week.

According to the published report, the EU countries in favor believe that there is a “need to achieve automatic detection and systematic, fast, permanent and full removal of terrorist content.”

This “terrorist content” can be reported by local police, for example, or another designated private authority. No court has to be involved in the process. What has many activists worried is that, like Article 13, the proposal can enable automated upload filters.

The proposed terrorist content filters would go further than Article 13 in that they require services to remove reported content within one hour. In addition, services would have to prevent this content from reappearing on their platforms.

If they fail to do so, the companies could face hefty fines and criminal liability.

It’s worth noting that the proposal is not restricted to large Internet platforms. It will apply to all hosting service providers that do business in the EU. This includes many smaller companies.

The French civil rights group La Quadrature notes that while it's easy for the large tech giants to comply, smaller competitors will be severely disadvantaged. All of these providers would need a point of contact that's available 24/7.

“The other actors will have no other choice but to close down their hosting services or (less likely but just as serious) to outsource the execution of their obligations to the giants,” the group writes.

The proposed legislation has triggered opposition from multiple sides, including various public interest groups, the UN's Special Rapporteur David Kaye, and several politicians.

Patrick Breyer, a Pirate Party candidate for next year’s European elections, warns that Internet censorship is not the way forward, especially if private actors get to decide what content must be removed.

“No court order is required to block content. This could put our freedom of expression and information in the hands of the Hungarian Ministry of the Interior or a local police officer in Romania, for example, which is unacceptable,” Breyer’s team warned previously.

The report that was agreed on this week only provides a “recommendation” which, as La Quadrature highlights, is a declaration of principle without legal consequences.

The green light from member states is the first step in a long process and the proposed language has yet to be negotiated with the European Parliament. If adopted, individual member states will decide what penalties are appropriate. These can reach 4% of a company’s global turnover.

Amid all the criticism, there is also support for the proposal. Austria's Minister of the Interior Herbert Kickl believes that it will help to protect European citizens.

“Online terrorist content has played a key role in almost every terrorist attack in Europe. It is our duty to protect citizens as effectively as possible. With this agreement, we send a clear signal to Internet companies that there is an urgent need for action,” Kickl said.