The European Parliament’s civil liberties committee (Libe) voted yesterday to back proposed legislation for a one-hour takedown rule for online terrorist content, which critics argue will force websites to filter uploads.

MEPs on the committee also backed big penalties for service providers that systematically and persistently fail to abide by the law — agreeing they could be sanctioned with up to 4% of their global turnover, per the Commission’s original proposal.

However, the committee rejected a push by the EU’s executive for the law to include a so-called ‘duty of care obligation’, under which Internet firms would have had to take proactive measures, including using automated detection tools. Critics have suggested this would create a general obligation on platforms to monitor content and filter uploads.

The Libe voted against a general obligation on hosts to monitor the information they transmit or store, and against them having to actively seek facts indicating illegal activity.

“If a company has been subject to a substantial number of removal orders, the authorities may request that it implements additional specific measures (e.g. regularly reporting to the authorities, or increasing human resources). The Civil Liberties Committee voted to exclude from these measures the obligation to monitor uploaded content and the use of automated tools,” it noted in a press release following the vote — which was carried by 35 votes to 1 (with 8 abstentions).

“Moreover, any decision in this regard should take into account the size and economic capacity of the enterprise and “the freedom to receive and impart information and ideas in an open and democratic society”,” the committee added.

Nonetheless, critics argue that a one-hour rule for terrorist takedowns will bring in filters by the backdoor and/or result in smaller websites being forced to operate on larger platforms to avoid having to comply with a stringent, one-size-fits-all deadline.

The Commission set out its proposals for new rules on online terrorist content removals last fall, though social media platforms have operated under an informal one-hour rule for taking down illegal content across the region for more than a year.

The draft law seeks to cast the earlier one-hour rule into formal legislation. But it would also apply to any Internet company hosting content that receives a takedown notice about terrorist content from a competent national authority — regardless of size. Hence it has attracted criticism for the burden it could place on smaller website operators.

The Libe committee did make some changes to the proposals aimed at helping smaller websites.

Specifically it decided that the competent authority should contact companies that have never received a removal order to provide them with information on procedures and deadlines — and do so at least 12 hours before issuing the first order to remove content they are hosting.

Commenting in a statement, Daniel Dalton (ECR, UK), EP rapporteur for the proposal, said: “Any new legislation must be practical and proportionate if we are to safeguard free speech. Without a fair process we risk the over-removal of content as businesses would understandably take a safety first approach to defend themselves. It also absolutely cannot lead to a general monitoring of content by the back door.”

However, tweeting after the Libe vote, one vocal critic of the draft legislation — Pirate Party member and MEP Julia Reda — argued the Libe’s 12-hour rule will do little to help website owners.

“That’s not even enough time to be able to switch off your phone over the weekend,” she wrote, dubbing the proposal “a catastrophe for work-life balance of small business owners and hobbyist websites”.

Only website owners that have never received a removal order before get an extra 12 hours to react – once. That’s not even enough time to be able to switch off your phone over the weekend. A catastrophe for work-life balance of small business owners and hobbyist websites. #TERREG — Julia Reda (@Senficon) April 4, 2019

There is also the question of how online terrorist content is defined.

The Commission proposal says it refers to material and information posted online that “incites, encourages or advocates terrorist offences, provides instructions on how to commit such crimes or promotes participation in activities of a terrorist group”.

“When assessing whether online content constitutes terrorist content, the authorities responsible as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made, including whether it is disseminated for educational, journalistic or research purposes, and the potential to lead to harmful consequences,” runs a Commission Q&A on the draft law from September.

The committee backed protections for terrorist content disseminated for educational, journalistic or research purposes, and agreed with the earlier Commission caveat that the expression of polemic or controversial views on sensitive political questions should not be considered terrorist content.

Though, again, critics aren’t convinced the legislation won’t result in chilled speech across the bloc as platforms and websites seek to shrink their compliance risk.

The European Parliament as a whole will vote on the draft law next week. After that, a new parliament — determined via forthcoming elections next month — will be in charge of negotiating with Member State representatives in the Council of Ministers, a process that will determine the final form of the legislation.