Just before the summer holidays, on 5 July, the European Parliament (EP) Plenary decided to reject the negotiating mandate of the EP’s Legal Affairs (JURI) Committee for talks with the Council on the copyright reform. This decision means that all 751 Members of the European Parliament (MEPs) now get to have their say on this important issue. MEPs are scheduled to vote on this file during the 12 September Plenary session, and the deadline for amendments was set for 5 September (13h CET).

With this deadline around the corner, MEP Axel Voss (EPP, Germany), the JURI Committee Rapporteur, circulated at the end of last week a set of compromise proposals for Articles 11 (ancillary copyright) and 13 (the #CensorshipMachine). His ambitions are two-fold: (1) he’s trying to find a majority in order to limit attempts by other MEPs to push their own alternative amendments on these Articles and thereby hijack the process, and (2) by making a ‘peace offering’ on these Articles, he’s trying to convince colleagues to keep the lid closed on the other Articles and to focus only on these two core issues. However, MEP Voss is actually continuing the strategy he used during the JURI negotiations (and for which the Estonian and Bulgarian Council Presidencies also became renowned): making things worse with every version.

On Article 13, MEP Voss considered that it’s his turn now to go beyond maximalist, as he opted to present the nuclear option: full platform liability, without any user safeguards. In a tweet announcing his proposal, he tried to focus on the fact that his proposal no longer refers to the need for platforms to take any measures (i.e. an upload filter). However, he neglects to highlight that, in practice, making platforms directly liable leaves them no other choice than to apply filtering mechanisms, coupled with terms and conditions allowing them to arbitrarily remove content, in order to avoid any liability risks. Moreover, by stripping out all user safeguards, he actually succeeded in making things even worse. Luckily, various MEPs, such as MEP Daniel Dalton (ECR, UK) and MEP Julia Reda (Greens/EFA, Germany), quickly saw through this scheme.

MEP Jean-Marie Cavada, the French ALDE Group’s Shadow Rapporteur, saw this as an opportunity to pretend to ‘save the day’ and push forward his alternative version of Article 13. MEP Cavada’s proposal is, however, equally bad, just in a more subtle manner: while it seems more reasonable at first glance, on closer reading the true nature of the beast emerges, and in its outcome it lies very close to MEP Voss’ proposal, namely the direct liability of platforms for all uploaded content. More detailed comments on both proposals below.

On Article 11, MEP Voss hasn’t budged an inch: he is still holding on to the ancillary copyright solution, instead of moving towards the alternative on the table, namely a presumption of rights. His stubbornness on this matter is quite shocking, seeing that in early June over 100 MEPs, from different political groups, addressed an open letter to him opposing such a new publishers’ right. Moreover, let’s not forget the overwhelming opposition from consumer groups, small publishers, civil society, the business community, over 200 European copyright legal and academic experts, and even an independent study [PDF] conducted for the JURI Committee (our full story on it). More detailed comments on MEP Voss’ Article 11 compromise amendment below.

The proposals on the table clearly show that the fight to #SaveYourInternet is far from over, so get in touch with your MEP today!

Article 13 – The #CensorshipMachine

MEP Axel Voss’ Compromise Amendment Proposal

Liability: The proposal implies that “online content sharing service providers perform an act of communication to the public” (§1). As a result, platforms would be primarily liable for the content on their platforms, unless they conclude licences with rightholders, which would only cover uploads by users for non-commercial purposes (§2). Although his proposal no longer references the need to apply measures, in practice this leaves platforms no other choice than to apply filtering mechanisms, coupled with terms and conditions allowing them to arbitrarily remove content, in order to avoid any liability risks.

User protections: All user protection measures have been stripped.

MEP Jean-Marie Cavada’s Compromise Amendment Proposal

Liability: The proposal starts from exactly the same premise as the Voss text, namely that OCSSPs perform an act of communication to the public and are therefore directly liable for the content uploaded by their users (-§1a). The only way to avoid this is, as with Voss, to conclude licences for everything you can think of and filter the rest, as rightholders will not be willing to license everything (e.g. full-length movies) or cannot always be found easily. This liability is absolute: it applies to the entire service as soon as the service falls under the OCSSP definition, regardless of whether there are any infringing uploads. It remains in place even if no licensing agreement could be concluded (§1b), so even if a platform takes all the steps to comply, it remains liable (unlike under the Council proposal). In other words: damned if you do, damned if you don’t.

Requesting rightholder: The idea would have been valid if the liability regime had not been set as all-encompassing from the start. In other words, even if a rightholder does not request a licence, the fact that the OCSSP performs an act of communication to the public renders it liable anyway.

Scope: All rightholders are covered, which includes press publishers under Article 11. Filtering news snippets, and potentially links under certain circumstances, is an unfathomable task.

Carve-outs (-§1a): Only non-commercial users would be covered by licence agreements.

“Ensure the non-availability of” (§1b): This creates a strict obligation (not “reasonable measures” as in e.g. UPC Telekabel) for services to make sure no infringing content is ever shown on their platform. It remains an ex-ante obligation: make sure the content does not even appear.

“Or remove expeditiously from their services works or other subject matter identified by rightholders” (§1b): The fake notice proposed here applies to a notification of works, not of infringing uses of those works. In other words, this is simply a rightholder handing over its catalogues to a platform and telling it to make sure they do not appear on its service, which goes against the requirement set by case law that a notice should be precise and adequately substantiated. These criteria even appear in the European Commission’s recent Communication and Recommendation on Fighting Illegal Content Online.

So-called user safeguards: With the setting above, the user safeguards will be useless in practice, as OCSSPs are liable for all content on their platforms and for the effectiveness of the measures taken. This implies not only filters, but overly zealous ones. Stating that those measures should not filter or block legitimate, non-infringing uploads is a nice touch but naïve at best. No counter-notice mechanism is foreseen to give users some protection for their freedom of speech and to curb abusive notices. Similarly, no measures are foreseen against rightholders abusing this system.

Transparency: The public, users and public authorities are not entitled to any transparency: only rightholders are.

Measures: These are not specified and may vary from Member State to Member State, to the detriment of any hope of an online single market.

Human review (§2): Aside from implying ever higher compliance costs, the recent cases of ‘human moderation’ by private companies show its limits. In a democracy, human review of legal infringements should be left to the courts.

Member State intervention: There is a likelihood of further fragmentation of the single market, as each Member State invents its own flavour of Article 13; a fragmentation which will be harder to cope with for smaller players.

Article 11 – Ancillary Copyright for Press Publishers (also referred to as the ‘Link Tax’)

MEP Axel Voss’ Compromise Amendment Proposal