Posted at 18:37 on 18 Apr 2017 by Pandora / Blake

Since the Digital Economy Bill passed to the House of Lords a few months ago, I’ve been following its progress closely. I’ve also been doing my best to intervene in the amendment of the Bill by lobbying the Lords - specifically, sending out a briefing on behalf of Backlash after the second debate, before the Bill was discussed in committee. Each time any transcripts have been published, I’ve read them - and started writing blogposts about each stage of the debate. But I’m not a lawyer, and the passage of a Bill through Parliament is a dense legislative process. Honestly, it’s taken all of my capacity to read, digest and comprehend the Hansard transcripts; I didn't also manage to write succinct, accessible reports on the changes as they happened.

I'm going to have to ditch those half-written drafts now, because the Lords have voted on their final amendments to the Bill, and passed it back to the House of Commons for approval. Section 3 on age verification for online porn has changed in some significant ways. In theory there is still the opportunity for MPs to disagree with the changes and propose amendments of their own; bills can be passed back and forth between the Houses until agreement is reached. But realistically, with a General Election just having been announced for 8 June, it's very unlikely that there will be time for an extended game of legislative ping pong. It's much more likely that the Bill will be rushed through in wash-up without any further changes. So this draft is probably the final shape of the forthcoming Digital Economy Act 2017.

I’ve spent a couple of days reading up on the Lords committee report and third debate, and I think I understand them as well as I’m going to. So here’s my overview of the final shape of the Digital Economy Bill.

Banned content

The first and biggest news concerns the type of pornographic content which the Bill bans from online distribution. One of my highest priorities has been campaigning against the use of the prohibited content guidelines used by the British Board of Film Classification (BBFC) to determine what can legally be shown.

The BBFC has been chosen by the government (without a particularly robust or transparent selection process) as the new online porn regulator, and a late amendment in the House of Commons added a clause to the Bill saying that online porn would therefore become subject to the BBFC’s classification guidelines. This was hugely problematic. The BBFC guidelines ban depictions of an awful lot of consensual sex acts: female ejaculation, vaginal and anal fisting, watersports, face sitting, full bondage with a gag, and BDSM that leaves lasting marks. Absurdly, most of these acts are completely legal to perform - which makes it very strange that merely placing a camera in front of them turns the resulting video into an illegal representation.

So the list of acts is questionable, and prohibits a lot of female-centric queer sex acts in a way that seems highly prejudicial. But from a legal perspective, the BBFC classification guidelines are a ludicrously out-dated basis for new legislation. That list is based on old CPS guidance on the Obscene Publications Act that is out of whack with current case law. Trials prosecuted under the OPA in the last few years have found films depicting certain of these acts to not be obscene; and yet the BBFC still treats them as such - even though their guidelines are meant to be based on UK obscenity law.

I’ve been actively campaigning against the use of these out of date guidelines - particularly in a Bill that carries sanctions as severe as unilateral blocking of a site from the UK. This was the subject of my Guardian article in November; and I’ve been backed by obscenity lawyer Myles Jackman, Backlash, the Adult Provider Network, and civil liberties organisations, academics, sex educators and sex workers rights campaign groups such as those who joined us at our Kink Olympixxx protest last October.

Well, I’m astonished and delighted to announce that on this point at least, our campaigning has been successful. The final version of the Digital Economy Bill removes the language of “prohibited content” which refers to the BBFC guidelines, and replaces it with “extreme pornographic content” (amendment 25YC). This refers to the definition of extreme pornography enshrined in Section 63 of the Criminal Justice and Immigration Act (CJIA) 2008, AKA the 'extreme porn law'.

This is interesting because the CJIA 2008 created a crime of possession, whereas the Digital Economy Bill criminalises distribution. The Digital Economy Bill makes it illegal for website owners to display banned content at all, or to display any over-18 content without age verification. The extreme porn law, by contrast, made it illegal to possess 'extreme porn' - and that law remains in force - but the new Bill doesn't also criminalise possession of over-18 content that hasn't been age verified. So we have a definition of 'extreme porn' being transferred from a law of possession to a new law covering publication.

Overall, the change is good news. The legal definition of extreme pornography is a shorter list of banned content than the BBFC guidelines. With thanks to Myles Jackman, who had to explain this to me about a hundred times before it started to sink in, here are the criteria an image or video must meet to count as 'extreme porn':

Legal definition of extreme pornography

Firstly, it must be pornographic. The definition of pornography in the CJIA isn't terrible: it's an image "of such a nature that it must reasonably be assumed to have been produced solely or principally for the purpose of sexual arousal". I think that's fair enough. However it's not down to the intention of the creator, but the interpretation of the court.

Secondly, it must be extreme. This means it is (interpreted as being) "grossly offensive, disgusting or otherwise of an obscene character" and it portrays "in an explicit and realistic way" any of the following:

1. An act threatening a person's life;
2. An act which results (or is likely to result) in serious injury to a person's anus, breasts or genitals;
3. An act which involves (or appears to involve) sexual interference with a human corpse;
4. A person performing (or appearing to perform) an act of intercourse (or oral sex) with an animal (whether dead or alive);
5. An act which involves the non-consensual penetration of a person’s vagina, anus or mouth by another with the other person's penis, a part of the other person’s body, or anything else.

(This fifth clause was added to the definition a couple of years ago, as part of the Criminal Justice and Courts Act 2015.)

With sincere thanks to Myles Jackman who has spoon-fed me much of the following analysis, here's what that means.

There's no problem with 3. and 4., because animals and cadavers can’t consent. But the wording of 1. is worryingly vague. Myles cites the example of an image of himself, naked, smoking a cigarette: such an act might arguably be both pornographic and portray an act that threatens his life. Guidance on the CJIA gives examples such as "depictions of hanging, suffocation, or sexual assault involving a threat with a weapon", which concerns me because I've had porn censored for depicting rubber LARP swords, under credit card regulations against "threat with a weapon" - regardless of the fact that rubber swords are neither deadly nor particularly threatening. Other consensual acts, such as asphyxiation or fear play involving knives or guns, are all potentially covered.

2. is likewise problematic; the definition of "serious injury" is seriously woolly. Think of consensual kink play such as needles or genital piercing; the guidance specifically gives "the insertion of sharp objects or the mutilation of breasts or genitals" as an example of porn that would be considered extreme. During the most recent “Obscenity Trial” (R v. Peacock 2012) a colorectal surgeon was summoned to the courtroom to testify that anal fisting was unlikely to result in “serious injury”. Without this specific sort of expert witness, the definition of serious injury becomes highly subjective, and at risk of ignorant speculation by those who have never experienced such acts.

As with the BBFC guidelines, this definition makes no allowance for the consent of the participants. With reference to 1., 2., and 5., both staged or simulated acts and consensual acts undertaken responsibly by informed, enthusiastic performers are treated exactly the same as recordings of a non-consensual assault. This erasure of performer consent is emblematic of the problems with the legislative discourse around porn, in which the agency of sex workers is routinely ignored and erased. Treating consent as irrelevant is the sort of attitude that enables and facilitates rape culture. It is a sign of our legislators' extraordinary ignorance of BDSM that images of consensual kink and images of actual sexual assault are treated the same in law.

A couple of final notes on the CJIA definition of extreme porn:

The word "obscene" here is "intended to convey a non-technical definition of that concept, distinct from the technical definition contained in the Obscene Publications Act (OPA) 1959, which is specifically geared to the concept of publication". This is ironic because the inclusion of this definition in the Digital Economy Bill does, in fact, specifically gear it to the concept of publication, thus creating a new and absurd legal contradiction.

This definition does not cover scenes that are contained within a longer work, if the whole work is not interpreted as being solely or principally produced for the purpose of sexual arousal. However, extracts can be considered pornographic if the explicit scene was extracted for the purpose of sexual arousal. For example, the film Casino Royale is not extreme pornography, despite containing a ballbusting torture scene of acts likely to result in serious injury to Bond's genitals; but an extract of that scene placed in a folder called "hot porn" on your computer would be criminal possession under the Criminal Justice and Immigration Act 2008. Likewise, an extract of that scene displayed on a ballbusting porn site, even behind age verification, will be a criminal publication under the forthcoming Digital Economy Act 2017.

So there are a number of issues with the definition of extreme porn that is now being introduced into the Bill. Still, despite these problems, the change from the BBFC definition of prohibited content to the legal definition of extreme pornography is a significant partial victory.

Consensual sex acts which are safer than sexual intercourse from a sexual health perspective, such as female ejaculation, watersports, fisting and bondage, will no longer be banned by the Digital Economy Bill. Porn sites which display such acts no longer risk being blocked at ISP level, as long as they put them behind age verification. This creates a much larger playing field for queer, feminist, fetish and DIY sites to operate in. It's a huge campaign win. All those who joined me in organising to have the BBFC guidelines struck from the Bill deserve to be proud.

Definition of pornographic material

A few months ago I highlighted the inclusion of audio-only content in the Bill's definition of 'pornographic material' as potentially problematic. So I was pleased to note that this has changed with amendment 25YC:

“material” means—
(a) a still image or series of still images, with or without sound; or
(b) a series of visual images shown as a moving picture, with or without sound.”

Audio-only content without any visual component is no longer mentioned. This makes a huge difference. It means audio porn won’t be subject to age verification, which gives pornographers greater leeway to create multimedia erotica; and, more importantly, enables us to make our text-based work accessible to visually impaired users without risk of criminal sanction. Text-based porn is not covered by the Digital Economy Bill (although it is covered by the Obscene Publications Act) and doesn't require age verification, so it would have created an ableist double standard for the Bill to criminalise site owners who provide audio recordings of text for visually impaired users. I’m delighted to see audio exempted from age verification in the latest draft of the Bill.

While we're on the subject, I'm going to take the opportunity to mention that Dreams of Spanking has a growing collection of audio spanking erotica available to stream and download - and I recently blogged about the process of creating audio description files for the London Porn Film Festival. And with perfect timing, Girl on the Net has just launched a brand new Patreon fundraiser to make it possible for her to record her filthiest blogposts as audio podcasts. Go, listen, and feast your ears - without fear that these sensory delights are about to be whisked away by government censorship.

User privacy and cyber security

I have to admit, when I read Lord Paddick's proposed amendment 25YN, I got my hopes up. It used the wording drafted by the Open Rights Group to improve the data security and user privacy of age verification solutions. Finally! This amendment would have solved several of the most egregious problems with age verification: it meant that age verification software solutions would have to be approved by the regulator, and would have to comply with a Code of Practice ensuring the confidentiality of internet users' data, making sure that AV solutions process no more data than necessary and do not compromise cyber security.

Infuriatingly, Lord Ashton of Hyde, the Minister for Culture, Media and Sport, opposed the amendment. He repeated arguments made in Committee that "we do not want the regulator to duplicate the role of the Information Commissioner’s Office" - that's the office responsible for upholding data protection. However, as both Lord Paddick and the Open Rights Group have pointed out, the Data Protection Act is designed for people voluntarily entering personal details online. Age verification for online porn will be mandatory, and since users won't have any choice about whether to enter their data or not, the protections should be more robust.

Baroness Jones of Whitchurch also argued that "the only provision for data protection breaches is for the ICO to be informed, rather than necessarily for it to act". Without this amendment, there is nothing in law obliging the Information Commissioner's Office to act in the event of data breaches resulting from age verification.

The Minister cited the government's draft guidance to the age verification regulator, claiming it already does enough to ensure user privacy. Here's the relevant paragraph from the draft guidance:

“The process of age verifying for adults should be concerned only with the need to establish that the user is aged 18 or above, rather than seeking to identify the user. The privacy of adult users of pornographic sites should be maintained and the potential for fraud or misuse of personal data should be safeguarded … The role of the Regulator should be to focus on the ability of arrangements to verify whether someone is over 18 and should be assured that age verification arrangements will protect a user’s privacy”.

As Lord Paddick replied, this is not reassuring; it is "only guidance" and "does not have teeth at all". The vague wording does not go into enough detail. If an age verification provider did not comply with this advice, the regulator would have little power to compel them. I can entirely imagine a situation in which an AV provider withheld a user's personal details from the pornographic site they wanted to access, and thereby claimed they were "protecting the user's privacy" - while retaining that data for themselves, mining it or trading it for profit. It is not at all clear that this sort of data retention and data sharing would count as "fraud or misuse".

Since there was no agreement, the amendment was put to a vote in the House of Lords. An exciting moment. Drumroll! Disappointingly, it was voted down by 199 votes to 74.

So, the Digital Economy Bill contains no requirement that the age verification solutions themselves be regulated. They can retain and share your data, and there’s no limit on how much (or what sort of) data they ask for.

My conclusion? The Lords don’t care about user privacy or cyber security. In the end, despite months of expert campaigning by the Open Rights Group and others, the Bill contains no safeguards against AV providers compiling databases of our porn browsing histories which will be deeply vulnerable to leaks, hacks and data breaches.

Independent appeals

Baroness Jones of Whitchurch moved three amendments (25J, 25K and 25P) which would have improved the accountability of the regulator, making the selection process and the appeals process more robust. Unfortunately, like the privacy amendment, these improvements did not make it into the final draft of the Bill.

The BBFC has been nominated by Government as the new 'notification' regulator for online porn. Their job will be to find sites that don’t have age verification in place, and notify them that they need to become compliant. If the site doesn’t comply, the BBFC can then notify any "ancillary service providers” - i.e. billing processors, advertisers, sources of funding, internet hosts and others that facilitate the site serving pornographic content - and ask them to withdraw services from the non-compliant website. If the ancillary service providers also refuse to co-operate, the BBFC's final recourse is to notify Internet Service Providers and get them to refuse to serve the site, so that no-one in the UK will be able to view it (unless they’re using Tor or another proxy/VPN).

It has been said that another regulator will also be chosen, separate from the BBFC, with power to impose financial penalties on non-compliant websites in the UK. Fines are supposed to be the next step up from notifying ancillary service providers, leaving ISP blocking as a last resort. But the “enforcement regulator” hasn’t been designated yet, and no-one knows who it’s going to be.

For the BBFC to take on responsibility for ISP blocking represents a significant mission creep of their duties. When they were initially selected they were not meant to have any enforcement duties at all; and yet the BBFC is now very close to becoming the sole regulator, removing the need for a separate enforcement body.

The BBFC have previously expressed uncertainty that they had the resources to handle enforcement, as it seemed that the task of notification was already going to stretch their capacity. (The Open Rights Group's hilarious New Government Jobs campaign riffs on the absurdity of this situation.) But following an exchange of letters with the Minister they have agreed to take on the additional task of notifying ISPs. So the BBFC are now handling the lion’s share of the work of enforcement, and the theoretical other regulator has still not been selected.

The draconian penalty of ISP blocking threatens to be liberally applied. In the case of UK websites, fines might provide some incentive to comply, but few site owners in other countries will respect financial penalties imposed on them by the BBFC. It's not even clear that fines will be an option when the regime first comes into effect: I was alarmed by the Minister's statement that "we will continue to consider the appropriate timing for introducing financial penalties for non-compliant providers and decide who the regulator for this will be", which makes it sound an awful lot like age verification could start being enforced by the BBFC with ISP blocking as the main available sanction, without any enforcement regulator or system for financial penalties yet in place.

In terms of the BBFC's powers, it's entirely possible that ancillary service providers based overseas won't respect their authority. Billing companies probably will, but overseas web hosts and advertisers are less likely to co-operate. The Earl of Erroll, who has been working closely with the Digital Policy Alliance throughout the drafting of this Bill, said in the third debate that "I think that the notice and take-down—the blocking—is the only thing that will work. Fines will not work; it is probably a waste of time even trying them." It therefore seems as though ISP blocking is turning from a 'last resort' into the most likely penalty - creating a hugely disproportionate regime of national censorship.

So the noble Baroness's amendments would have achieved several things. Firstly, they would have required there to be two separate regulators, ensuring that the BBFC doesn't end up with more than it can handle, or become judge, jury and executioner of the new system. Secondly, they would have stated that the enforcement regulator must be an independent body, separate from the BBFC and from government, and that all appointments to the regulator must be subject to fair and open competition. (I see this as a deserved dig at the somewhat under-the-table way in which the BBFC have been appointed.) Thirdly, they would have ensured that the appeals mechanism "be fully independent and not appointed, overseen and funded by the regulator". As someone whose business lived or died on the result of an independent appeal after a prejudicial verdict by the previous porn regulator, you can see why I was disappointed that these amendments weren't passed.

Interim report

One amendment that did pass was the requirement that the Secretary of State would issue a report in twelve to eighteen months, assessing the impact and effectiveness of the age verification policy in keeping children safe online, and preventing them from accessing online porn (25YW). It was put to a vote, and passed by a close margin of 179 to 159. It doesn't add a huge amount, but I feel that any opportunity to review the policy gives us a chance to potentially change it. I find it highly likely that the harm done by the new age verification regime will outweigh the good, so having an official process by which its impact and effectiveness will be assessed may force legislators to acknowledge that it isn't achieving what was intended. This might lead to a review of the law and potentially a more sensible, well executed piece of legislation - maybe even one that provides robust privacy safeguards and an independent appeals mechanism. Well, we can but dream.

Code of practice to address online bullying

Last but not least, a bit of a non-sequitur. As an aside to the details surrounding the age verification regime, which aims to impose an abstinence based approach to the difficult topic of young people's relationship with sexual media, some good work was added to the Bill at the last minute regarding online harassment and abuse. Amendment 25YR, moved by Baroness Jones of Whitchurch (with whose contributions I am increasingly impressed) requires a consistent code of practice to be delivered by the Secretary of State to commercial social media platforms within six months of the bill becoming law. The code of practice must require social media platforms to show a duty of care to ensure the safety of a young person using their service; to report and remove illegal posts on social media; to prohibit and remove cyberbullying; and to undertake to work with the education profession and charities to provide children with digital safety skills.

This amendment is the first bit of the Bill that starts to shift the responsibility for "protecting children" away from adult content producers, and onto sites that are used to send abusive messages and post bullying content. It seems right to me that social media platforms should have a legal obligation to take online bullying and abuse seriously, and to delete racist, homophobic, transphobic (etc) content when it is reported.

A lot of the anecdotal problems I hear regarding young people and online porn aren't about young people independently seeking out porn for themselves when they are ready, but rather about sexual images being non-consensually shared, often from young boys to young girls, in a way that makes the recipient feel harassed, violated and uncomfortable.

Slut-shaming is a massive problem faced by many young women, and the sexism of a culture that shames and humiliates girls if they are perceived as sexual (regardless of whether or not this perception is fair - or even whether they consented to sex or were raped), at the same time as tolerating the sexuality of boys and absolving them of sexual crimes, is outrageously unjust. A lot of bullying takes place online, and it can have a devastating effect on young lives. Given that the Digital Economy Bill is ostensibly meant to be about keeping young people safe online, I welcomed this amendment. I feel it will have a much more concrete impact than attempting to restrict access to pornography.

There’s a world of difference between young people who are experiencing their sexual awakening starting to look for porn that interests them, and young people being sent unwanted sexual comments or content. Stopping young people who want to look at porn won't do much to make them safer - but directly addressing sexual harassment and cyberbullying will. Any amendment to the Bill that admits this differentiation is progress, in my book.

This Bill is a ham-fisted attempt to address one of the most difficult problems of our age: that of how to create a social environment in which each person can experience their erotic awakening if, when and how they want to, without being violated, and without shame. I'm enormously interested in this problem, but imposing age restrictions increases shame, stigma and ignorance around sexuality, and does nothing to address the root causes of consent violations. We need to stop trying to brute-force technological solutions to social problems, and start to come up with creative solutions to prevent abuse. Doing more to address cyberbullying and online sexual harassment seems like a good place to start.

I crowd-fund my political work - click here to contribute.