The UK government says it wants to stop people under 18 from looking at pornography, so it plans to require every porn site operating in Britain to collect some form of age verification, on pain of being blocked by the UK's Great Firewall.

This amounts to a giant database of the nation's porn-consumption habits, ripe for leaking and abuse. To make things worse, the obvious way to do this is to tie age verification to a payment method, which would produce a list of the nation's porn-consumption habits that you can sort by income.

The Open Rights Group published an extensive critique of this proposal, pointing out all the many, many ways it could go very, very wrong. In response, the UK government referred the group to Draft BSI Security Standard (Draft PAS 1296:2017), a standard apparently designed to prevent under-18s from buying knives.

On Twitter, Alec Muffett live-tweeted a close reading of the standard, which calls for "unique identifiers" to be assigned to each user of the system (Muffett: "THIS IS YOUR GOV'T PORN-BROWSING ID NUMBER") and goes downhill from there, citing 10-year-old statistics to validate its assumptions, acknowledging that databases like the one proposed leak like crazy, and so on.

The standard itself has been taken down by the BSI, which notes, "This draft is no longer available to be viewed. The comments that have been made on it have been collected, and will be considered by the committee responsible for the draft."

Meanwhile, as ORG points out, teens — and anyone else looking to download porn without reporting on it to Her Majesty's Government — will just use proxies to visit sites from non-UK IP addresses.

The Digital Economy Bill's impact on users' privacy should, under human rights law, be properly spelled out ("in accordance with the law") and designed to minimise the impact on people (necessary and proportionate). Failure to provide these protections thus places the entire system under threat of legal challenge.

User data in these systems will be especially sensitive, being linked to private sexual preferences, and could impact particularly badly on sexual minorities if things go wrong, whether through data breaches or simple chilling effects. This data is regarded as particularly sensitive in law.

The Government, in fact, has at its hands a system called Verify which could provide age verification in a privacy-friendly manner. The Government ought to explain why the high standards of its own Verify system are not being applied to Age Verification, or indeed why it is not prepared to use its own systems to minimise the impacts.

As with web filtering, there is no evidence that Age Verification will prevent even a slightly determined teenager from accessing pornography, nor reduce demand for it among young people. The Government appears to be looking for an easy fix to a complex social problem. The Internet has given young people unprecedented access to adult content, but it is education, rather than technical solutions, that is most likely to address the problems arising from this. Serious questions about the efficacy, and therefore the proportionality, of this measure remain.

A database of the UK's porn habits. What could possibly go wrong?

[Jim Killock/Open Rights Group]