In 1949, HL Mencken famously described Puritanism as ‘the haunting fear that someone, somewhere, may be happy’. Today, a similar terror haunts our supposed betters: the fear that someone, somewhere, might say what they think. This is especially true when it comes to the internet.

At present, in the UK, except for a few television catch-up and video-on-demand services, where there is already some regulatory oversight by Ofcom, what appears on the internet is largely uncontrolled. As in the world of print and other forms of speech, the internet is bound merely by the law of the land. Within the law, you can generally say and post what you like. But perhaps not for long. Back in September last year, Ofcom chief executive Sharon White made a speech to the Royal Television Society, saying that this level of online freedom would not do at all. There was a need to ‘level the playing field’ between TV and the internet, she said. By this, of course, she meant increasing controls on the internet, not rolling back regulation on broadcast media.

Then, earlier this month, the House of Lords Select Committee on Communications produced a disconcerting report called Regulating in a Digital World. It is low on specifics, high on pious denunciations of Big Tech, and wholly inimical to a free exchange of ideas online. Its aim, the committee says, is to guide the development of internet regulation and address what it sees as the ‘harms’ of the net. Like Ofcom, the Lords committee rejects the idea that the internet is different from other media. It proposes subjecting the net to a new super-regulator called the Digital Authority, which would report to the Cabinet Office. It would have a remit to regulate existing regulators and suggest further extensions to the reach of regulation. These regulations ‘should seek to achieve equivalent outcomes online and offline’. Not only that, there should be ‘accountability’, meaning ‘sanctions’ for non-compliance. These sanctions could apply to organisations, including the ‘third sector, businesses, public and regulatory bodies’, and even just ‘users’.

At one point, the report demands a ‘duty of care’ be imposed on any online services which host or curate user-generated content. This duty of care ‘would have to be upheld by a regulator with a full set of enforcement powers’. It is never made clear what this ‘duty of care’ would look like in practice, but we can infer that it would involve platforms removing content that is judged to be harmful by the regulator. The report suggests this should be enforced by Ofcom. All of this is deeply worrying. Introducing a new internet regulator, and indeed a new super-regulator, would result in increased complexity, expense and bureaucracy. Requiring platforms to police illegal or ‘harmful’ content proactively will undoubtedly make them overly cautious and likely to censor anything that might cause trouble.

The Lords committee report suggests that the spread of restrictions on free speech from one medium to another is both inevitable and desirable. The argument goes that if broadcasting can be regulated to avoid the showing of offensive or harmful content, so too can the internet. But the regulation of broadcasting was, in the past, arguably justifiable on the basis of limited capacity: there were only so many radio frequencies and television channels to go round. The internet, by contrast, has opened things up considerably. Why not remove, rather than add to, restrictions on freedom of expression? The problem is that regulation enthusiasts don’t even understand free speech, let alone support it. The Lords’ report simultaneously expects platforms to ‘respect’ our ‘human right’ to free speech, while calling for content to be censored. And in her speech to the Royal Television Society, Ofcom’s Sharon White claimed, apparently with a straight face, that ‘far from undermining freedom of expression, effective regulation can promote it’. This is the looking-glass world inhabited by our future regulators.