The UK's Entire Approach To 'Online Harms' Is Backwards... And No One Cares

from the this-is-not-a-good-idea dept

Back in April, the UK (with Theresa May making the announcement) released a plan to fine internet companies if they allowed "online harms" in the form of "abhorrent content." This included "legal" content. As we noted at the time, this seemed to create all sorts of problems. Since then, the UK has been seeking "comments" on this proposal, and many are coming in. However, the most incredible thing is that the UK builds so many assumptions into its plan that the comments it's asking for amount to "how do we tweak this proposal around the edges?" rather than "should we do this at all?"

Various organizations have been engaging, as they should. However, reading the Center for Democracy & Technology's set of comments to the UK in response to its questions is a really frustrating experience. CDT knows how dumb this plan is. However, the specific questions that the UK government is asking don't even let commenters really lay out the many, many problems with this approach.

And, of course, we just wrote about some new research that suggests a focus on "removing" terrorist content has actually harmed the efforts against terrorism, in large part by hiding from law enforcement and intelligence agencies what's going on. In short, in this moral panic about "online harms", we're effectively sweeping useful evidence under the rug to pretend that if we hide it, nothing bad happens. Instead, the reality is that letting clueless people post information about their dastardly plans online seems to make it much easier to stop those plans from ever being brought to fruition.

But the UK's "online harms" paper and approach don't even seem to take that possibility into account -- instead they assume that it's obviously a good thing to censor this content, and that the only real questions are who has the power to do so and how.

The fact that they don't even seem to be open to the idea that this entire approach may be counterproductive and damaging suggests that the momentum behind this proposal is unlikely to be stoppable -- and we're going to end up with a really dangerous, censorial regulation, with little concern for all the harm it will cause, even when it comes to actual harms like terrorist attacks.


Filed Under: content moderation, harm, online harms, terrorist content, uk