The UK government is preparing to establish a new internet regulator that would make tech firms liable for content published on their platforms and have the power to sanction companies that fail to take down illegal material and hate speech within hours, BuzzFeed News can reveal.

Under legislation being drafted by the Home Office and the Department for Digital, Culture, Media and Sport (DCMS) due to be announced this winter, a new regulatory framework for online “social harms” would be created.

BuzzFeed News has obtained details of the proposals, which would see the establishment of an internet regulator similar to Ofcom, which regulates broadcasters, telecoms, and postal communications.

Home secretary Sajid Javid and culture secretary Jeremy Wright are considering the introduction of a mandatory code of practice for social media platforms and strict new rules such as "takedown times" forcing websites to remove illegal hate speech within a set timeframe or face penalties.

Ministers are also looking at implementing age verification for users of Facebook, Twitter, and Instagram.

A promise to regulate the internet was buried at the back of the Conservative manifesto for last year's general election. "Some people say that it is not for government to regulate when it comes to technology and the internet," the manifesto stated. "We disagree."

The new proposals are still in the development stage and are due to be put out for consultation later this year. A spokesperson for the government confirmed it is "considering all options", including a regulator.

The planned regulator would have powers to impose punitive sanctions on social media platforms that fail to remove terrorist content, child abuse images, or hate speech, as well as to enforce new regulations on non-illegal content and behaviour online.
What counts as non-illegal content subject to the new rules will be the subject of what is likely to be a hotly debated consultation. The regulator would be ultimately accountable to parliament.

BuzzFeed News has also been told ministers are looking at creating a second new regulator for online advertising. Its powers would include a crackdown on online advertisements for food and soft drink products that are high in salt, fat, or sugar.

Currently, online advertisements are regulated by the Advertising Standards Authority.

Government sources have indicated that frustration with the tech industry's failure to take voluntary action on online safety has led ministers to pursue a mandatory approach.

When the previous culture secretary, Matt Hancock, invited 14 tech companies in for talks on online safety earlier this year, only four firms turned up.

Ministers have concluded that the voluntary approach only achieved progress with a few tech giants over specific issues such as terrorist content, and that new laws are required to force smaller and medium-sized social media platforms to take action against a wider range of content.

This will involve producing a single legal framework for internet safety and increasing the legal liability for sites that provide a platform for illegal content.

Social media companies will be forced to sign up to a code of practice and new requirements to assist the police in investigating criminal activity online.

The government is looking at legislation passed in Germany last year that requires social media platforms to remove illegal hate speech within 24 hours or face fines of up to €50 million.

The German law was vociferously opposed by human rights groups and industry representatives who warned it would lead to censorship and an unmanageable burden on smaller websites.

It encountered problems in its first few months when a satirical magazine and a political street artist had their content blocked.

The government is set to introduce age verification for social media platforms, after ministers raised concerns that children are currently only required to tick a box saying they are over the age of 13.

DCMS has previously indicated that it would seek to impose mandatory transparency reports on social media platforms and implement the recommendations of the Law Commission review into online communications.

It has also introduced legislation to block pornography sites that refuse to use age verification controls.