New laws proposed to tackle social media companies streaming child abuse, extremism, terrorist attacks and cyberbullying have been welcomed by senior police and children’s charities.

Launched on Monday, the Online Harms white paper outlines what the government says are tough new laws for internet companies and the ability to enforce them.

The white paper, which was revealed in the Guardian last week, will legislate for a new statutory duty of care by social media firms and the appointment of an independent regulator, which is likely to be funded through a levy on the companies.


The “harms” that companies could be penalised for include failure to act to take down child abuse, terrorist acts and revenge pornography, as well as behaviours such as cyberbullying, spreading disinformation and encouraging self-harm. Senior social media executives could be held personally liable for failure to remove such content from their platforms.

The charity Barnardo’s says children are facing growing risks online. Photograph: Peter Byrne/PA

Reports of child abuse online have risen sharply over the past 15 years, from 110,000 globally in 2004 to 18.4m last year.

Speaking before the white paper’s launch on Monday, Jeremy Wright, the secretary of state for digital, culture, media and sport, said the white paper would not descend into “North Korean-style censorship”, even though the government planned to introduce new powers to allow regulators to block companies from operating in the UK.

Speaking on BBC Radio 4’s Today programme, Wright said censorship was not the goal of the white paper. Saying he understood the concern, he added: “First of all, this regulator must be … independent of government, able to exercise its own judgment.

“Secondly, it will be a regulator that will be expected to take into account freedom of speech, privacy, the promotion of innovation, in addition to the need to reduce harm, and it will have to balance all of those things in its judgments.

“But the third thing to say is that we are not interested here in journalists’ content. We are not interested in what journalists write. And of course, what journalists say on the radio … what they write in newspapers, what they say on television, is controlled in other ways.

“What we are talking about here is user-generated content. What people put online and companies that facilitate access to that kind of material. So this is not about journalism, this is about an unregulated space that we need to control better, to keep people safer.”

Rob Jones, a National Crime Agency director, said: “Industry does some great work but it has lots more to do and the technology already exists to design out a lot of preventable offending. Industry must block abuse images upon detection and prevent online grooming; it must work with us to stop livestreaming of child abuse; it must be more open and share best practice. And abuse sites must no longer be supported by advertising.”

Javed Khan, the chief executive of Barnardo’s, said two-thirds of vulnerable children supported through the charity’s child exploitation services were groomed online before meeting their abuser in person.

“Children in the UK are facing growing risks online – from cyberbullying to sexual grooming to gaming addiction,” he said.

“Barnardo’s has long called for new laws to protect children online, just as we do offline, so they can learn, play and communicate safely. The government’s announcement is a very important step in the right direction.”

The chief executive of the NSPCC, Peter Wanless, said: “For too long social networks have failed to prioritise children’s safety and left them exposed to grooming, abuse, and harmful content. So it’s high time they were forced to act through this legally binding duty to protect children, backed up with hefty punishments if they fail to do so.

“We are pleased that the government has listened to the NSPCC’s detailed proposals and we are grateful to all those who supported our campaign.”

In a joint foreword to the white paper, Wright and the home secretary, Sajid Javid, said it was time to move beyond self-regulation.

“Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough. Tech can be an incredible force for good and we want the sector to be part of the solution in protecting their users. However, those that fail to do this will face tough action,” they wrote.

The death of 14-year-old Molly Russell in 2017 has had a significant influence on the white paper. Her father launched a passionate campaign this year to highlight how widely self-harm and suicide content was promoted on Instagram, which he felt contributed to her taking her own life.

The Christchurch shootings in March also added pressure on politicians to act. The attacker used Facebook Live to stream the killings in progress, with thousands watching the attack as it occurred and millions more seeing the video as it was uploaded across the internet over the following day.

The white paper comes after the Australian government passed tough legislation this month to tackle the streaming of violent images on social media.

The new laws will apply to any company that allows users to share or discover user-generated content or interact with each other online, including social media platforms, file-hosting sites, public discussion forums, messaging services and search engines.