Tech firms could be held liable for extremism and abuse

By Jane Wakefield

Technology reporter

Published 12 December 2017

Image caption: MPs have previously said that Facebook, Google and Twitter were not doing enough to tackle abuse and extremism (image copyright: Getty Images)

Google, Facebook and Twitter should be held liable for illegal and dangerous content on their platforms, an ethics body has said.

The BBC understands that the Committee on Standards in Public Life (CSPL) will urge the government to introduce new laws on extremist and abusive content.

Prime Minister Theresa May has said companies need to remove extremist content quickly or face fines.

One free speech advocate said it could turn Facebook into a "national censor".

Lord Bew, who chairs the CSPL, told The Times that he was normally "allergic" to proposing new legislation, but that the decision came out of frustration with how the big technology companies were currently addressing the issue.

The suggested legislation forms part of a report into intimidation in public life, due to be published on Wednesday, 13 December.

An earlier parliamentary inquiry, published by the Commons Home Affairs Committee in May, concluded that technology companies were "shamefully far" from taking action to tackle illegal and dangerous content.


In response to the news, Jim Killock, executive director of the Open Rights Group, said: "This is an attempt to make [Facebook boss] Mark Zuckerberg a national censor.

"Facebook and Twitter will censor legal material because they are scared of fines.

"They are the worst people to judge right and wrong."

In response to the suggested legislation, Twitter said: "Abuse and harassment have no place on Twitter.

"We're now taking action on 10 times the number of abusive accounts every day compared to the same time last year.

"We also now use technology to limit account functionality or place suspensions on thousands more abusive accounts each day."

Home Office analysis has suggested that so-called Islamic State (IS) shared 27,000 links to extremist content in the first five months of 2017 and that material remained online for an average of 36 hours.

Alongside extremist content, critics are also worried about how social-media companies deal with racist posts, fake news and child sexual abuse content.

Twitter has faced criticism for allowing accounts that appear to openly "promote" paedophilia.

And in a recent speech at Stanford University, former Facebook executive Chamath Palihapitiya revealed that he felt "tremendous guilt" for helping to create tools that were "ripping apart the social fabric of how society works".

He cited an incident in India where hoax messages about kidnappings shared on WhatsApp had led to the lynching of seven innocent people.

He said that his children were not allowed to use Facebook and recommended people took a "hard break" from social media.

Analysis by Rory Cellan-Jones, technology correspondent

It is an idea the technology giants have long fought, but one that is now gathering a political head of steam: that these powerful businesses are actually media companies rather than neutral technology platforms, and should face the kind of regulation imposed on a newspaper group or even a TV station.

But it's a long journey from a committee's report urging action from the prime minister, to framing legislation that would tame the likes of Facebook, Google and Twitter.

Politicians will have to consider some tricky questions:

What constitutes extremist material?

How quickly should it be removed?

Who is to oversee the new regulations?

Is this yet another job for an already over-burdened Ofcom?

Then, there will be opposition from civil liberties groups concerned that in effect it will be Google, Facebook and Twitter deciding what we are allowed to see.

There is also the chance that other countries with more draconian views of what is acceptable online material will take inspiration from any UK laws.

But supporters of tighter regulation will point to Germany's new social-media law, which has seen Facebook hire 500 more staff to spot extremist material and take it down to avoid fines.

Others may think the solution is for the old media to keep up the pressure - three UK newspapers have critical front pages about social media on Tuesday.

Over the past year, both Facebook and YouTube have had to respond to damaging stories about dangerous and extremist content on their platforms - journalists may be more effective than new laws in policing the technology giants.