By Rowland Manthorpe, technology correspondent

In the aftermath of the Christchurch shooting, as politicians and tech executives argue over how to control the spread of far-right propaganda on social media, one word has cropped up again and again.

It appeared in Theresa May's statement at yesterday's international summit, convened in Paris by Emmanuel Macron and Jacinda Ardern.

With skill, will and unshakable moral courage, Ms Ardern has been campaigning to prevent a repeat of the Christchurch attack, in which the livestream captured by the shooter ricocheted around social media, evading the platforms' attempts to remove it. To that end, she'd gathered Facebook's Nick Clegg and Twitter's Jack Dorsey, as well as political leaders from around the world.

Speaking to them, the prime minister reached back to 2017, to her response to the terrorist attacks in Westminster, Manchester and London Bridge. And she used the word of the moment: "Daesh."

"I called for a much greater co-ordinated global response to fight back against Daesh propaganda online," Mrs May told the audience, explaining that, since then, it had dropped to the level of 2015.

"That," she concluded, "shows us what is possible."

It should. But it doesn't.

Image: Jacinda Ardern and Emmanuel Macron launched the 'Christchurch Call' initiative to tackle the spread of extremism online

This is, on the face of it, surprising, because the success reported by Mrs May is real and significant. As recently as 2015, Daesh - also known as ISIL, Islamic State, or ISIS - was an unstoppable online propaganda machine. Its material popped up on Facebook, YouTube and Twitter, selling an alluring vision of an Islamically pure state pledged to oppose the hated West.

Those taken in by its message included fifteen-year-old Shamima Begum, who was sucked into the murderous death cult by videos on YouTube. "At first it was nice," she told my colleague John Sparks when they spoke earlier this year, "it was like how they showed it in the videos".

Shamima Begum told Sky she was 'just a housewife' during her time with Islamic State

Of course the reality was horrifically different - but, from inside her filter bubble, Begum would have found it hard to tell.

At first, the tech companies refused to take the issue seriously. Then the beheading of American journalist James Foley prompted them to build up their defences. They invested in AI-based programmes that detected ISIS activity. They used their personalised advertising systems to target people at risk of radicalisation. And, little by little, their efforts were successful.

According to the best numbers we have - which, admittedly, come from the platforms themselves - Facebook now takes down 99.5% of ISIS and al-Qaeda propaganda before anyone reports it. Every three months, it routinely removes around three million pieces of this radicalising content.

The problem hasn't been solved, but it's definitely being tackled.

Image: Armed police took to the streets of Christchurch after the mosque shootings

So why won't the same tools and techniques work against the far-right?

Simply put, because there is no consensus on how unacceptable the far-right is. Whereas ISIS is, for most political actors in the West, entirely beyond the pale, far-right ideology is, if not tolerated, then definitely stomached.

"Some of the things you see on (fringe message board) 8chan are extreme, but lesser forms are repeated by politicians in Eastern Europe and the American right," social media researcher Elliot Jones told me. "If you were to crack down - to say, this is uniformly bad - you might take out some people who are part of the political mainstream."

The issue of accidental bans is crucial. Last month, tech site Motherboard reported that Twitter wasn't taking the same aggressive approach to white supremacist propaganda as it did to ISIS content, because it feared it would end up banning Republican politicians.


Twitter denied the report, but it fits with what we know about how this technology works. Although tech companies like to pretend their algorithms are perfect, in reality any complex system will create plenty of collateral damage. The question is: do we think that damage is worth it?

Conveniently, the tech companies do not include in their figures the number of innocent accounts they flag by mistake - but no doubt there are many. In the case of ISIS content, picking up, say, Arabic-language broadcasters is deemed an acceptable cost. With white supremacy, clearly, it is not.

In other words: much as it might suit politicians and tech execs to pretend otherwise, this is not a technological issue, but one defined by societal norms.

Image: Theresa May called for a 'much greater co-ordinated global response' to ISIS online propaganda in 2017

None of this should let the tech firms off the hook. When it comes to moderation, they are far too happy to drag their feet.

But, with some notable exceptions, governments aren't speaking to them in a clear voice. Until they do, we are unlikely to see any real action.

To see this play out, you only have to look at the Paris summit, whose plan of action was named the Christchurch Call.

Despite being over three pages long, it lacked a single concrete commitment. Governments promised to "counter the drivers of terrorism" and "consider appropriate action". Tech companies said they were "seeking to prevent" the spread of terrorist content.

Most importantly, the document also lacked the signature of the one man with the power to rein in the tech firms: Donald Trump.

In a statement, the White House said it was concerned about the Christchurch Call's compatibility with US law. But surely it's no coincidence that the US president is the most prominent example of the confusing double standard over what can and can't be said on social media, depending on whether you are brown or white.

Sky Views is a series of comment pieces by Sky News editors and correspondents, published every morning.

Previously on Sky Views: Beth Rigby - Will Theresa May be remembered as one of UK's worst prime ministers?