Following the terrorist attack against two mosques in Christchurch, New Zealand, which left 50 people dead, Facebook says it blocked 1.2 million videos of the massacre from being uploaded to its platform and removed a further 300,000, for a total of 1.5 million within the first 24 hours of the attack.

This means, however, that some 300,000 copies of the 17-minute video made it onto the platform and remained viewable for a period estimated at up to ten hours before Facebook pulled them from the network, despite numerous complaints. The Facebook video was viewed 23,000 times in an hour and was only taken down after 239,924 people had seen it.

The attack, carried out by suspected gunman Brenton Tarrant, was live-streamed on Facebook. The disturbing video showed the gunman executing victims with assault-style rifles covered in symbols and quotes commonly associated with the white supremacist movement online.


Speaking to CNET on Friday, Facebook said it had also removed posts that had praised or shown support for the shootings in the immediate aftermath.

Video-sharing platform YouTube, which is owned by Google, said it was also “working vigilantly to remove any violent footage.”

In response, UK Home Secretary Sajid Javid tweeted: “You really need to do more to stop violent extremism being promoted on your platforms. Take some ownership. Enough is enough.”

Although Facebook has recently stepped up its efforts to take down toxic content, hiring additional moderators and using artificial intelligence to monitor its platform, many feel it is still falling short.

Damian Collins, chairman of the Commons culture committee, said the attack appeared to be “designed for social media,” and that it demonstrated why there had to be “statutory regulation of the distribution of content online through social networks.”

Collins said: “It’s a viral contagion spread through social media, helped by their algorithms. The firms need to carry out a major audit into who was sharing this film and how it was shared. Groups have deliberately spread it and those accounts should be closed down.”

Nicholas Thompson, editor-in-chief of Wired magazine, said: “The problem is when you connect humanity, the way Facebook has done, the way other tech platforms have done, you get all of humanity and there’s a lot of terrible things that happen and it gets amplified.”

Prior to carrying out the attack, the gunman appears to have posted a 73-page manifesto on 8chan, an anonymous sharing platform that has previously been linked to child pornography, with the goal of reaching a wider audience.


Prime Minister Theresa May said the Government expected tech firms to “act more quickly to remove terrorist content.”

A spokesman for the PM said: “There should be no safe spaces for terrorists to promote and share their extreme views and radicalise others.”

May, who sent her “deepest condolences” to New Zealand, said: “There can be no place in our societies for the vile ideology that drives and incites hatred and fear.”
