The New Zealand mosque shooter leveraged social media channels to spread both a race-hatred manifesto and a horrifying live video of his killings, throwing a harsh light on online platforms' continuing role in propagating extremist violence.

Why it matters: Facebook, YouTube and the internet itself are inextricably bound up both with how the Christchurch killer seems to have arrived at his extremist views and with how he decided to act on them.

Details:

The shooter's 17-minute Facebook Live video, shot from a head-mounted camera, appears to have been taken down soon after it was posted. But versions of it continued to crop up on YouTube and Twitter for hours afterwards, often autoplaying on visitors' screens.

The video's perspective put viewers in the shooter's shoes in the manner of a first-person shooter game, but with the sickening awareness that it was real — and that it documented the murder of at least 49 people.

The killer's manifesto referenced white-supremacist memes and themes that have long circulated in far-right discussion spaces.

The whole operation seemed to have been "engineered for maximum virality," as Charlie Warzel put it in The New York Times.

What they're saying:

The New Zealand killer's media tactics represent a kind of white-supremacist mirror image of the approach ISIS crafted to spread its cause, NBC's Ben Collins pointed out.

Peter Kafka in Recode: "The platforms ... did exactly what they’re designed to do: allow humans to share whatever they want, whenever they want, to as many people as they want."

Be smart: Critics widely agreed that Facebook, YouTube, Twitter and other online platform operators have failed to rein in hate speech and violent extremism, but they differed on what the companies could be doing better.

Some argued that the platforms have simply become too vast to ever police properly.

Others, pointing to the online industry's relative success at limiting the distribution of copyrighted video content, maintained that the companies just haven't been given strong enough legal and financial incentives to do better.

Our thought bubble, per Axios' Sara Fischer: A Department of Homeland Security report a decade ago warned that right-wing extremism was a growing danger in the U.S., but criticism from conservatives led the department to withdraw the report rather than act on it.

Since then, social media platforms have grown into powerful new forces for radicalization.

Removing videos of atrocities after the event won't solve the problem as long as the online pipeline of new extremists keeps refilling.

The big picture: Ironically, the New Zealand story overshadowed an announcement from Facebook Friday that the social network has put a new system in place to police the problem of "revenge porn" postings.

Facebook says it now has 30,000 people working to moderate content and is building new AI-powered tools as well — but neither effort looks likely to protect it from some measure of blame for today's atrocity and those to come.

Under Mark Zuckerberg's recently announced plan for Facebook to favor more private, encrypted messaging, content like the New Zealand shooting video might propagate less widely and quickly. But it could also prove harder to root out entirely.

The other side: Meanwhile, social media services continue to provide powerful organizing tools for causes that many users support, like Friday's worldwide youth protests on climate change.

The bottom line: No one has figured out how to prevent this from happening again.