In mid-September, 2015, Mark Zuckerberg, the chairman and chief executive officer of Facebook, broadcast, from his company’s headquarters in Menlo Park, California, the first live video shown on his social-media platform. He wore a gray T-shirt. He spoke cheerfully about “our community.” He showed off “all kinds of cool stuff” on his desk. “At Facebook, no one has offices,” he said. He then walked into a conference room, where he sometimes takes meetings behind closed doors. He pointed to the walls. “It’s all glass,” he said. “We want to create this very open and transparent culture in our company.”

The following spring, Zuckerberg launched Facebook Live, a streaming-video service that made the amateur broadcasting system he had piloted available to all. Facebook Live, Zuckerberg said, would be “like having a TV camera in your pocket. Anyone with a phone now has the power to broadcast to anyone in the world. When you interact live, you feel connected in a more personal way. This is a big shift in how we communicate, and it’s going to create new opportunities for people to come together.”

This week, Facebook is reeling from the latest shock delivered by Facebook Live: an apparent murder recorded in the Glenville area of Cleveland last Sunday and then posted to the platform. More than a thousand people viewed the video before Facebook removed it. In it, a man whom police have identified as Steve Stephens approaches an elderly shopper, apparently at random. Stephens seems to be narrating the events he is recording for the benefit of a woman with whom he is angry; the video is presented as a cruel message to her. Stephens asks the shopper to speak the woman’s name. He then shoots him dead.

Cleveland police identified the victim as Robert Godwin, who was seventy-four. Godwin’s son told reporters that his father was a retired foundry worker who had nine children, fourteen grandchildren, and numerous great-grandchildren.

Sometime after 2 P.M. on Sunday, Stephens went on Facebook Live. He seemed to be speaking on the phone. “I fucked up, man . . . I’m at the point where I snapped . . . I shamed myself . . . And I’m about to keep killing until, until, they catch me. Fuck it. I posted it.” Stephens referred to killing thirteen people; Cleveland police said that they had no evidence of any deaths besides Godwin’s. As of Monday afternoon, the search for Stephens was still under way.*

“This is a horrific crime and we do not allow this kind of content on Facebook,” the company said in an initial statement. The choice of words was poor. An apparent murder staged for social-media distribution is not “content.” Of course, the statement was composed in a rush, under the pressure of events, but it was a reminder of how far from equilibrium Facebook remains in understanding and managing its peculiar role as a near-monopolistic, for-profit public square, as well as a people’s broadcaster and news distributor.

It is not a hidden truth that some violent and self-destructive people crave an audience. Broadcast television birthed the theatre of media-age terrorism half a century ago. Khalid Sheikh Mohammed imagined the September 11th attacks as a reality-television producer would—their political power was inseparable in his thinking from the fact that the images would be shown over and over on television. Since then, digital technology has democratized broadcast production—lowered the barriers to entry, as economists would put it. Even the Taliban, which banned cameras and music in its initial phase, now produces and distributes snuff videos of its guerrilla and suicide attacks. If it weren’t for digital production and its potential for worldwide distribution on social media, the Islamic State might be of marginal concern outside of the Arab world.

It is a puzzle, then, why Zuckerberg and his team ever believed they could prevent dark souls from hijacking Facebook Live, to their embarrassment. The company has made a fetish of its role as an open, neutral platform—emphatically not a content curator. In fact, Facebook’s content-management practices are ridden with mysteries and contradictions. The algorithms that shape users’ feeds are opaque. The company defends its unwillingness to employ humans to regulate content by pointing to principles of free expression, yet it readily coöperates with law enforcement and makes compromises that are difficult for users to evaluate.

Given its size and character, Facebook cannot, as a practical matter, review all amateur content before it is shared. This, along with the distinctive seduction of Facebook Live’s immediacy and global reach, has reduced the obstacles to violent stunts and horrific crimes such as the one Stephens allegedly carried out on Sunday. And that case is not an isolated one. Earlier this year, Chicago police opened an investigation into a Facebook Live broadcast of the rape of a fifteen-year-old girl, which was watched by at least forty people. Suicides are a recurring problem; just on Sunday, according to Indian police, a man used Facebook Live to broadcast his suicide by hanging.

Facebook has announced efforts to incorporate suicide detection and prevention tools into its platform—to detect possible attempts on Facebook Live or off-line before they occur. Yet the company reported that it had more than 1.2 billion daily active users worldwide at the end of 2016. Amid such scale and diversity, better software and detection tools might prevent some broadcast suicides or violence, but they cannot possibly stop all of it. How many misses is Facebook willing to accept?

Is the company prepared to consider that Facebook Live might stimulate violence that would otherwise not occur? If this is even a possibility (it would be a difficult research problem), what are Facebook’s ethical duties? It is true that the advent of social media cannot be undone, any more than television could be regulated in a way that would fully prevent terrorists from exploiting it. Yet every corporation is vulnerable—maybe a better word is accountable—when the choices it makes harm others, particularly when the harm occurs in pursuit of profit.

As a system or technology, Facebook Live does not present a radical departure from the platform’s original core of sharing and posting, but it does amplify the stakes, offering an invitation to performance that resonates precisely because, as Zuckerberg put it, “When you interact live, you feel connected in a more personal way.” Facebook Live also exploits the thrill of uncertainty, latent violence, and unpredictability that accompanies live human assemblies, and the broadcasting of those gatherings to live audiences. Yet if public assemblies in the physical world never had police on duty and maintained no requirements for permits, prior review, or the immediate application of the rule of law, they would not necessarily be safe or uplifting. They would, often enough, be bloody, dark, and dangerous.

Charlie Beckett, a former British broadcaster who now runs a journalism think tank at the London School of Economics, wrote a research paper on social media and terrorism that was published last fall. “The platforms are in a difficult place in terms of the competing pressures of corporate self-interest, the demands of their consumers for open access, the public interest involved in supporting good journalism, and fostering secure and cohesive societies,” Beckett wrote. “They are relatively young organizations that have grown quickly, and are still accreting institutional knowledge on these issues.”

That is a fair and restrained assessment, but Facebook cannot expect to plead growing pains or a lack of resources for much longer. At the end of last year, the corporation reported holding almost thirty billion dollars in cash and marketable securities; its annual profit exceeded ten billion dollars for the first time. Facebook can afford to slow down and take on more of the risks associated with curating content; the risks of not doing so are increasingly glaring. Its engineers might, in addition to their habitual writing of improved algorithms, consider the durable oath of a profession that has long wrestled with the kinds of ethical quandaries that arise from innovating in the pursuit of the greater good: first, do no harm.

*Update: There were reports Tuesday that Steve Stephens had killed himself after a police pursuit.