An editor watches the live-streamed video that captured the aftermath of the police shooting of Philando Castile. (Agence France-Presse via Getty Images)

You’ve heard of the accidental tourist. Now we have the reluctant news media.

I’m talking about Facebook, Twitter and YouTube, among others. With the advent of live-streaming options — Facebook Live and Periscope, primarily — their already huge influence in the news universe has taken another stunning leap.

When Diamond Reynolds logged on to Facebook after her boyfriend, Philando Castile, was shot by a police officer Wednesday in Falcon Heights, Minn., her first words as she started recording were “Stay with me.” Millions did.

On the strength of that live video, Minnesota’s governor brought in the Justice Department to investigate what might otherwise have gone unquestioned as a justified police action.

I call that news.

But Facebook doesn’t see itself that way, even though two-thirds of its 1.6 billion users get news there — and even though all of them can now be citizen journalists with live-broadcast cameras in their pockets.

In a recent blog post, Facebook executive Adam Mosseri reiterated Facebook’s consistent position: “We are not in the business of picking which issues the world should read about. We are in the business of connecting people and ideas — and matching people with the stories they find most meaningful.”

Still, crucial decisions are constantly thrust upon Facebook. And they aren’t too different from those that news editors have always made: Should the newspaper print a photo of an assassinated ambassador? Should a TV network air a terrorist beheading?

Some social-media equivalents: Twitter decided to suspend 125,000 accounts that were associated with recruiting terrorists. YouTube chose to take down, and then put back up, video of Syrian security forces torturing a teenage boy. Reynolds’s video was removed from Facebook for about an hour after it was posted and then restored.

“There’s clearly an editorial process in which Silicon Valley companies are deciding what to put back up,” often in response to protests from viewers, said Zeynep Tufekci of Harvard’s Berkman Center for Internet and Society. This sometimes happens after users have flagged an item as offensive, resulting in its being removed in the first place. She thinks that is what happened with Reynolds’s video. Facebook has called it a technical glitch.

And, Tufekci told me, human involvement is always necessary: “The world’s best robot can’t do it.” (Adrian Chen wrote in Wired about “moderation warehouses,” where thousands of low-paid workers worldwide make these determinations, as they view the worst of human nature.)

Facebook and others undoubtedly are struggling with what their outsize power has wrought. That can’t be easy, as events keep coming ever faster.

It was a big deal last month when Twitter’s Periscope provided live coverage of a congressional sit-in over gun control after C-SPAN cameras were turned off. And only one day after Reynolds’s video rocked the world, Facebook Live captured the scene in Dallas, where a sniper had mowed down police officers.

When Mark Zuckerberg unveiled Facebook Live globally early this year, he spoke (by live video, of course) about its potential. Curiously, he used the word “raw” — and spoke about the opportunity to observe baby bald eagles or a friend’s haircut.

Far more raw, though, was the 10-minute live stream following Castile’s shooting, with the sound of a policeman’s screamed profanity, the clink of handcuffs, the victim’s blood-drenched shirt, and the surreally moving words of a 4-year-old girl comforting her mother: “It’s okay, I’m right here with you.”

These tools, no doubt, can bring great good. They certainly bring great challenges, too, including (as CNN’s Hope King wrote recently) trying to stop criminals and terrorists from live-streaming their deeds.

What’s more, they increasingly put social media companies in a position that traditional news companies have long resisted — becoming an arm of law enforcement in criminal investigations. The traditional press sees itself as a counterweight to government, as the founders intended; social media platforms aren’t having any of that.

Facebook has for years complied with subpoenas, giving courts or law enforcement detailed information about its users — their friends, locations and posts. The company makes no secret of that.

And so, while Facebook may seem to be mostly about your cousin’s Cape Cod vacation, and Twitter may seem to be mostly about where journalists dined in Perugia, there are far bigger issues afoot. It’s no exaggeration to say that civil liberties and free speech are among them.

Facebook spokeswoman Christine Chen pointed me Friday to the company’s published “community standards,” and assured me that serious discussions have taken place ever since Facebook Live’s launch: “We’re being thoughtful about this.” She wouldn’t talk about the “technical glitch” that Facebook cited after taking down and restoring Reynolds’s video.

Yes, social media platforms are businesses. They have no obligation to call their offerings “news” or to depict their judgments as editorial decisions. They are free to describe their missions as providing a global town square or creating a more connected globe.

But given their extraordinary influence, they do have an obligation to grapple, as transparently as possible, with extraordinary responsibility.

For more by Margaret Sullivan visit wapo.st/sullivan