There are legitimate concerns about bias within social media. Tweets are hidden, innocuous content is flagged, and people are banned and de-platformed for arbitrary reasons. All of these things happen almost exclusively to conservatives. At the same time, there are also issues with posts containing graphic violence. And by that, I mean people livestreaming murders. The New Zealand mosque attack was livestreamed on Facebook, and the link stayed up for quite a while before it was taken down. The harrowing video, roughly a half-hour long, showed a white supremacist gunning down scores of Muslims. So, how can these tech giants respond more quickly to such posts? Well, they’re creating something that will certainly have conservatives on edge: a supreme court (via WaPo) [emphasis mine]:

Should Facebook take down a doctored video of Nancy Pelosi? Ban a conspiracy theorist like Alex Jones?

These are the kinds of content moderation quandaries that have been vexing the world's largest social network, and after years of controversies and missteps, the company says it can't make these decisions alone. That's why Facebook has been building a “Supreme Court” of independent experts that would weigh in on the company’s toughest content moderation decisions — and its hope is that it will one day govern decisions across Silicon Valley.

“It's just going to impact our platforms, but the hope absolutely is that at some point this is going to be an industry-wide body,” said Facebook public policy manager Shaarik Zafar at a panel on free expression online yesterday at the New America Foundation. “At that point you would have some type of consistency across platforms.”

Facebook’s plans highlight that without government regulation, technology companies have been left to develop their own solutions to policing troublesome content. Critics say the companies waited too long to address many issues related to harmful content, and the independent oversight board is one way that Facebook is going on the offensive after broad public backlash.

[…]

Francella Ochillo, a digital rights advocate and executive director of the nonprofit Next Century Cities, says that industry leaders — not policymakers — are going to have to develop best practices for themselves.

“Truthfully by the time everyone gets their hands around the issue of the day, we’ve already moved on to a new issue,” Ochillo said on the same panel. “Right now I don’t think the government has the expertise or capacity.”

She also said she wasn’t sure that government regulation would help address online free expression problems.

That’s why creating an industry-led body might be one solution. Facebook says the oversight board will operate with complete independence, and Zafar said the company plans to follow the recommendations of the board even when it disagrees with them.

But Facebook's promises have been met with skepticism from critics who see the board as a public relations move or a means for Facebook to shirk responsibility for making tough decisions about content it hosts.

Facebook recently asked other tech policy experts to weigh in on its plans.