The Electronic Frontier Foundation, Human Rights Watch, and over 70 other groups have asked Facebook to adopt a clearer “due process” system for content takedowns. An open letter to Mark Zuckerberg urges him to enact the Santa Clara Principles, a series of moderation guidelines that academics and nonprofits put forward earlier this year. “Civil society groups around the globe have criticized the way that Facebook’s Community Standards exhibit bias and are unevenly applied across different languages and cultural contexts,” the letter says. “Offering a remedy mechanism, as well as more transparency, will go a long way toward supporting user expression.”

The Santa Clara Principles are a minimum standard for online moderation

The Santa Clara Principles call for a clear explanation of why a post or profile violates a platform’s rules; an accessible appeals system that includes review by a human moderator; and regular transparency reports detailing how much content has been taken down, as well as how many cases were successfully appealed. They’re intended to outline the minimum standards of good platform moderation, encouraging companies to balance the rights of individual users with the need to remove harmful material.

“While Facebook is under enormous — and still mounting — pressure to remove material that is truly threatening, without transparency, fairness, and processes to identify and correct mistakes, Facebook’s content takedown policies too often backfire and silence the very people that should have their voices heard on the platform,” said the EFF in a press release. Other signatories include Article 19, the Center for Democracy and Technology, Ranking Digital Rights, PEN America and Canada, and the American Civil Liberties Union.


Facebook created a limited appeals system earlier this year, allowing users to appeal cases involving nudity, sexual activity, hate speech, and graphic violence. A human moderator reviews the post within 24 hours and responds to the user, and the company has promised to expand the system in the future. “This is a positive development, but it doesn’t go far enough,” reads the letter, because “Facebook users are only able to appeal content decisions in a limited set of circumstances, and it is impossible for users to know how pervasive erroneous content takedowns are without increased transparency on Facebook’s part.”

Facebook already has an appeals process, but critics say it doesn’t go far enough

Facebook’s most recent transparency report, which generally focuses on government data requests and copyright takedown notices, included a section on “community standards” moderation. It shows how much content was flagged for violating Facebook’s rules, how much was flagged by AI moderation tools, and how much was ultimately taken down. It does not say how much content was reinstated after appeals.

The letter points to well-known incidents where Facebook made the wrong call and later reversed its decision: blocking posts featuring a historic photo from the Vietnam War, for instance, and a picture of the iconic Little Mermaid statue in Denmark. “Each of these individuals and entities received media attention, were able to reach Facebook staff and, in some cases, receive an apology and have their content restored,” it says. “For most users, content that Facebook removes is rarely restored and some users may be banned from the platform even in the event of an error.”

In a statement to The Verge, Facebook said that “these are very important issues. It’s why we launched an appeals process on Facebook in April, and also published our first transparency report on our effectiveness in removing bad content. We are one of the few companies to do this — and we look forward to doing more in the future.”

Zuckerberg has speculated about some huge changes to how Facebook works, like having an independent organization “almost like a Supreme Court” shape Facebook’s content policies. This letter’s demands aren’t nearly that dramatic — but they would mean creating a more robust system than what Facebook currently has.