Facebook has published new documents detailing the company’s plans for a content oversight board, which would serve as a kind of Supreme Court for the platform. In a call with reporters on Tuesday, Facebook said that it hopes to have the board fully staffed by the end of this year, and gave more details on how the board will operate and be governed.

Last November, Mark Zuckerberg penned a blog post laying out his plan to create what amounts to a Supreme Court for Facebook. Once it’s fully staffed, the body will adjudicate appeals from users whose content has been removed from Facebook’s platforms, and will also rule on cases referred to it by the company itself.

“Facebook doesn’t have the ultimate power over their expression”

“The board will be an advocate for our community — supporting people’s right to free expression, and making sure we fulfill our responsibility to keep people safe,” Zuckerberg wrote in a blog post. “As an independent organization, we hope it gives people confidence that their views will be heard, and that Facebook doesn’t have the ultimate power over their expression.”

Facebook has pledged that the oversight body will be operational by November of next year. On Tuesday, the company laid out how members will be chosen and how they will influence moderation across its platforms.

According to the charter Facebook released Tuesday, the oversight board will begin with at least 11 members and is “likely to be 40 members” when fully staffed. Each board member will serve no more than nine years, divided into three-year terms. The positions will be part-time, although the board will be served by a full-time staff to review submissions and conduct research. Members’ names and moderation decisions will also be made available in a public online database.

The board may also be split into smaller panels to focus on specific types of cases. Board members may remain anonymous if their safety is a concern in a particular case.

Facebook says that it hopes to fill the oversight board with people from a variety of different backgrounds. “There’s going to be a set of people who serve on this board who make different people within that group uncomfortable,” Facebook’s director of governance and global affairs Brent Harris told reporters. “We believe that in building the board and constituting the board and truly representing diversity in the composition of this institution, that this is actually going to be a feature.”

“This is actually going to be a feature”

The charter specifies a range of requirements for oversight board members, akin to the requirements for a corporate board. “Members must not have actual or perceived conflicts of interest that could compromise their independent judgment and decision-making,” the document reads. They must also have demonstrated “familiarity with matters relating to digital content and governance, including free expression, civic discourse, safety, privacy and technology.”

Once the board is in place, content cases will be submitted both by Facebook users and the company itself, with the board given the final say on which cases to hear. “In its selection, the board will seek to consider cases that have the greatest potential to guide future decisions and policies,” the charter says.

To submit a complaint, a user must have first exhausted all appeals in the moderation system Facebook already has in place. The charter envisions users submitting written statements in order to argue their specific moderation cases. Further decisions, like whether users could testify in person, are still up for consideration by the future board.

“The board’s decision will be binding, even if I or anyone at Facebook disagrees with it,” Zuckerberg wrote in Tuesday’s blog post. “The board will use our values to inform its decisions and explain its reasoning openly and in a way that protects people’s privacy.”

Although the board’s decisions will be made public, the details of that disclosure are still unclear. Facebook said it has not yet determined how to balance the privacy of users whose complaints the board hears against the transparency of the board’s decisions.