In keeping with the theme of independence, the project leaders created a process for seeking guidance from experts through a dense series of meetings, workshops, and conferences, including simulations of board deliberations. All told, Facebook consulted more than 2,200 people from 88 countries.

Last year Facebook ran a series of 20 workshops, in places like Singapore, Menlo Park, Brazil, Berlin, Nairobi, and New York City, to gather feedback from activists, politicians, nonprofit groups, and even a few journalists. By the time of the New York workshop I attended, Facebook had tentatively drafted a charter and had proposed bylaws that would dictate the group's operations. But in our two-day discussion, everything was up for grabs.

One of the longest discussions involved precedent. Facebook handles millions of appeals every year on its content decisions. The board will handle an infinitesimal slice of those, maybe 25 or 30 in its first year, and Facebook is obliged to respect its decisions only in those individual cases. For instance, in our workshop we simulated a board discussion about a Facebook decision to take down a post in which a female comedian claimed that "all men are scum." Facebook considered it hate speech and took it down, and a public controversy ensued. If the board overruled Facebook, the post would be restored. But restoring a single post wouldn't tackle the underlying problem: Facebook's Community Standards were too inflexible, treating hate speech the same whether it was directed jokingly at a powerful group or harshly at a vulnerable minority.

Ultimately, Facebook came up with a process by which the board can ask, but not force, the company to regard its decisions as precedent for other cases. Board members ruling on a case can ask Facebook to change its Community Standards so that the decision applies more generally. When that happens, Facebook must consider the request but is not obligated to fulfill it. If it doesn't change its rules, it must publicly explain why not.

Will Facebook take those recommendations? According to O'Connell, the Community Standards team will examine any request from the board the same way it routinely considers changes to its rules: forming a committee to study the alternatives, asking for expert opinion, and then deciding based not only on what is right from a human rights perspective but also on what's feasible. "We would take [the recommendations] incredibly seriously," he says. But Facebook won't necessarily implement them. "A lot has to go into really understanding the implications of what might seem like a pretty straightforward decision when you're talking about applying it across billions of posts per day, in the thousands of languages," says O'Connell. "There's real operational translation, data science, testing work that really has to happen." Heather Moore agrees that there will be instances where Facebook rejects the board's recommendations. If those rejections proliferate, people might question whether the company is really committed to oversight.

And if the board considers, and rejects, Facebook's policy in a case involving lying in political ads, Facebook will be under real pressure to make the decision a precedent. "That's precisely what the system is designed to do—place excruciating pressure on us to only stick to our policy if we really are absolutely sure that it's the right one to do," says Clegg. The policy's current justifications could hardly stand if Facebook's own oversight board judged it a violation of human rights and dignity.

Another contentious issue at the workshop I attended involved who should sit on the board. Facebook seemed to think it was … people like us, in the room—well-educated, comfortable technocrats or public policy wonks. You can bet that some of the members will come from human-rights backgrounds. Another imperative was that the board be diverse, both culturally and geographically. After considering alternatives, Facebook concluded that board members should work part-time. They will work remotely, meeting in person at least twice a year. Their identities will be public, though their work on individual cases will be unsigned, to prevent blowback. Because the board will be small, especially before it reaches its full 40 members, individual members could face outsize pressure. If there are only one or two from a given region or culture, are they then charged with representing the millions of people who share those characteristics?