Facebook published its Civil Rights Audit on Sunday, a document designed to track its progress in tackling hate speech, abusive content, and disinformation that has festered on its platform.

The audit revealed Facebook still has a proverbial mountain to climb. Already, civil rights groups are demanding transparency in any effort to remedy the problem.

The audit, carried out by the company’s civil rights ambassador Laura Murphy and a team from the civil rights law firm Relman, Dane & Colfax, focused on four main areas: content moderation, elections, advertising, and the company’s accountability structure. It was the result of months of interviews with more than 90 civil rights organizations, conducted to determine which “key issues to analyze.”

Several of the groups interviewed, including Muslim Advocates and Color of Change, are members of the Change the Terms coalition, which aims to get Facebook to take a stronger stance against extremist content on its platform.


Other groups in the coalition include Free Press, the Lawyers’ Committee for Civil Rights Under Law, the National Hispanic Media Coalition, the Southern Poverty Law Center (SPLC), and the Center for American Progress. (Editor’s note: ThinkProgress is an editorially independent newsroom housed within the Center for American Progress Action Fund.)

One of the more interesting subsections of the audit is its analysis of Facebook’s approach to white nationalism. The company banned white nationalist content in March and has been at pains to point out the changes it made to live-streaming after a far-right extremist used the feature to broadcast a mass shooting in Christchurch, New Zealand, earlier this year.

According to the audit, Facebook’s content moderation doesn’t go far enough.

“The Auditors believe that Facebook’s current white nationalism policy is too narrow because it prohibits only explicit praise, support, or representation of the terms ‘white nationalism’ or ‘white separatism,'” the report read. “The narrow scope of the policy leaves up content that expressly espouses white nationalist ideology without using the term ‘white nationalist.’ As a result, content that would cause the same harm is permitted to remain on the platform.”

The auditors also recommended Facebook further expand its white nationalist policy to ban individuals who currently skirt it by advocating for white nationalism without using the explicitly banned terms.


In a separate statement, Muslim Advocates said that Facebook’s half-hearted attempts to crack down on white nationalist content showed that the company did not take its concerns seriously enough.

“This latest audit update shows that on issues regarding content moderation and the increased threat of white nationalist violence, the company has failed to take meaningful action,” Muslim Advocates said. “The murder of 51 Muslims in Christchurch broadcast all over the world on Facebook made it clear that this is a life and death matter — still, the company has yet to take serious action.”

More broadly, civil rights advocates shared a similar criticism, saying that while some of Facebook’s changes were undoubtedly positive, its slow, incremental approach left significant room for improvement.

Keegan Hankes, SPLC interim research director, described the audit as “too heavy on platitudes and not comprehensive enough,” adding that a comprehensive plan for tackling hate speech on the platform could not be put in place without an “unvarnished look” at Facebook.

“We cannot move forward to protect targeted groups harmed by activity unless we have both an unvarnished look at the cesspools of hate and misinformation growing and spreading on Facebook with the company’s detailed plan for action to be taken on an urgent timeline,” Hankes said. “This update provided the public with neither.”

Hankes’ comments were echoed by Henry Fernandez, a senior fellow at the Center for American Progress and a member of the Change the Terms coalition.


“Facebook remains turtle-slow to change. They need to move now to build a diverse team of experts with real authority to oversee ending hate on their platform to get it moving,” he said. “Relying primarily on monthly meetings of executives and a couple of outside consultants with civil rights expertise is a step forward but insufficient.”

Partially in response to the audit, Facebook Chief Operating Officer Sheryl Sandberg announced Sunday that the company would formalize a civil rights task force so that Facebook could better address content policy, misinformation, and privacy concerns. Facebook also announced that it would introduce civil rights training for its senior leadership.

“We know these are the first steps to developing long-term accountability,” Sandberg said. “We plan on making further changes to build a culture that explicitly protects and promotes civil rights on Facebook.”