Jillian C. York is the EFF's Director for International Freedom of Expression

In a Facebook Live address this afternoon, Mark Zuckerberg shared his views on freedom of expression in our current era. The speech, which addressed the evolution of his views over time, the role of the internet in political discourse, and the role of Facebook in upholding freedom of expression, was 23 minutes of contradictions, unsubstantiated postulations, and a CliffsNotes version of free speech history.

The speech comes at a time when Facebook is under pressure from countless actors, from foreign governments to civil and human rights groups in the United States and abroad, to impose greater restrictions on hateful speech, misinformation, and terrorism on the platform. Such pressure isn’t new, but it certainly is mounting. And until today, Facebook appeared to be cracking under the weight. Given that context, Zuckerberg’s choice to take a stand for free expression feels significant.

“Giving more people a voice gives power to the powerless and it pushes society to get better over time,” Zuckerberg said.

But, to free expression advocates like me, Zuckerberg’s speech feels like empty words in the absence of any concrete changes to the company’s questionable policies on speech. Just this month, the company announced controversial exceptions to its fact-checking policies and prohibition on hate speech for politicians, effectively creating a separate and higher tier for those whose words have more power to harm than those of ordinary citizens. Facebook’s VP of Global Affairs and Communications Nick Clegg—himself a former politician—stated that he didn’t believe it would be “acceptable to society at large to have a private company … become a self-appointed referee for everything that politicians say.”

In asserting a fresh stance on free expression, Zuckerberg might have, for instance, reconsidered Facebook’s long-criticized “authentic name policy” that puts users around the world at risk of harm, but which the company insists allows for greater civility, despite ample evidence to the contrary. He could have listened more closely to the women and non-binary users, as well as the artist communities of Facebook who have protested the company’s ban on “female nipples” as discriminatory and outdated (in his speech, he called pornography “harmful” but said nothing about nudity). Zuckerberg might have reconsidered the company’s ever-expanding use of AI to adjudicate hate speech, given its clearly negative impact on LGBTQ users. Or, when he was speaking pridefully about how the “Black Lives Matter” hashtag was first mentioned on Facebook, he might have also acknowledged his company’s role in silencing important speech related to the movement.

Facebook has the right to moderate speech as it sees fit, but human rights advocates argue that it does so in a manner that is both inconsistent and opaque. In 2018, a group of academics and free expression advocates (myself included) drafted the Santa Clara Principles on Transparency and Accountability in Content Moderation—a concrete attempt at pushing companies, including Facebook, to provide greater insight to users and the public. The principles demand that companies publish comprehensive transparency reports, provide notification to users when their content is removed, and offer a robust appeals process.

In an open letter to Zuckerberg, the authors—as well as more than a hundred organizations from all over the world—implored Facebook to implement the principles. The company responded somewhat favorably, expanding its appeals process and transparency reporting and initiating conversations with some of the groups involved. Notably absent from the conversation? Zuckerberg. In fact, while he’s reportedly had numerous private dinners with conservative figures with unfounded complaints about the platform’s political biases, there have been no reports of similar meals shared with free expression experts.

Zuckerberg’s speech will surely anger those who seek to push Facebook to censor more speech, but overall—despite his sophomoric understanding of his own company’s role as arbiter—his positions were fairly measured. He clarified that Facebook has chosen to stay out of the Chinese market, reaffirmed the importance of free expression in political ads, and spoke of measuring true harm in attempts to define dangerous speech.

Still, Zuckerberg must decide whether Facebook is willing to walk the walk he set out today in his rhetoric. “I believe we have two responsibilities: to remove content when it could cause real danger as effectively as we can, and to fight to uphold as wide a definition of freedom of expression as possible,” he proclaimed, “and not allow the definition of what is considered dangerous to expand beyond what is absolutely necessary … [that’s] what I’m committed to.”