Last week, Facebook convened a small group of tech and media reporters at its New York headquarters, plied them with shrimp cocktail, and showed them a 12-minute video about the company’s fight against misinformation. The goal of the meeting was to assure journalists, who have shown far less deference toward the massive tech company over the past 18 months, that the misinformation that ran rampant on Facebook during the 2016 election was now under control. Instead, one simple question from a reporter—why does Facebook allow the conspiracy theory–peddling Web site Infowars on its platform?—threw the whole gathering into disarray. The meeting leaked into the next day’s news cycle as pundits and reporters questioned whether Facebook could claim to have solved its fake-news crisis while allowing known conspiracy theorists to remain on its platform. To Facebook, at least, the answer was still “yes.” On Wednesday, C.E.O. Mark Zuckerberg gave us some of the most revealing insights yet as to why.

In a wide-ranging interview with Kara Swisher, Zuckerberg attempted to make the case for why Facebook allows outlets like Infowars to publish on its multi-billion-person platform, yet can still claim to be winning the war on fake news. By way of example, Zuckerberg brought up Holocaust deniers. The tech founder, who is Jewish, told Swisher that while he finds Holocaust deniers to be “deeply offensive . . . at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong.” Swisher explained that, actually, Holocaust deniers likely are attempting to mislead people. But Zuckerberg demurred, shying away from the responsibility of divining the intent of publishers who distribute conspiracy theory–minded content:

It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, “We’re going to take someone off the platform if they get things wrong, even multiple times.”

As the dust around the interview settled, Zuckerberg seemed to realize he’d made a monumental mistake. In a follow-up note to Swisher, he clarified his stance. “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that,” he wrote. “Our goal with fake news is not to prevent anyone from saying something untrue—but to stop fake news and misinformation spreading across our services. If something is spreading and is rated false by fact checkers, it would lose the vast majority of its distribution in News Feed. And of course if a post crossed the line into advocating for violence or hate against a particular group, it would be removed.” Then, Facebook issued yet another policy clarification, this time regarding content intended to incite violence. Posts that directly and clearly call for violence have long been banned from the platform, but in light of Facebook posts leading to deadly physical violence in places like Myanmar, Sri Lanka, and India, The New York Times reported that Facebook will remove content that inadvertently leads to violence, too. Per the Times, the company will rely on reports from local public policy employees when deciding whether a post should be deleted.

If Zuckerberg’s initial comments show he’s fundamentally unwilling to grapple with the question of intent, Facebook’s clarification about violence-inducing posts is all the more confusing. It says, in short, that the company is unwilling to be an arbiter of truth, unless the falsehood in question begets violence. Yet even the line between fake posts and posts that lead to violence is arbitrary and constantly in flux. Take Pizzagate, a conspiracy theory popular on Infowars that baselessly claimed Hillary Clinton was running a child-sex-trafficking operation out of the basement of Comet Ping Pong, a D.C. pizza joint. In late 2016, a North Carolina man showed up at Comet with an AR-15 rifle—he later told authorities that he believed a confrontation would sacrifice “the lives of a few for the lives of many.” It’s difficult—and in some cases impossible—to predict whether a post on the Internet will move people to violence. Once again, Zuckerberg has set a standard that raises more questions than it answers.