On Wednesday evening, Facebook convened a small group of media and tech reporters at its New York headquarters with its head of News Feed. This product—the core stream of posts shared by pages, publishers, and people’s friends—has been heavily scrutinized in the wake of the 2016 election, during which Facebook, like a handful of other tech platforms, was compromised and manipulated by foreign actors. During its meet-and-greet with reporters, billed as “a presentation about our work to prevent the spread of false news,” Facebook screened a nearly 12-minute movie it released in May called “Facing Facts,” a glossily produced sizzle reel directed by Academy Award-winning documentary filmmaker Morgan Neville. In one scene, a Facebook employee draws four quadrants on the board along an x-axis labeled “TRUTH” and a y-axis labeled “INTENT TO MISLEAD,” an ostensible attempt to grapple with the nuances of news, truth, right, and wrong on the Internet. The employee then gestures to the upper-left quadrant, describing news people share on Facebook that’s low in truth but high in its desire to mislead—in other words, “things that were explicitly designed and architected to be viral,” he explains. “These are the hoaxes of the world. These are things like Pizzagate. This is just false news. We have to get this right if we’re going to regain people’s trust.”

In theory, this is a step in the right direction. Attempting to weed out posts about things like Pizzagate, a conspiracy first proffered by the likes of Infowars’ Alex Jones, would seem to align with Facebook’s stated goal of minimizing the impact of stories intended to mislead. But the company’s commitment to this philosophy was thrown into question immediately after the presentation, when Sara Su, a Facebook News Feed product specialist, and John Hegeman, the chief of Facebook’s News Feed, took a question from CNN reporter Oliver Darcy. Darcy asked how Facebook could allow Infowars, which has a Facebook page with more than 900,000 followers, to continue to operate on its platform. In response, Hegeman told Darcy that Facebook doesn’t “take down false news.” “I guess just for being false, that doesn’t violate the community standards,” he went on, adding that Infowars has “not violated something that would result in them being taken down.”

While Infowars’ conspiracy theories “can be really problematic” and “it bugs me too,” said Su, the organization’s page represents a gray area: Facebook is focused on taking down content that “can be proven beyond a doubt to be demonstrably false,” a criterion that, in Facebook’s view, rules out Infowars, whose posts claim that both the Sandy Hook shooting and 9/11 were hoaxes, and, more recently, that Democrats were going to start a civil war on the Fourth of July. “I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice,” Hegeman said. “And different publishers have very different points of view.” In a follow-up statement to Darcy, Facebook again justified the logic behind its Infowars verdict: “We work hard to find the right balance between encouraging free expression and promoting a safe and authentic community, and we believe that down-ranking inauthentic content strikes that balance,” Facebook spokeswoman Lauren Svensson said. “In other words, we allow people to post it as a form of expression, but we’re not going to show it at the top of News Feed.”

Interestingly, this line of reasoning appears totally divorced from one of Facebook’s overarching philosophies. Back in January, C.E.O. Mark Zuckerberg pledged to begin the year with a fresh approach, announcing that he was “changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” Doing so, he said, would allow the company to hew closer to its original goal: “We built Facebook to help people stay connected and bring us closer together with the people that matter to us,” he wrote. “That’s why we’ve always put friends and family at the core of the experience.” This announcement coincided with a change in Facebook’s News Feed to emphasize posts from personal connections. By leaning on its message of community-building—whether through a one-on-one Messenger conversation, a closed Facebook group where people share recipes, or a status update that reaches hundreds or thousands of others—Facebook is arguing that connections on the platform are valuable. It seems counterintuitive, then, that the social-media giant would make the penalty for spreading misinformation a smaller audience, as opposed to no audience at all—doing so implies that a more intimate community is a punitive measure, which contradicts Zuckerberg’s January overtures.