After a day of criticism over his company’s role in spreading fake news about political candidates, Facebook CEO Mark Zuckerberg rejected the idea that the News Feed had tilted the election in favor of Donald Trump. “Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea. Voters make decisions based on their lived experience.”

Zuckerberg was speaking at the Techonomy conference, where interviewer David Kirkpatrick pressed him on Facebook’s growing power as a distributor of news and information. He said people who reacted with shock to Trump’s victory had underestimated his support. “I do think there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is they saw some fake news,” Zuckerberg said. “If you believe that, then I don’t think you have internalized the message the Trump supporters are trying to send in this election.”

“There is a profound lack of empathy in asserting the only reason someone could have voted the way they did is they saw some fake news.”

Zuckerberg suggested that one reason fake news could not have influenced the election is that false and inaccurate articles were posted about Trump and Hillary Clinton. “Why would you think there would be fake news on one side and not the other?” he said. But a BuzzFeed investigation earlier this year found that the top right-wing Facebook news outlets published false or misleading stories 38 percent of the time, compared to 20 percent for top left-wing outlets.

Kirkpatrick also pressed Zuckerberg on whether Facebook creates a so-called “filter bubble” — an echo chamber where Clinton supporters only see views from fellow Clinton supporters, and Trump supporters only see views from fellow Trump supporters. “All the research we have suggests that this isn’t really a problem,” he said. Zuckerberg cited a study of 10.1 million politically affiliated Facebook users that the company published in Science last year. It found that liberals and conservatives see about 1 percent less news from the opposing side than they would if Facebook didn’t tweak the News Feed.

One hard truth that did emerge from the study is that people are simply less likely to click on articles that do not reinforce their previously held beliefs. “I think we would be surprised by how many things that don’t conform to our worldview, we just tune out,” Zuckerberg said. “I don’t know what to do about that.”

“I want what we do to have a good impact on the world.”

Still, the study had several flaws, as noted by researcher Zeynep Tufekci and others. Among them: the study was conducted not on a random sample of users, but instead “a small, skewed subset of Facebook users who chose to self-identify their political affiliation on Facebook and regularly log on to Facebook, about ~4% of the population available for the study.” Another: because Facebook’s data is inherently private, it is difficult if not impossible for researchers to do additional, independent work on the problem. (Still, the Wall Street Journal recently found significant differences in the News Feed content distributed to liberals and conservatives.)

Zuckerberg said he is deeply concerned about how Facebook could affect democracy, and said there were unspecified things the company could do better in the future to improve the way it distributes news. “I really care about this,” he said. “I want what we do to have a good impact on the world. I want people to have a diversity of information.”