Facebook CEO Mark Zuckerberg on Thursday defended the social network’s role in the U.S. presidential election. False news stories that were shared hundreds of thousands of times on the network, including claims that the Pope had endorsed Donald Trump and that Hillary Clinton would be arrested on charges related to her private email server, “surely had no impact” on the election, he said, speaking at the Techonomy conference.

“Voters make decisions based on their lived experience,” Zuckerberg went on. The notion that fake news stories on Facebook “influenced the election in any way,” he added, “is a pretty crazy idea.”

In an extended on-stage interview with David Kirkpatrick, author of The Facebook Effect, Zuckerberg noted that fabricated stories made up a small fraction of all the content shared on Facebook. And he suggested that the criticism Facebook has received for fueling such falsehoods was rooted in condescension on the part of people who failed to understand Donald Trump’s appeal. “I think there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw fake news,” Zuckerberg said. “If you believe that, then I don’t think you internalized the message that Trump voters are trying to send in this election.”

Here Kirkpatrick broke in to ask Zuckerberg what that message was. Zuckerberg demurred, suggesting he’d return to that question after he’d finished his thought. He did not.

Zuckerberg suggested that the clincher to his argument was that, to the extent fake news was shared, it must have been shared by Clinton supporters as well as those who backed Trump. “Why would you think there would be fake news on one side and not the other?”

In fact, fake news was shared by both sides, but a BuzzFeed analysis of 1,137 posts by six significant “hyperpartisan” news sources—three conservative and three liberal—found that mostly or partly false stories on the right outnumbered those on the left by a ratio of two to one. BuzzFeed separately reported on a cottage fake-news industry that had sprung up in Macedonia largely around pro-Trump and anti-Clinton content. People who produced the bogus stories said they had tried pro-Clinton content but found that it was less likely to go viral.

Zuckerberg also took a question about whether Facebook might be contributing to the country’s political division by insulating its users in “filter bubbles”—communities of like-minded people who reinforce one another’s biases rather than challenging them. There, too, Zuckerberg found the criticism misplaced. “All the research we have suggests that this isn’t really a problem,” he said. “For whatever reason, we’ve had a really hard time getting that out.” He cited a Facebook-funded 2015 study that concluded that while Facebook’s news feed does tend to show people information that supports their political views, their own choices about what to read play a greater role. That study was itself criticized by some for soft-pedaling its findings; social media researcher and writer Zeynep Tufekci rebutted it in some depth.

Zuckerberg noted that Facebook takes fake news and hoaxes seriously and provides users with tools to report them. Despite his view that they played no role in the election, he said Facebook would continue working to address the problem. He also said Facebook will continue to explore ways to expose users to a diversity of views in their news feeds.