Mark Zuckerberg has changed his tone about Russian interference dramatically over the last year and a half, acknowledging in recent months that Facebook needs to do better, and launching new products to deal with election meddling. But he wasn't entirely wrong in the immediate wake of the election, when he said that “Voters make decisions based on their lived experience.” On Friday, Robert Mueller’s team filed a 37-page indictment against more than a dozen Russian operatives, in the most significant action yet taken by the U.S. government to hold Russia accountable for its disinformation campaign. The indictment reveals new details of the Kremlin’s strategy, laying out the tactics and language that were used to dupe Americans. But it also illustrates how Zuckerberg’s frustration was partly justified. The big unanswered question of the Russia affair is whether Russian intelligence operatives really changed hearts and minds by hijacking Facebook, or simply fueled existing partisan tensions. What we know so far suggests the latter.

It’s impossible to know whether Russia actually tipped the election, which was decided by several tens of thousands of votes in a handful of swing states. What we do know is that as much as Facebook was exploited by hostile actors thanks to its lack of safeguards, it was Americans who made themselves vulnerable to manipulation in the first place. When Russian agents created dueling 2016 Facebook groups and called on their followers to protest each other in the streets, real Americans turned up to yell at each other outside an Islamic center in Texas. Even the Russians involved were apparently surprised by how easily their targets took the bait. “I created all these pictures and posts, and the Americans believed that it was written by their people,” one of the defendants named in the indictment allegedly gloated in an e-mail to a family member.

Social media certainly facilitated the Russian campaign. As part of Facebook’s charm offensive, Zuckerberg has since offered tangible fixes, including a plan to verify election advertisements and an effort to emphasize friends, family, and Groups. But Americans’ lack of news literacy transcends Facebook, and was created in part by the Internet itself. As news has shifted from print and television outlets to digital versions of those same outlets to information shared on social-media platforms (still the primary source of news for an overwhelming majority of Americans), audiences failed to keep pace; they never learned to vet the news they consume online.

It’s also a problem we’ve created ourselves. As we’ve become increasingly polarized, news outlets have correspondingly adjusted to cater to our tastes, resulting in a media landscape that’s split into separate, non-overlapping universes of conflicting facts—a world in which Fox News and CNN spout theories about the school shooting in Parkland, Florida, that are diametrically opposed. It was this atmosphere that made the U.S. fertile ground for foreign manipulation. As political scientists Jay J. Van Bavel and Andrea Pereira noted in a recent paper, “Partisanship can even alter memory, implicit evaluation, and even perceptual judgment,” fueling a “human attraction to fake and untrustworthy news” that “poses a serious problem for healthy democratic functioning.”

Zuckerberg’s new safeguards fall victim to the same fallacy: they rely on humans, and humans have an uncanny habit, deliberately or not, of figuring out how to game the system. As BuzzFeed’s Katie Notopoulos found when she accidentally made an annoying video take up a semi-permanent position on her friends’ Facebook feeds, social-media networks are still designed to amplify viral content—no matter what kind of response that content is generating. Facebook can ensure that advertisers aren’t buying ad space with rubles, but it’s almost impossible to prevent the most controversial content from rising to the top. The more divisive something is, the more we engage.

Part of that responsibility does indeed lie with Facebook—according to digital monitoring site Parse.ly, the platform was exceptionally prone to meddling thanks to its algorithm, which favored more emotional content. But part of it lies with us. Americans have proven only too willing to be ruled by emotion, and to amplify whatever content makes us angriest. That seems unlikely to change before the next election cycle.