Millennials may bear the blame for making $15 avocado toast a thing and killing American cheese, but the proliferation of fake news? Nope, they can thank their boomer parents for that.

Researchers have found that sharing fake news on Facebook during the 2016 election was a “relatively rare activity” — that is, unless you were over 65.

The report, prepared by researchers from New York University’s Social Media and Political Participation Lab and Princeton University, found that on average, users over 65 shared nearly seven times as many articles from fake news domains as the youngest age group did.

“Aside from the overall rarity of the practice, our most robust and consistent finding is that older Americans were more likely to share articles from fake news domains. This relationship holds even when we condition on other factors, such as education, party affiliation, ideological self-placement, and overall posting activity,” the report states.

The study found that 11 percent of people over 65 had shared at least one link to a known fake news website in 2016, compared to just 3 percent of those aged 18-29.

“If seniors are more likely to share fake news than younger people, then there are important implications for how we might design interventions to reduce the spread of fake news,” Andrew Guess, an assistant professor of politics and public affairs at Princeton University, wrote in an emailed statement.

The dissemination of fake news during the 2016 presidential election became headline news after the scale and sophistication of Russia’s disinformation campaign — in particular, the Kremlin’s troll factory — were revealed. Special counsel Robert Mueller has devoted a significant part of his probe to Moscow’s role in the creation and spread of fake news meant to sway the election outcome. In February 2018, Mueller indicted 13 Russians, including an oligarch with close ties to Russian President Vladimir Putin, for their role in tricking Americans into reading and sharing Russian propaganda.

Facebook has already admitted that it “did not do enough” to combat the spread of fake news on its platform during the election, and several other studies have found that it was a primary vector for the spread of fake news in 2016.

Facebook hasn’t helped calm concerned Americans or lawmakers either. The social media giant faced sustained criticism for its continued failure to address the problem of fake news on its platform throughout 2018, particularly during elections and times of crisis. Though the company made a big show of its U.S. elections-focused “War Room” — which has already been shuttered — concerns over the spread of fake news continued in the run-up to the 2018 midterms.

Outside of the U.S., Facebook has failed to prevent outside groups from meddling in Ireland’s abortion referendum, and despite claims it was doing more, it has struggled to contain the spread of disinformation in Myanmar, which has led to violence against Rohingya Muslims.

But in America in 2016, researchers found that sharing fake news on Facebook was still something of a rarity.

“It is important to be clear about how rare this behavior is on social platforms: The vast majority of Facebook users in our data did not share any articles from fake news domains in 2016 at all,” the report states, adding that over 90 percent of respondents shared no stories from fake news publishers.

The study relied on a predefined list of known fake news websites, which the researchers based primarily on a list produced by BuzzFeed.

The study also found that Republicans were far more likely to share fake news than Democrats (18 percent versus 4 percent). But researchers said this might simply be a consequence of the sort of fake news commonly found online. Most fake news produced during the 2016 campaign was pro-Trump or anti-Clinton in orientation, and therefore more likely to be shared by Republicans.

What the research could not ascertain was whether respondents were aware they were sharing links to known fake news domains.

The researchers conducted three surveys between April and November 2016, polling a total of 8,763 Facebook users in the U.S. From that pool, the researchers were able to access the private profile information of about 1,300 users to track what links they shared with friends.