This article is more than 10 months old

The majority of Facebook ads spreading misinformation about vaccines are funded by two organizations run by well-known anti-vaccination activists, a new study in the journal Vaccine has found.

The World Mercury Project chaired by Robert F Kennedy Jr, and Stop Mandatory Vaccinations, a project of campaigner Larry Cook, bought 54% of the anti-vaccine ads shown on the platform during the study period.

“Absolutely we were surprised,” said David Broniatowski, a professor of engineering at George Washington University, one of the authors of the report. “These two individuals were generating the majority of the content.”

Cook uses crowd-funding platforms to raise money for Facebook ads and his personal expenses. The crowd-funding platform GoFundMe banned Cook’s fundraisers in March 2019. YouTube has demonetized Cook’s videos.

Kennedy is the son of the former US attorney general Bobby Kennedy, and he also runs a nonprofit focused on environmental causes. In May, Kennedy’s brother, sister and niece publicly criticized his “dangerous misinformation” about vaccines, calling his work against vaccination “tragically wrong”.

In fact, vaccines are among the safest and most effective medical interventions ever developed.

The Vaccine journal study is the first to analyze anti-vaccine ads in Facebook’s advertising archive. The archive is an ad disclosure database Facebook created after the platform was criticized for spreading untraceable misinformation during the Brexit referendum and 2016 US presidential campaign.

Facebook has more than two billion users and roughly 68% of Americans get their news from the platform, the study said. In 2019, the World Health Organization named vaccine hesitancy as one of the world’s top 10 global health threats.

Facebook’s micro-targeting algorithms, unlike television, radio or newspapers, have allowed anti-vaccine groups to home in on individuals who might be susceptible to doubts about vaccines. Stop Mandatory Vaccinations has particularly targeted women and parents of young children, and Cook was censured by the UK Advertising Standards Authority last year.

“Unless you’re in the target audience you’re not going to see an ad, so it’s hard to know what other organizations might be running,” said Emily Lowther, a spokesperson for the Minnesota Hospitals Association, which has had pro-vaccination ads automatically removed from Facebook. It is unclear why Facebook removed the ads.

“From our organizational perspective, vaccine misinformation causes real harm to individuals and their communities.”

Researchers from George Washington University, Johns Hopkins University and the University of Maryland analyzed more than 500 ads posted between December 2018 and February 2019, when Facebook updated its vaccine-related ad policies. Of the ads, 163 were pro-vaccine, and 145 promoted alleged harms of vaccination.

While the pro-vaccination messages came from 83 unique organizations within healthcare, 54% of anti-vaccine messages came from just two buyers: the organizations led by Kennedy and Cook.

Anti-vaccine ads also tended to reach more people and to have larger budgets. Costing up to $499 per ad, anti-vaccine ads “routinely reached audiences between 5,000 and 50,000 people”. They also often linked to products people could buy, including “natural” remedies, books and seminars.

A typical ad run by Stop Mandatory Vaccinations alleges, “Healthy 14 week old infant gets 8 vaccines and dies within 24 h (sic)”.

Researchers also said new Facebook rules established to promote transparency are actually penalizing pro-vaccination ads by hospitals and healthcare providers.

“Facebook is easy enough to game if you can figure out which combination of words will get flagged,” said Nicholas Marcouiller, a digital strategist with Tunheim, which crafts ads for the Minnesota Hospitals Association. “We don’t adjust our messaging to get past them.”

Marcouiller said most of the hospital association’s ads are automatically flagged as politically sensitive, and that social media workers then have to re-submit them to Facebook for human review, a time-consuming process.

“It isn’t surprising that someone who is thinking critically can craft an ad to run on the system,” said Marcouiller. “It’s a luxury that Minnesota has a strong enough hospital association to take on that role for our state.”

By contrast, anti-vaccine groups are specialists, Broniatowski said. They post dozens of anti-vaccine ads per year, and are well acquainted with Facebook’s new disclosure requirements.

“Although they are spreading misinformation, they are following the letter of the terms,” said Broniatowski. “This is a situation where the letter of the terms is not consistent with the intent of the terms.”

A Facebook company spokesperson said: “We tackle vaccine misinformation on Facebook by reducing its distribution and connecting people with authoritative information from experts on the topic. We partner with leading public health organizations, such as the World Health Organization, which has publicly identified vaccine hoaxes – if these hoaxes appear on Facebook, we will take action against them – including rejecting ads.”