Vaccine denial is dangerous. We know this for many reasons, but just consider one of them: In California in 2010, 10 children died in a whooping cough outbreak that was later linked, in part, to the presence of 39 separate clusters of unvaccinated children in the state. It’s that simple: When too many children go unvaccinated, vaccine-preventable diseases spread more easily, and sometimes children die. Nonetheless, as scientifically unfounded fears about childhood vaccines causing autism have proliferated over the past decade or more, a minority of parents are turning to “personal belief exemptions,” so-called “alternative vaccine schedules,” and other ways to dodge or delay vaccinating their kids.

So as a rational person, you might think it would be of the utmost importance to try to talk some sense into these people. But there’s a problem: According to a major new study in the journal Pediatrics, trying to do so may actually make the problem worse. The paper tested the effectiveness of four separate pro-vaccine messages, three of which were based very closely on how the Centers for Disease Control and Prevention (CDC) itself talks about vaccines. The results can only be called grim: Not a single one of the messages was successful when it came to increasing parents’ professed intent to vaccinate their children. And in several cases the messages actually backfired, either increasing the ill-founded belief that vaccines cause autism or even, in one case, apparently reducing parents’ intent to vaccinate.

The study, by political scientist Brendan Nyhan of Dartmouth College* and three colleagues, adds to a large body of frustrating research on how hard it is to correct false information and get people to accept indisputable facts. Nyhan and one of his coauthors, Jason Reifler of the University of Exeter in the United Kingdom, are actually the coauthors of a much discussed previous study showing that when politically conservative test subjects read a fake newspaper article containing a quotation of George W. Bush asserting that Iraq had weapons of mass destruction, followed by a factual correction stating that this was not actually true, they believed Bush’s falsehood more strongly afterwards—an outcome that Nyhan and Reifler dubbed a “backfire effect.”

Unfortunately, the vaccine issue is prime terrain for such biased and motivated reasoning; recent research even suggests that a conspiratorial, paranoid mindset prevails among some vaccine rejectionists. To try to figure out how to persuade them, in the new study researchers surveyed a representative sample of 1,759 Americans with at least one child living in their home. A first phase of the study determined their beliefs about vaccines; then, in a follow-up, respondents were asked to consider one of four messages (or a control message) about vaccine effectiveness and the importance of kids getting the MMR (measles, mumps, rubella) vaccine.

The first message, dubbed “Autism correction,” was a factual, science-heavy correction of false claims that the MMR vaccine causes autism, assuring parents that the vaccine is “safe and effective” and citing multiple studies that disprove claims of an autism link. The second message, dubbed “Disease risks,” simply listed the many risks of contracting measles, mumps, or rubella, describing the nasty complications that can come with these diseases. The third message, dubbed “Disease narrative,” told a “true story” about a 10-month-old whose temperature shot up to a terrifying 106 degrees after he contracted measles from another child in a pediatrician’s waiting room.

All three of these messages are closely based on messages (here, here, and here) that appear on the CDC website. And then there was a final message that was not directly based on CDC communications, dubbed “Disease images.” In this case, as a way of emphasizing the importance of vaccines, test subjects were asked to examine three fairly disturbing images of children afflicted with measles, mumps, and rubella.

The results showed that by far the least successful messages were “Disease narrative” and “Disease images.” Hearing the frightening narrative actually increased respondents’ likelihood of thinking that getting the MMR vaccine will cause serious side effects, from 7.7 percent to 13.8 percent. Similarly, looking at the disturbing images increased test subjects’ belief that vaccines cause autism. In other words, both of these messages backfired.

Why did that happen? Dartmouth’s Nyhan isn’t sure, but he comments that “if people read about or see sick children, it may be easier to imagine other kinds of health risks to children, including possibly side effects of vaccines that are actually quite rare.” (When it comes to side effects, Nyhan is referring not to autism but to the small minority of cases in which vaccines cause adverse reactions.)

The two more straightforward text-only messages, “Autism correction” and “Disease risks,” had more mixed effects. “Disease risks” didn’t cause any harm, but it didn’t really produce any benefits either.

As for “Autism correction,” it actually worked, among survey respondents as a whole, to somewhat reduce belief in the falsehood that vaccines cause autism. But at the same time, the message had an unexpected negative effect, decreasing the percentage of parents saying that they would be likely to vaccinate their children.

Looking more closely, the researchers found that this occurred because of a strong backfire effect among the minority of test subjects who were the most distrustful of vaccines. In this group, the likelihood of saying they would give their kids the MMR vaccine decreased to 45 percent (versus 70 percent in the control group) after they received factual, scientific information debunking the vaccines-autism link. Indeed, the study therefore concluded that “no intervention increased intent to vaccinate among parents who are the least favorable toward vaccines.”

Nyhan carefully emphasizes that the study cannot say anything about the effectiveness of other possible messages beyond the ones that were tested. So there may be winners out there that simply weren’t in the experiment—although as Nyhan added, “I don’t have a good candidate.” In any event, given results like these, any new messages ought to be tested as well.

“I don’t think our results imply that they shouldn’t communicate why vaccines are a good idea,” adds Nyhan. “But they do suggest that we should be more careful to test the messages that we use, and to question the intuition that countering misinformation is likely to be the most effective strategy.”

Finally, Nyhan adds that in order to protect public health by encouraging widespread vaccinations, public communication efforts aren’t the only tools at our disposal. “Other policy measures might be more effective,” he notes. For instance, recently we reported on how easy it is for parents to dodge getting their kids vaccinated in some states; in some cases, it requires little more than a onetime signature on a form. Tightening these policies might be considerably more helpful than trying to win hearts and minds. That wasn’t really working out anyway, and thanks to the new study, we now know that vaccine deniers’ imperviousness to facts may be a key part of the reason why.

* This article previously referred to Dartmouth College as Dartmouth University. We regret the error.