Depressing. From Maria Konnikova at the New Yorker:

[In April,] Brendan Nyhan, a professor of political science at Dartmouth, published the results of a study that he and a team of pediatricians and political scientists had been working on for three years. They had followed a group of almost two thousand parents, all of whom had at least one child under the age of seventeen, to test a simple relationship: Could various pro-vaccination campaigns change parental attitudes toward vaccines? Each household received one of four messages: a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism; a leaflet from the Vaccine Information Statement on the dangers of the diseases that the M.M.R. vaccine prevents; photographs of children who had suffered from the diseases; and a dramatic story from the Centers for Disease Control and Prevention about an infant who almost died of measles. A control group did not receive any information at all. The goal was to test whether facts, science, emotions, or stories could make people change their minds.

The result was dramatic: a whole lot of nothing. None of the interventions worked. The first leaflet — focussed on a lack of evidence connecting vaccines and autism — seemed to reduce misperceptions about the link, but it did nothing to affect intentions to vaccinate. It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines.

I’d wager that most thoughtful people would like to think they’re free from this effect. You and I most likely assume that new facts at odds with our beliefs would change our minds. Nyhan is not so sure. He’s seen blind resistance to facts on both sides of the political aisle — people on the right who are convinced that global warming is a hoax, no matter what the data; and people on the left who campaign against GMOs with silly arguments unsupported by science.

Plus, there’s the Australian experiment.

Students Down Under were given a story about a liquor store robbery in which an aboriginal man was fingered as a suspect; subsequently, they were given a retraction saying that the suspect wasn’t aboriginal after all. The ones who, in a separate test, had scored higher for racist attitudes,

… still relied on race in their inference responses, saying [despite having read the retraction] that the attackers were likely aboriginal or that the store owner likely had trouble understanding them because they were aboriginal.

But the effect worked both ways. The researchers repeated the experiment with a story in which an aboriginal man was the hero who disarmed the robber. Then they retracted the info about the samaritan’s race. Guess what?

Students who had scored lowest in racial prejudice … persisted in their reliance on false information, in spite of any attempt at correction. In their subsequent recollections, they mentioned race more frequently, and incorrectly, even though they knew that piece of information had been retracted.

False beliefs, says Konnikova,

… have little to do with one’s stated political affiliations and far more to do with self-identity: What kind of person am I, and what kind of person do I want to be? All ideologies are similarly affected.

The first step to overcoming bias is realizing how susceptible we ourselves are to it. Given these research results, that’s a herculean task even for the smartest brains.

