Here is a PDF (Schwarz et al.) that discusses attempts to improve decision-making – and the frequent failures of these attempts.

Ironically, the more people try to consider the opposite, the more they often convince themselves that their initial judgment was right […] Similar surprises arise in the domain of public information campaigns. Presumably, erroneous beliefs can be dispelled by confronting them with contradictory evidence. Yet attempts to do so often increase later acceptance of the erroneous beliefs, as known since Allport and Lepkin’s pioneering research (1945) into rumor transmission. Again, the unintended effect arises because the educational strategy focuses solely on information content and ignores the metacognitive experiences that are part and parcel of the reasoning process.

The key point of the quote – that attempts to dispel erroneous beliefs by confronting them with contradictory evidence often increase later acceptance of those beliefs – has implications for sceptics wishing to debunk myths, including those about climate change, HIV/Aids, vaccination and other important topics.

On page 20 of the PDF, the authors state that:

Any attempt to explicitly discredit false information necessarily involves a repetition of the false information, which may contribute to its later familiarity and acceptance. Although this problem has been known since Allport and Lepkin’s research (1945) into wartime rumors, the idea that false information needs to be confronted is so appealing that it is still at the heart of many information campaigns.

The authors then discuss “Spreading Myths by Debunking Them”, using the example of a flyer published by the Centers for Disease Control (CDC). It illustrates a common format of information campaigns that counter misleading information by confronting “myths” with “facts.” This format is also used by sceptical bloggers. The myths and facts, in the case of the CDC flyer, relate to flu vaccines.

Skurnik, Yoon, and Schwarz (2007) gave participants the CDC’s “Facts & Myths” flyer or a parallel “Facts” version that presented only the facts. Here is the bit I find most interesting:

Right after reading the flyer, participants had good memory for the presented information and made only a few random errors, identifying 4% of the myths as true and 3% of the facts as false. Thirty minutes later, however, their judgments showed a systematic error pattern: They now misidentified 15% of the myths as true, whereas their misidentification of facts as false remained at 2%.

It turns out that familiar statements are more likely to be accepted as true: “This familiarity bias results in a higher rate of erroneous judgments when the statement is false rather than true, as observed in the present study.” The attempt to debunk myths actually facilitates their acceptance after a delay – one of only 30 minutes in the study by Skurnik, Yoon, and Schwarz.

So: if fisking misleading articles, debunking myths, and tackling misinformation involve repeating the myths, then sceptics (or, if you prefer, “skeptics”) may actually be making the situation worse and helping to propagate the myths and misinformation in question.

What to do? Well, we could make the facts familiar to people – and make the familiarity bias work in our favour. If you have factual information that would help counter misinformation, then the best tactic may be to state the information, and to repeat it sufficiently often for the true statement(s) to ‘enter into the public consciousness’.

Schwarz et al. consider the possibility that “memorable slogans that link the myth and fact [...] may provide a promising avenue”, but they do not seem to recommend this course of action:

In most cases, however, it will be safer to refrain from any reiteration of the myths and to focus solely on the facts. The more the facts become familiar and fluent, the more likely it is that they will be accepted as true and serve as the basis of people’s judgments and intentions.

I recently wrote about squalene, an ingredient in some vaccines (including the swine flu / H1N1 vaccine). The post concerned myths and misinformation regarding squalene, and in order to write it, I had to make reference to the misinformation. Even in pointing out that there was a spurious link between squalene and a certain syndrome, I may have been helping to perpetuate the myth. Perhaps rather than mentioning the link, I should have simply written:

Squalene is a component of some adjuvants that is added to vaccines to enhance the immune response. A naturally occurring substance found in plants, animals and humans, squalene is synthesized in the liver and circulates in the human bloodstream. It is also found in a variety of foods, cosmetics, over-the-counter medications and health supplements.

Tackling misinformation without inadvertently helping to spread it may be tricky, but I think it is something that is worth attempting. If you have any bright ideas, then please feel free to post them in the comments section below.