confirmation bias

"It is the peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than by negatives." --Francis Bacon (True, as long as the affirmatives support your beliefs about anything but yourself or people you don't like, and the negatives oppose your beliefs about anything but yourself or people you don't like. When it comes to the self or to people we don't like, we seem to be much more affected by negative views than by positive views. See the entry on negativity bias.)

Confirmation bias refers to a type of selective thinking whereby one tends to notice and to look for what confirms one's beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one's beliefs. For example, if you believe that during a full moon there is an increase in admissions to the emergency room where you work, you will take notice of admissions during a full moon but be inattentive to the moon when admissions occur during other nights of the month. A tendency to do this over time unjustifiably strengthens your belief in the relationship between the full moon and admissions, and in lunar effects generally.
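The full-moon example can be put in terms of a 2x2 contingency table: judging whether full moons and busy nights go together requires all four combinations, while the biased observer attends to only one cell. A minimal sketch with invented counts (all numbers are hypothetical, chosen for illustration):

```python
# Hypothetical counts of nights over several years:
#                     busy ER   quiet ER
# full-moon night        20        40
# other night           200       400

busy_full, quiet_full = 20, 40
busy_other, quiet_other = 200, 400

# The biased observer notices only the 20 busy full-moon nights.
# The relevant comparison is the rate of busy nights in each group:
rate_full = busy_full / (busy_full + quiet_full)
rate_other = busy_other / (busy_other + quiet_other)

print(rate_full, rate_other)  # both 1/3: no association in these data
```

With these made-up numbers the busy-night rate is identical on full-moon and other nights, yet someone counting only the first cell accumulates twenty vivid "confirmations."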

This tendency to give more attention and weight to data that support our beliefs than we do to contrary data is especially pernicious when our beliefs are little more than prejudices. If our beliefs are firmly established on solid evidence and valid confirmatory experiments, the tendency to give more attention and weight to data that fit with our beliefs should not lead us astray as a rule. Of course, if we become blinded to evidence truly refuting a favored hypothesis, we have crossed the line from reasonableness to closed-mindedness.

One need dig no deeper than confirmation bias when looking for an explanation of why so many intelligent people believe that the positions of stars, planets, the sun, and the moon affect or determine such things as personality traits or personal fates. Astrology is just one example of a belief system that is easy to confirm with data. The problem is that this and similarly grounded belief systems are not tested by trying to falsify their basic claims--the way of science--but rather by making extraordinary efforts to confirm those claims. Confirmation bias becomes most obvious when one looks at how seemingly negative data are dealt with. Some explanation is always given to justify rejecting the negative data; often this takes the form of introducing further astrological alignments that are said to nullify the apparent falsification. Astrology is not the only field in which this happens. Criminal profiling is another area where confirmation bias has led many intelligent law enforcement professionals to believe in the talents of profilers, a belief based on little more than retroactive validation of vague or ambiguous claims made by the profiler. Profilers, in fact, resemble psychics, who rely on the same sort of retroactive validation of vague, ambiguous, or obscure "predictions."

Two other belief systems put forth as scientific on the basis of large bodies of evidence characterized mainly by confirmation bias are phrenology and eugenics. The main tenet of phrenology is that the structure of the skull reveals a person's character and mental capacity. Phrenology has been thoroughly discredited and is now recognized as having no scientific merit, yet it was advocated by many scientists and medical professionals in 19th-century Europe and America. Believers in phrenology had no trouble finding cases that fit their beliefs, but nobody, it seems, attempted to test the belief by trying to falsify it, as good science requires.

Of course, using vague terms as keys to validating a theory makes child's play of confirming one's biases. Phrenology used terms such as 'benevolence' and 'self-esteem,' whereas eugenics used terms such as 'feebleminded,' 'idiot,' 'moron,' 'imbecile,' 'inferior blood,' 'defective strains,' and 'unfit.' Such terms gave defenders of eugenics--the sterilization of those deemed 'unfit' and the encouragement of breeding among those deemed 'fit' or 'superior'--a blank check to designate whomever they pleased as worthy or unworthy of breeding and to count each designation as evidence of the rightness of their beliefs. For more on the popularity and application of eugenics in the United States before it became part of Hitler's attempt to eliminate inferior people (as defined by him) and to promote the absurd idea of the Aryan thoroughbred, see the chapter on eugenics in Siddhartha Mukherjee's The Gene: An Intimate History (2016). See also Yuval Noah Harari's Sapiens: A Brief History of Humankind, p. 231ff. (2016).

Confirmation bias seems especially pernicious when it comes to causal studies. Authors such as Malcolm Gladwell have turned confirmation bias into a successful formula for writing best sellers. The trick is to make a claim that something is a necessary condition for something else (A is necessary for B to occur) and then back it up with dozens of entertaining and colorful anecdotes where A happened and then B happened. The problem with this kind of thinking is that it makes no effort to discover cases where B happened but A didn't occur, or where A happened and B didn't occur. It may be true, as Gladwell has shown, that intuitive thinking is often correct, that successful people put in 10,000 hours of hard work, or that many successful people happened to be in the right place at the right time, but that doesn't mean that there is an essential connection between any of these things. What about the many cases where intuition was wrong? What about the many cases where success came to people who put in hardly any work, or where failure came to people who put in their 10,000 hours? What about the many people who were in the right place at the right time but still failed to achieve success? (Since 'being in the right place at the right time' is so ambiguous, I won't bother to ask about those who were in the right place at the right time but didn't recognize it and failed.)

There have been many studies claiming, or implying by their narrative, that a causal connection exists only because they found that x followed y. It may be true that one successful company (Malcolm Gladwell in The Tipping Point) or many successful companies (Jim Collins in Good to Great) followed a similar pattern. But without comparing other companies that either had x but didn't produce y, or produced y but didn't have x, we have no idea whether we're dealing with a causal event or a coincidence. Ignoring, or not even trying to find, cases that don't fit the pattern makes one's case look far stronger than it really is.
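The point about anecdote-driven causal claims can also be sketched as a 2x2 table: a claim that A is necessary for B cannot be supported by collecting A-then-B stories; you need all four combinations of A and B. A minimal illustration with made-up numbers (the counts, and the use of the 10,000-hours example, are purely hypothetical):

```python
# Hypothetical counts of people, cross-classified by whether they put in
# their 10,000 hours (A) and whether they succeeded (B).
counts = {
    ("hours", "success"): 30,     # the anecdotes a best seller collects
    ("hours", "failure"): 270,    # cases the formula never goes looking for
    ("no_hours", "success"): 10,  # one such case falsifies "A is necessary for B"
    ("no_hours", "failure"): 690,
}

# The relevant comparison is the success rate with and without A,
# not the raw number of A-then-B stories.
p_success_hours = counts[("hours", "success")] / (
    counts[("hours", "success")] + counts[("hours", "failure")]
)
p_success_no_hours = counts[("no_hours", "success")] / (
    counts[("no_hours", "success")] + counts[("no_hours", "failure")]
)
print(p_success_hours, p_success_no_hours)  # 0.1 vs. roughly 0.014
```

Even here, where the made-up numbers do show an association, the ten successes without the hours refute the necessity claim, and only the full table distinguishes association from coincidence.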

Numerous studies have demonstrated that people generally give an excessive amount of value to confirmatory information, that is, to positive or supportive data. The "most likely reason for the excessive influence of confirmatory information is that it is easier to deal with cognitively" (Gilovich 1993). It is much easier to see how a piece of data supports a position than it is to see how it might count against the position. Consider a typical ESP experiment or a seemingly clairvoyant dream: Successes are often unambiguous or data are easily massaged to count as successes, while negative instances require intellectual effort to even see them as negative or to consider them as significant. The tendency to give more attention and weight to the positive and the confirmatory has been shown to influence memory. When digging into our memories for data relevant to a position, we are more likely to recall data that confirms the position (ibid.).

Researchers are sometimes guilty of confirmation bias by setting up experiments or framing their data in ways that will tend to confirm their hypotheses. They compound the problem by proceeding in ways that avoid dealing with data that would contradict their hypotheses. For example, some parapsychologists used to engage in optional starting and stopping in their ESP research. Experimenters might avoid or reduce confirmation bias by collaborating in experimental design with colleagues who hold contrary hypotheses, as Richard Wiseman (skeptic) and Marilyn Schlitz (proponent) have done.* Individuals have to continually remind themselves of this tendency and actively seek out data contrary to their beliefs. Since this is unnatural, it appears that the ordinary person is doomed to bias.

See also ad hoc hypothesis, backfire effect, cognitive dissonance, communal reinforcement, control study, motivated reasoning, selective thinking, and self-deception.

For examples of confirmation bias in action, see "alternative" health practice, curse, ESP, intuitive, lunar effect, personology, plant perception, the Sokal hoax, therapeutic touch, and thought field therapy.

To see confirmation bias at work, review the conspiracy theories offered for the JFK assassination or the 9/11 conspiracy theories. It is a good lesson to observe how easily intelligent people can see intricate connections and patterns that support their viewpoint and how easily they can see the faults in viewpoints contrary to their own. As long as one ignores certain facts and accepts speculation as fact, one can prove just about anything to one's own satisfaction. It is much harder cognitively, but a requirement of good science, to try to falsify a pet hypothesis.

reader comments

further reading

books and articles

Belsky, Gary and Thomas Gilovich. Why Smart People Make Big Money Mistakes--and How to Correct Them: Lessons from the New Science of Behavioral Economics (Fireside, 2000).

Evans, Jonathan St. B. T. Bias in Human Reasoning: Causes and Consequences (Psychology Press, 1990).

Gilovich, Thomas. How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life (New York: The Free Press, 1993).

Levine, Robert. The Power of Persuasion: How We're Bought and Sold (John Wiley & Sons, 2003).

Reason, James. Human Error (Cambridge University Press, 1990).

Shermer, Michael. The Borderlands of Science: Where Sense Meets Nonsense (Oxford University Press, 2002).

Shermer, Michael. Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time, 2nd ed. (Owl Books, 2002).

websites

Coincidences: Remarkable or Random? by Bruce Martin

Smart People Believe Weird Things: Rarely does anyone weigh facts before deciding what to believe by Michael Shermer

* Schlitz, M., Wiseman, R., Radin, D., & Watt, C. (2005). Of two minds: Skeptic-proponent collaboration within parapsychology. Proceedings of the 48th Annual Convention of the Parapsychological Association, USA, 171-177.