People reason chiefly to persuade others that they are right, not to find out what is true.

So claim Hugo Mercier, a postdoctoral fellow in economics and philosophy at the University of Pennsylvania, and Dan Sperber, a philosopher and cognitive scientist at the Central European University, in their provocative 2010 article, "Why Do Humans Reason? Arguments for an Argumentative Theory," in the journal Behavioral and Brain Sciences. Now a new study by the Yale Cultural Cognition Project finds that people also use reason to convince themselves, in the face of evidence to the contrary, that they and those on their side are right.

For several years now, the Yale Cultural Cognition Project headed by Dan Kahan has been investigating how cultural values shape public risk perceptions and related policy beliefs. The group has issued a fascinating new working paper, "Ideology, Motivated Reasoning, and Cognitive Reflection: An Experimental Study," which seeks to explain the sources of ideological polarization over societal risks like climate change, gun violence, and nuclear power. This new study investigates three competing theories for why this ideological polarization exists.

The study investigates whether polarization over scientific and risk policy issues arises because people are (1) misled by an over-reliance on rules of thumb, (2) subject to distorting cognitive dispositions such as dogmatism, aversion to complexity, and need for closure, or (3) seeking to bolster and protect their sense of social identity. Kahan categorizes the relevant social psychological research theories into (1) the public irrationality thesis; (2) the Republican Brain hypothesis; and (3) the expressive rationality thesis.

One theory of how this polarization arises is what Kahan calls the public irrationality thesis. Nobel-winning economist Daniel Kahneman argues that humans employ two different cognitive systems to evaluate new information. The first uses fast cognitive shortcuts such as rules of thumb, while the second relies on slower, more effortful systematic reasoning. People adopt many of their cognitive rules of thumb, heuristics that are triggered by emotional reactions to new situations, from groups with whom they share cultural or ideological commitments. This theory implies that when challenged with new information or arguments, it's just easier for most people to believe what their peers believe. The downside of this knee-jerkism is that there is no tendency for public deliberation to converge on effective public policies to deal with societal risks.

Kahan's second theory is the Republican brain hypothesis. In this case, political conservatism supposedly correlates negatively with traits of open-mindedness and critical reflection, meaning that conservative cognition asymmetrically relies on fast rules of thumb when judging the salience of various societal risks.

Put more simply, this theory asserts that conservatives are less likely than liberals to engage in cognitive reflection and effortful systematic reasoning. Consequently, while liberals evaluate information with the aim of developing useful public policies, fearful, mule-stubborn conservatives will dismiss discomforting scientific facts and spend their time just "standing athwart history yelling stop."

The third theory investigated in this study is what Kahan calls the expressive rationality thesis. In this case, beliefs about issues like the riskiness of climate change or nuclear power constitute part of what it means for people to belong to specific groups. Shared beliefs form part of their identities. People assess new information so that their conclusions signal their trustworthiness and loyalty to social groups with which they identify and to which they wish to be connected. The upshot is that people go along in order to get along with members of the social groups that they believe will benefit them.

Kahan points out that both the public irrationality thesis and the Republican brain hypothesis claim that ideological polarization arises from over-reliance on cognitive rules of thumb. The case of expressive rationality is different. "If individuals are adept at using high-level, [systematic] modes of information processing, then they ought to be even better at fitting their beliefs to their group identities," suggests Kahan. In other words, people who have a greater tendency to engage in cognitive reflection will be better able to rationalize their beliefs in the face of any contrary evidence. So, if liberals really are more inclined to cognitive reflection and systematic reasoning than conservatives, that would imply that they "are all the more likely to succeed in resisting evidence that challenges the factual premises of their preferred policy positions."

The Yale researchers set out to test these three hypotheses by surveying 1,600 Americans to determine their political party affiliations and ideological predilections. Participants were asked to put themselves on a scale including Strong Democrat, Democrat, Independent Lean Democrat, Independent, Independent Lean Republican, Republican, and Strong Republican. Next they were asked if they considered themselves Very Liberal, Liberal, Moderate, Conservative, or Very Conservative. Once sorted by partisanship and ideology, the participants completed a Cognitive Reflection Test consisting of three questions that aim to measure their dispositions to engage in higher-level systematic reasoning.

Kahan does not provide the questions in his study, but apparently he used something like the cognitive reflection test devised by Massachusetts Institute of Technology management professor Shane Frederick. That test asks: (1) A bat and a ball cost $1.10 in total. The bat costs $1 more than the ball. How much does the ball cost? (2) If it takes five machines five minutes to make five widgets, how long would it take 100 machines to make 100 widgets? (3) In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half the lake? (Note: Answers below.)
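The trick in each of Frederick's questions is that the fast, intuitive answer is wrong. A quick sketch of the actual arithmetic shows why (the variable names are my own, not from the test or the study):

```python
# Worked arithmetic for the three cognitive reflection test questions.

# 1. Bat and ball: ball + (ball + 1.00) = 1.10, so 2 * ball = 0.10.
ball = (1.10 - 1.00) / 2
print(f"Ball costs ${ball:.2f}")  # $0.05, not the intuitive $0.10

# 2. Widgets: 5 machines make 5 widgets in 5 minutes, so each machine
# makes one widget per 5 minutes; 100 machines make 100 widgets in
# the same 5 minutes, not the intuitive 100.
minutes = 5
print(f"{minutes} minutes")

# 3. Lily pads: the patch doubles daily, so it covered half the lake
# exactly one doubling (one day) before it covered all of it.
days_to_cover_lake = 48
print(f"{days_to_cover_lake - 1} days")  # 47, not the intuitive 24
```

The intuitive answers ($0.10, 100 minutes, 24 days) are the ones System 1 supplies; catching and overriding them is what the test is meant to measure.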

Keep in mind that unlike the public irrationality and expressive rationality theses, the Republican brain hypothesis claims that there is an ideological asymmetry with regard to cognitive reflection. If conservatism is in fact associated with reasoning traits that lead to dogmatism, fear of complexity, and need for closure, conservatives should tend to score lower on Kahan's cognitive reflection test than would liberals.

Once participants had completed the cognitive reflection test, they were divided into three experimental groups. In the first group, subjects were merely told that "psychologists believe the questions you have just answered measure how reflective and open-minded someone is." In a second group, subjects were additionally advised that psychologists had determined that people who accept evidence for climate change tend to get more answers correct than those who reject such evidence. The researchers characterized this result as implying that those who believe in climate change are more open-minded than are skeptics (skeptics are more close-minded). And the third group was told the opposite: psychological research has found that people who reject evidence for climate change tend to get more answers correct than those who accept such evidence. In this case, the researchers implied that skeptics are more open-minded than believers in climate change (skeptics are more open-minded). Each group was then asked how valid they thought the cognitive reflection test was with regard to assessing open-mindedness.

Recent polling finds that conservatives are more likely to be skeptical about man-made global warming than are liberals. Thus the Republican brain hypothesis would predict that right-wing subjects would be more inclined to see the cognitive reflection test as valid when told it suggests that climate change skeptics are more open-minded. On the flip side, since left-wing subjects are supposed to be more natively reflective, their assessment of the validity of the cognitive reflection test should not differ much between the "skeptics are open-minded" and the "skeptics are close-minded" conditions. The public irrationality thesis predicts that motivated reasoning, i.e., jumping to conclusions congenial to their social groups, will be higher among people who score low on cognitive reflection, no matter their ideological biases. Unlike both the public irrationality and the Republican brain hypotheses, the expressive rationality thesis predicts that the higher that both right-wing and left-wing subjects score on cognitive reflection, the more their assessments of the validity of the cognitive reflection test will turn on their prior ideological commitments. The idea is that the ideologically motivated flatter themselves with the belief that people who share their views are more open-minded than those who do not.

So what did Kahan and his colleagues find? It turns out that conservatives and liberals score about equally badly on the cognitive reflection test: 64 percent of Republicans and 59 percent of Democrats got all three questions wrong. In fact, the difference between the two groups of partisans is less than the difference in scores associated with education, gender, and race. Recall that the Republican brain hypothesis predicted that cognitive reflection would be negatively correlated with right-wing ideology. "This hypothesis is not confirmed," concludes Kahan.

In addition, both liberals and conservatives displayed ideological bias when assessing the validity of the cognitive reflection test. When climate change skeptics were characterized as open-minded, Republicans thought the test was nifty. When skeptics were branded as close-minded, more Democrats found the test results convincing. Thus, the study finds that the experimental "results were more consistent with a finding of symmetry than one of asymmetry with respect to ideologically motivated reasoning." Ideology distorts both left-wing and right-wing thinking.

Do higher scores on the reflective cognition test temper political polarization? To get at this question, the study compared both liberals and conservatives who scored low on the reflective cognition test (the 62 percent of subjects who got no answers right) with liberals and conservatives who scored higher (those who got an average of 1.6 answers right, putting them between the 80th and 90th percentile of the sample). In short, the researchers found that the higher either conservatives or liberals scored on the cognitive reflection test, the more likely they were to judge the test as valid when its results supposedly confirmed their ideological views about climate change skeptics, and vice versa. People skilled at systematic reasoning use that capacity to justify their beliefs rather than to seek out the truth.

Kahan notes in passing that social psychological research has found that political independents and libertarians score better on the cognitive reflection test than do liberals or conservatives (check your answers below). But before we libertarians and independents start patting ourselves on our collective backs for being the better systematic reasoners, could this simply mean that we are especially good at justifying our beliefs to ourselves?

The new Yale study finds that when it comes to thinking about policy-relevant scientific information that challenges their ideological views, liberals, conservatives, and, yes, libertarians are inclined to violate physicist Richard Feynman's famous "first principle." As the irreverent genius put it, "You must not fool yourself and you are the easiest person to fool."

And the smarter you are, the easier it is to fool yourself.

The correct answers to the cognitive reflection test are 5 cents, 5 minutes, and 47 days.