Arguments over publicly controversial scientific issues like climate change commonly include a lot of accusations of ignorance, naïveté, and ulterior motives. Take a step back from “why is this person such a lughead?” though, and a much better question arises: why are opinions so strongly tied to political affiliation? Why should liberals and conservatives come to such different views of what the scientific evidence does or does not show? If your brain just started compiling a laundry list of reasons why those on the other side of the issue have obviously been misled, stop for a moment and consider your side as well.

Researchers have come up with several possible explanations for these systematic divergences in public opinion. A new study in the journal Judgment and Decision Making describes a head-to-head test of the most prevalent ones, done in an attempt to find out which one best describes what’s really going on.

There are three hypotheses in play here. The first refers to what’s known as “dual process reasoning,” a model of human thinking in which we can engage with ideas on two levels. The first is quick and dirty, leaning on intuition and emotion. The second is slow and deliberative, resulting in more objective and rational decisions. If people are forming their opinions on the quick and dirty level without careful, logical consideration, then public controversies may be inevitable.

The second explanation pins the blame on purported differences between the thought processes of liberals and conservatives. This view, popularized by Chris Mooney in books like The Republican War on Science and The Republican Brain, holds that conservatives shy away from complexity or uncertainty. This would make the right side of the political spectrum more susceptible to being misled on complex issues such as climate change.

The third hypothesis is essentially “cultural cognition,” a concept developed by the current study’s author, Yale’s Dan Kahan. The idea here is that everyone (to some extent) judges the reliability of information based on its implications for our cultural identity. Disregarding climate change—and its regulatory consequences—can be part of maintaining a group identity for someone who identifies as a conservative. Opposing genetically modified food can be important for someone who identifies as a liberal in a community of like-minded liberals.

In the study, a group of 1,750 people representative of the US population (by political affiliation, race, education, etc.) responded to some survey questions meant to test these hypotheses. Participants first described themselves politically (strong Republican to strong Democrat) and ideologically (very conservative to very liberal). They then took a test that is commonly used to assess deliberative, reflective thinking; it consisted of three mathematical questions with seductively intuitive yet incorrect answers. For example, one asked, “If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?” Most people answer these questions incorrectly—the average score here was about 0.7 out of 3.
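The widget question trips people up because the intuitive answer (100 minutes) ignores the per-machine rate. A minimal sketch of the underlying arithmetic (the function name and structure are illustrative, not from the study):

```python
def widget_time(machines: int, widgets: int) -> float:
    """Minutes needed for `machines` machines to make `widgets` widgets,
    given that 5 machines make 5 widgets in 5 minutes."""
    # 5 machines / 5 widgets / 5 minutes means one machine makes
    # one widget every 5 minutes, regardless of how many machines run.
    minutes_per_widget_per_machine = 5
    widgets_each = widgets / machines
    return widgets_each * minutes_per_widget_per_machine

print(widget_time(100, 100))  # 5.0 — not the intuitive 100
```

Each machine still needs 5 minutes per widget, so 100 machines working in parallel finish 100 widgets in the same 5 minutes.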

The respondents were then split into three groups, each of which was asked to assess how effective they thought the test was at indicating how reflective and open-minded a person is. One group (the control) was simply told that psychologists believe the test is, indeed, effective. Two other groups were given additional information that was fodder for culturally weighted reasoning.

The second group was likewise told that the test was effective, but with an addition: “in one recent study, a researcher found that people who accept evidence of climate change tend to get more answers correct than those who reject evidence of climate change” and so are judged to be more open-minded. The third group was told the opposite—that a recent study showed that people who reject evidence of climate change fared better on the test.

The three hypotheses for politically polarizing issues are all different avenues by which someone could end up employing “motivated reasoning”—reasoning that comes to convenient conclusions rather than the most objective ones. People who accept climate change are likely to chafe at the suggestion that their fellow “accepters” are more closed-minded than climate skeptics. As a result, they could be motivated to come to the conclusion that the test isn’t very reliable. This is meant to simulate the way in which people judge reports of evidence for or against the positions they hold.

If this were primarily a question of intuitive versus deliberative thinking, you would expect those with lower scores on the test to display stronger motivated reasoning, regardless of political orientation. If conservatives are more prone to motivated reasoning than liberals, you’d expect that to be the most important factor. But if cultural identity is the central issue, both ends of the political spectrum should show stronger motivated reasoning than the middle. In fact, people who think more deliberatively (as measured by the test) on the far left or far right should show the greatest degree of motivated reasoning, since deliberative thinking gives them time to recognize threats to their identity and come up with better reasons why inconvenient evidence could be flawed.

The test results showed no difference in intuitive/deliberative thinking between ideologies (liberal/conservative). Just considering party affiliation, however, did show a statistically significant gap. Republicans averaged a little over 0.1 points (out of 3) higher than Democrats.

The experimental groups evaluating the effectiveness of the test displayed a fair amount of motivated reasoning, as expected. On average, liberals rated the test as less effective when told that people who accepted climate change performed poorly on it, and they rated it as more effective when told that it was the climate skeptics who didn’t do as well. Conservatives did the same thing, and to a similar degree.

So far, the “conservatives don’t think as objectively” hypothesis is in trouble, as conservatives showed no more motivated reasoning than liberals. So were the people who scored poorly on the deliberative thinking test more likely to be motivated reasoners? Nope. Bad news for the “dual process reasoning” hypothesis.

That leaves us with cultural identity. It turns out that higher scores on the test were associated with stronger motivated reasoning on the far left and far right. So not only were people polarized along political lines, but the ones who were supposedly more deliberative thinkers were even more polarized, as predicted.

This study adds its weight to others that have shown the importance of cultural identity in our evaluation of information—regardless of which cultural group we identify with. Kahan doesn’t see this as a bias so much as a practical adaptation to life in social groups.

But it can lead, Kahan says, to a situation like the “tragedy of the commons,” where rational decisions for individuals (I’ll raise more cattle on this public pasture land) result in negative outcomes for all (the land has been overgrazed). The individual tendency to preserve the connection with one’s cultural group can ultimately split society as a whole, preventing it from acting on the best available evidence. So instead, we call each other lugheads.

Judgment and Decision Making, 2013. (Open Access)