Environmentalists also filter bad news, but in a far more elaborate way, so I tend to concentrate on them in describing how Flatland works. Personally, I divide the human world into mammals and reptiles. I prefer to write about the mammals.

Those who reject mainstream climate science (truthers, deniers, doubters) illustrate a highly generalized cognitive process which I called "bad news filtering" in the first Flatland essay. Climate science is nothing but bad news. A significant proportion of the population, at least in the weird world called the United States, filters such bad news in the crudest kind of way—simple denial. These "doubters" use all sorts of unconscious post-hoc strategies to rationalize their rejection of the science.

Personally, I like the term "climate truthers," which better captures the flavor of the thing. It's not like "those who reject mainstream climate science" all have the same story about why they reject it. There are dozens of varieties of counter-theories, as many as there are theories about Kennedy's assassination. What unites them all is a conviction that the official story can't be right, that it's covering for a nefarious agenda, that the truth is out there.

My hot take: It doesn't really matter. Nothing of particular consequence rests on what journalists choose to call these people. Certainly nothing that would justify the endless hours of debate that have been devoted to it.

The Associated Press, in its Solomonic wisdom, has elected to eschew both terms. Instead it's going with "climate change doubter," or alternatively, "those who reject mainstream climate science," which trips right off the tongue.

I don't go after "climate deniers" on DOTE because the psychological stuff I'm trying to describe is far more subtle than the example such people provide. There's nothing subtle about "climate truthers," which is Dave Roberts' preferred term in "AP says to call climate deniers 'climate doubters.' Whatever." (Vox, September 24, 2015).

Oh, somewhere in this favored land, the sun is shining bright,
The band is playing somewhere, and somewhere hearts are light;
And somewhere men are laughing, and somewhere children shout,
But there is no joy in Mudville—mighty Casey has struck out.

— "Casey at the Bat" by Ernest Lawrence Thayer

How and why do people filter bad news? As we've just seen, with "climate doubters" the 'how' is simple. The 'why' is more subtle. As I read through Roberts' article, a remarkable thing happened—he basically summarized large parts of the third Flatland essay in a few pages. He even quotes some of the sources I used in that essay. This is amazing, considering that he has never read my stuff. I'm way, way off his beaten path.

This is an excellent summary, so please take the time to read it.

Conclusions are usually the beginning, not the end, of human reasoning

The popular conception of reason is that it poses a question, gathers evidence, weighs the evidence, and draws a conclusion. But that turns out to be a highly idealized conception, tracing back to positivism and the Enlightenment. In fact, human beings are not primarily rational creatures. We are primarily social creatures.

We are born into specific social contexts, overlapping tribes from which we absorb our worldviews and values. We stitch our identities together out of those tribal affiliations. Most of what we believe, we do not conclude. We do not reason to it at all. We inherit it.

Those inherited beliefs are often tribal markers, conditions of approbation, even acceptance, among our tribes. Because belonging to tribes is fundamental to our well-being, those markers become very important to us. Protecting them is adaptive behavior, among our most basic instincts. And protect them we do, via what social scientist Dan Kahan calls "identity-protective cognition." (Ezra Klein wrote about it here.) We are primed to resist information that casts doubt on our core beliefs and values. Such resistance is deeply rooted, often operating at a level beneath conscious awareness.

So reasoning in the real world is usually the opposite of the idealized conception. Rather than ask questions and reason to conclusions, what most of us do, most of the time, is begin with conclusions and reason to justifications. We use our cognitive powers to build a case for what we are already inclined, for reasons of tribal affiliation, to believe. To use the analogy favored by psychologist Jonathan Haidt (via Chris Mooney), we think we're being scientists, but we're actually being lawyers. That's motivated reasoning.

It's not a flaw, a weakness, or an exception. It's likely what cognition is for. After all, why would evolution select for a species of pure reasoning machines?
Beyond our ability to successfully navigate our immediate surroundings, we don't really need accurate information, certainly not at the level of basic worldviews. It doesn't have a ton of adaptive value. What does have adaptive value is our ability to access the benefits of community, to enter into reciprocal relationships with others around us for mutual benefit. That's what evolutionary pressures are likely to select for — the master to which reasoning is a servant.



"We're wrong, but we're wrong together!"

This is not to say that inquiry is futile. It is possible to make self-criticism and self-correction themselves tribal values, to encourage the cognitive effort (and cushion the social risk) of skepticism. It is possible for individuals to develop, through practice, the mental habits of self-analysis and openness to revision. And science itself is, of course, an attempt to formalize those mental habits and be rigorous in their use. But it's difficult to reason that way, exhausting both intellectually and psychologically, so most people, most times, don't.

OK, pay close attention to what he says here.

Now, motivated reasoning is strikingly easy to recognize ... in other people, who disagree with you. It is much more difficult to recognize when you and your cohort do it. But all tribes are guilty of it.

Now, Roberts sets up where he's going to go next.

Nonetheless, it's facile to conclude that "everyone does it" and leave it there. Some tribes and individuals are more prone to it than others, and in varying amounts on different issues and in different circumstances.

And who are those tribes and individuals who are more prone to motivated reasoning than others?

Climate doubters of course! Fuckheads! And thus Roberts engages in some motivated reasoning of his own.

Climate denialism is identity protection on the American right

Which brings us back to the climate truthers. What's happened is that rejection of mainstream climate science has become a powerful tribal marker in the US conservative movement. And it is mostly the US conservative movement. As Jonathan Chait pointed out in his recent piece, there are smatterings of similar science rejection in Australia and in European splinter parties, but nowhere else in the developed world has climate, er, doubt taken over a major political party. It is, as he says, "a regional quirk in the most powerful country on Earth."

The question of how exactly the truthers' disposition toward climate science might best be described — skepticism, doubt, paranoid hostility, whatever — is beside the point; what's going on has little to do with how they assess climate science. The story of climate trutherism is not primarily about science at all, or even about climate change. It is part of something much larger, a story of how the US right became radicalized, aggrieved, empowered, and epistemically insular. It is the story of asymmetrical polarization, which, as I've said before, is key to understanding American politics...

And so on. As I said, it is very easy to hammer on reptiles. And it is very easy to politicize climate policy because that's what humans characteristically do. And it is very easy to say my tribe sits at God's Right Hand (environmentalists), and your tribe dwells among Satan's Legions (conservative deniers).

Quoting Roberts, what is not easy is to make "self-criticism and self-correction themselves tribal values, to encourage the cognitive effort (and cushion the social risk) of skepticism," or, as individuals, "to develop, through practice, the mental habits of self-analysis and openness to revision."

For environmentalists, the road to self-criticism and self-correction begins with taking a skeptical look at these stories—

that we can run our global civilization predominantly on renewable energy;

that switching to "clean energy" is basically free and good for everybody in the long run;

that no sacrifices are required;

that switching to renewables is compatible with global economic growth.

Self-criticism requires getting out of the tribal bubble (group-think). Roberts would need to start studying energy from a physical standpoint. He would need to take a serious look at the historical and current role of energy in running our global economy. He could start here, for example. Roberts and others in his tribe never look at this kind of stuff, as far as I know.

Tribal introspection doesn't occur because self-criticism and self-doubt constitute an existential threat to the tribe in question. And since one's basic identity is inextricably tied to the fate of the tribe, that threat extends to the self. So self-criticism of this sort is "bad news" in the Flatland sense. And what do people do with this kind of bad news?

They filter it. And that's what Roberts did when, predictably, he went down a well-worn political path compatible with his tribal values. It's too bad he wasn't able to take his own observations seriously. It's too bad he couldn't bring himself to follow the road less traveled.

There may be hope for Dave Roberts. He failed here, but one day he may succeed in understanding what he thinks he understands. You may recall this stunning paragraph from his summary. I'll make a few comments.

Motivated reasoning is not a flaw, a weakness, or an exception. It's likely what cognition is for.

Well, if the fate of the biosphere and our species is at stake, this human quality sure looks like a flaw to me.

After all, why would evolution select for a species of pure reasoning machines? Beyond our ability to successfully navigate our immediate surroundings, we don't really need accurate information, certainly not at the level of basic worldviews. It doesn't have a ton of adaptive value.

But in the 21st century, with our fate hanging in the balance, it would seem that "accurate information" (a.k.a. reality) does indeed have a ton of adaptive value.

C'mon, Dave, try harder! There is no joy in Mudville—mighty Roberts has struck out.