By Jeremy Deaton

Last week, EPA Administrator Scott Pruitt said he does not believe carbon pollution is the primary cause of climate change, which is factually incorrect, as explained by the EPA’s own website. Pruitt’s statement generated headlines ranging from the incisive “EPA Chief Doubts Consensus View of Climate Change,” to the uncritical “EPA Chief Scott Pruitt Says Carbon Dioxide Is Not a Primary Contributor to Global Warming,” drawing an exasperated response from media watchdogs.

Numerous reporters challenged Pruitt’s assertion. Weatherman Al Roker said “there is no credible science or scientist” who would agree with the EPA chief. But that doesn’t mean the facts are clear to the average news consumer. The problem with Pruitt’s claim — as with so many fibs, falsehoods and fabrications — is that it’s sticky. When inundated with misinformation, we start to believe the lie. We remember fiction as fact, even after a thorough debunking.

But cognitive science offers tools for fighting back against the barrage of inaccuracies coming from the Trump administration. Scholars lay out the best weapons for winning a war of ideas. Here is why human brains are so bad at separating what is real from what is not — and how to combat climate denial.

Not good at telling truth from fiction. Source: Pixabay

A lie told often enough becomes the truth.

Human brains are built to ward off singular untruths, but we struggle against an army. When faced with an onslaught of lies, our defenses falter, letting alternative facts slip past the barricade. There are several reasons why this is the case. Here are three:

It takes energy to scrutinize a lie.

It takes more energy to scrutinize it when we hear that lie again and again.

We don’t like to scrutinize a lie that supports our worldview.

In a landmark 1991 paper, Harvard psychologist Dan Gilbert proposed that we process information in two steps. First, we accept information as true, and then we interrogate whether it may actually be false. In other words, we let the Trojan horse past the gate before we check to see if it’s full of Greek soldiers. Humans, wrote Gilbert, are “very credulous creatures who find it very easy to believe and very difficult to doubt.”

Artist’s rendering of an alternative fact. Source: Public Domain

Doubting takes effort, and it takes a lot more effort when we hear a lie again and again and again. In a 1977 study, researchers asked college students to rate the veracity of dozens of statements over multiple sessions. Some statements were true (“The thigh bone is the longest bone in the human body”) and some sounded true, but were actually false (“Tulane defeated Columbia in the first Sugar Bowl Game”). Students were more likely to rate a statement as true if they had heard it previously, even if the statement were false, suggesting the most dangerous kind of Trojan horse is one that you’ve seen before.

The task of sorting truth from fiction gets significantly more complicated when politics is involved. We identify as liberals or conservatives, Democrats or Republicans, and we tend to work, live, drink and fall in love with people who share and reinforce our worldview. Knowledge is social. We trust friends, family, and leaders of the same political stripe to help us understand the news.

Source: Pixabay

A recent study found that Democrats are less likely to trust information when they’re told it comes from Donald Trump, while Republicans are more likely to trust it. MIT political scientist Adam Berinsky, a coauthor of the study, said it’s incumbent upon politicians to fact-check members of their own party. “In a partisan time, the solution to misinformation has to be partisan,” he said, “because there just aren’t authorities that will be recognized by both sides of the aisle.”

Pruitt offers a fine example. A long-serving Republican with deep ties to the fossil fuel industry, Pruitt is a credible source of information for conservatives. To those with little knowledge of climate, his claim that carbon dioxide is a bit player in global warming might sound plausible — all the more so after repeated tellings.

So how should scientists, advocates and climate-conscious political leaders respond to Pruitt? Science shows the way.

Artist’s rendering of lie being debunked. Source: Public Domain

Start with the truth. Inoculate against the lie.

When confronted with a lie, it’s tempting to correct it, but that strategy often backfires. People tend to remember the fib, not the fact-check. If they find the lie agreeable, they tend to dismiss the correction out of hand. Renowned cognitive linguist George Lakoff harped on this point in a recent interview with CNN.

“If you repeat what Donald Trump says and negate it, and say no, and say it’s false, what you’re doing is strengthening that, because in your brain, the neurocircuits have to activate what you’re negating and that strengthens what you’re negating,” said Lakoff. Journalists, he argued, need to “talk first about the truth.”

Put another way, there is little to gain from opening up a Trojan horse and counting the number of soldiers inside. The trick is never letting it past the gate in the first place. Instead, you have to take the battle to your opponent. Lay it out like this:

Here are the facts.

Here’s how they lied to you.

Here’s why they lied.

You have to start with the truth for the same reason you have to avoid repeating the lie. People remember the headline. In a handbook on countering climate deniers, researchers John Cook and Stephan Lewandowsky write that “debunking should begin with emphasis on the facts, not the myth,” and they made a helpful graphic to illustrate the point.

Once you’ve established the facts and repeated them to the point of exhaustion, move on to dispensing with the lie. A recent study showed that it’s possible to “inoculate” the public against misinformation about climate change.

In the experiment, one group of subjects read a message explaining that 97 percent of climate scientists believe that humans are driving the rise in temperature. They then read a second message that cast doubt on the scientific consensus. Another group read both messages along with a warning that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists.”

Used to guard against measles, mumps and misinformation. Source: Pixabay

Those who had been warned about misinformation were more likely to believe that scientists agree on climate change. Key to the warning was that it clarified both the intent and the method of climate change deniers. These were “politically motivated groups” wielding “misleading tactics.”

Let’s apply these lessons to Scott Pruitt. Numerous outlets repeated the lie at the top of their stories. CNBC’s headline read, “EPA Chief Scott Pruitt Says Carbon Dioxide Is Not a Primary Contributor to Global Warming.” The article began with the falsehood and buried the fact-checking.

A better headline might have been “EPA Chief Denies Scientific Consensus on Climate Change.” The subsequent text could have explained that, while 97 percent of scientists agree that humans are causing climate change, Pruitt contends there is “tremendous disagreement about the degree of [human] impact.” Pruitt, a longtime ally of fossil fuel companies, stands at odds with NASA, NOAA and the EPA, which state that carbon pollution from burning oil, coal and natural gas is driving the warming trend.

This many climate scientists agree that humans are causing global warming. Source: The Climate Chat

A note about communications

David Roberts of Vox wrote in a recent post that the focus “by earnest liberals and scientists on particular climate skeptic arguments is pointless.” The obsession with particular word choices — global warming, climate change, climate disruption — is a waste of time. Conservatives, said Roberts, “will accept the scientific facts of climate change when conservative elites signal that that’s what conservatives do.”

By and large, the research supports this view. Elite opinion is key. Even a well-crafted message can only do so much. Where studies show one message to be more effective than another, effects are typically small, and experiments have limited external validity. The closed environment of social science experiments bears little resemblance to the noisy, frenetic, bifurcated media ecosystem we actually inhabit.