WARNING: Before you read this article, beware that the more scientifically literate you are, the more likely you are to misinterpret this information in a way that supports your own ideology.

Sorry. According to psychologist and political scientist Dan Kahan, even seeing that warning won’t help you. “Being warned about cognitive biases doesn’t immunize someone from them,” says Kahan, who works at Yale University.

So reader, continue at your peril, as you learn that Kahan is now embroiled in a politically charged controversy in which, he says, his own research on the misinterpretation of scientific research is being deeply misinterpreted.

“I’m so tired of this,” he tells ScienceInsider.

It started innocently on 15 October. On his blog, Kahan posted an informal analysis of survey data comparing people’s comprehension of scientific concepts with their political outlook. The data were gathered from a large U.S. study of how people perceive the risk of vaccination. And when Kahan crunched the numbers, they revealed a small correlation between science comprehension and political leaning. One finding: Those who identified themselves as “liberal” tended to have greater scientific comprehension than those who self-identified as “conservative.”

Or, as Kahan put it: “The sign of the correlation indicates that science comprehension decreases as political outlooks move in the rightward direction--i.e., the more ‘liberal’ and ‘Democrat,’ the more science comprehending.” Statistically, the effect was small—a correlation coefficient of r = 0.05—and only weakly significant, with a probability of p = 0.03. That is just under the traditionally accepted threshold of p = 0.05 that researchers use to identify a correlation that is unlikely enough to be the result of chance alone.
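The statistics here illustrate a general point: because significance depends on sample size, even a tiny correlation like r = 0.05 can clear the p < 0.05 bar in a large enough survey. The sketch below shows this with a standard significance test for a Pearson correlation, using the normal approximation to the t statistic (adequate for large samples); the sample sizes are chosen purely for illustration and are not the study’s actual numbers.

```python
import math

def pearson_p_approx(r, n):
    """Approximate two-tailed p-value for a Pearson correlation r
    measured on n observations. Uses the usual test statistic
    t = r * sqrt(n - 2) / sqrt(1 - r^2), treated as normal (large n)."""
    t = r * math.sqrt(n - 2) / math.sqrt(1 - r * r)
    # Two-tailed p-value under the normal approximation.
    return math.erfc(abs(t) / math.sqrt(2))

# The same tiny correlation, tested at illustrative sample sizes:
for n in (100, 500, 2000):
    print(f"r = 0.05, n = {n:4d}  ->  p ~ {pearson_p_approx(0.05, n):.3f}")
```

At a few hundred respondents the correlation is nowhere near significant; at a couple of thousand it slips under the 0.05 threshold, even though the effect itself, and the practical difference it implies, is unchanged.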

Many studies of people’s ideological leanings and ability to parse scientific information have found similar correlations. Those findings have added up to the widespread perception that politically conservative beliefs go hand in hand with poor scientific understanding.

But Kahan cautions that this interpretation, known as the asymmetric hypothesis, is itself an example of the misinterpretation of scientific information. And he argues that the available data instead support the symmetric hypothesis, which holds that such biases apply equally to liberal-leaning people.

To push back against this tide of misunderstanding, Kahan pointed out in his blog post that survey participants who identified with the libertarian-leaning Tea Party movement also showed slightly higher scientific comprehension. Again, the effect was tiny (r = 0.05) and only marginally significant (p = 0.05).

“[T]he relationship is trivially small, and can't possibly be contributing in any way to the ferocious conflicts over decision-relevant science that we are experiencing,” Kahan wrote. But he hoped that pointing out the Tea Party factoid would dampen the political polarization, perhaps giving readers pause before making generalizations.

The post, however, had the opposite effect. One of the first shots fired was an article on POLITICO titled “Eureka! Tea partiers know science.” More than 20,000 Facebook “likes” later, the website of conservative pundit Glenn Beck shared the good news that, “much to [Kahan’s] surprise, the data found a strong correlation between science comprehension and self-identified TEA Party members.” Science writer Chris Mooney then weighed in, noting that it was ironic that Beck, a leading light of the Tea Party, “muffed the statistics” in “trying to show that tea partiers are good at science.” The public discussion grew far more polarized and vitriolic on Twitter.

For his part, Kahan took to his blog on 19 October to decry some of the misreporting and muse on the ironies, in a post titled, in part: “Congratulations, tea party members: You are just as vulnerable to politically biased misinterpretation of science as everyone else!”

“Tea party members are like everyone else, as far as I can tell, when it comes to science comprehension,” he wrote. “Is this something to be proud of? I don’t think so. It means that if we were to select a tea-party member at random, there would be a 50% chance he or she would say that ‘antibiotics kill viruses as well as bacteria’ and less than a 40% chance that he or she would be able to correctly interpret data from a simple experiment involving a new skin-rash treatment.”

Of the “recurring irony” of the misrepresentation of such results, Kahan wrote: “It’s funny. It’s painful. And it’s depressing—indeed, the 50th time you see it, it is mainly just depressing.”

“I really empathize,” says Michael Dodd, a psychologist at the University of Nebraska, Lincoln, whose own research touches on the effects of political beliefs on perception. By the time the information reaches the public, he says, journalists have already injected their own ideological biases. “When we do interviews on our biology and politics work, interviewers are quite clearly getting us to try to make a claim or give them a sound bite that is provocative, but is an overreach based on the data,” he says. In spite of “bending over backwards” to correct journalists’ misinterpretations, he says, “I can't say a mainstream article has ever been published on my work where I didn't think the author was taking liberties with what we said in our interview or our paper in order to make the story flashier.”

But even a news story about these cognitive biases, like the one you are reading, is unlikely to make a dent, Dodd says. “It's unfortunate and I don't think it will end anytime soon.”

Kahan is somewhat more optimistic. In a lengthy e-mail to ScienceInsider, he notes that “[t]he number of issues on which we see cultural conflict over relevant science is minuscule in relation to the ones in which we don't.” And he argues that people can only ignore science that is relevant to their well-being for so long. “[W]ays of life that fail to align their members with the best available evidence on how to live well,” he writes, “will not persist.”