For a lot of scientific topics, there's a big gap between what scientists understand and what the public thinks it knows. On several of these topics—climate change and evolution are prominent examples—the divide develops along cultural lines, typically religious or political identity.

It would be reassuring to think that the gap is simply a matter of a lack of information. Get the people with doubts about science up to speed, and they'd see things the way that scientists do. Reassuring, but wrong. A variety of studies have indicated that the public's doubts about most scientific topics have nothing to do with how well people understand those topics. And a new study out this week joins a number of earlier ones in indicating that scientific knowledge actually makes it easier for those who are culturally inclined to reject a scientific consensus to do just that.

What’s the consensus?

The new work was done by two social scientists at Carnegie Mellon University, Caitlin Drummond and Baruch Fischhoff. They relied on a large, regular survey called the General Social Survey, which attempts to capture the public's perspective on a wide variety of issues (they used data from the 2006 and 2010 iterations of the survey). The survey included questions on general education and scientific education, along with questions gauging basic scientific literacy. In addition, it asked for opinions on a number of scientific issues: acceptance of the evidence for the Big Bang, human evolution, and climate change; thoughts on the safety of GMOs and nanotechnology; and the degree to which the government should fund stem cell research.

The survey also included questions on its participants' political and religious identity. The authors performed a variety of statistical tests designed to determine whether there were any correlations among these factors. Since so many factors were under consideration, the standard statistical threshold—a five-percent chance of a result occurring at random—was deemed insufficient. Instead, the researchers only reported results that had no more than a one-percent chance of occurring at random. Although one percent is a bit arbitrary, the increased statistical rigor is something that would be good to see more often.
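The logic behind the stricter cutoff is straightforward: the more tests you run, the greater the odds that at least one of them clears the bar by chance alone. A minimal sketch of that arithmetic (the 20-test count here is purely illustrative—the study's actual number of comparisons isn't stated above):

```python
# How the family-wise error rate grows with the number of tests.
# Assumes independent tests, each with false-positive probability alpha.
def familywise_error_rate(alpha, n_tests):
    """Probability of at least one false positive across n independent tests."""
    return 1 - (1 - alpha) ** n_tests

# At the standard 5% threshold, 20 independent tests give roughly a
# 64% chance of at least one spurious "significant" result.
print(round(familywise_error_rate(0.05, 20), 2))  # → 0.64

# Tightening the per-test threshold to 1% cuts that substantially.
print(round(familywise_error_rate(0.01, 20), 2))  # → 0.18
```

This is the same reasoning that underlies formal corrections like Bonferroni's, which divide the desired overall error rate by the number of tests.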

We'll do the good news first: there's no sign of cultural polarization on GMOs or nanotechnology. The former is a bit of a surprise given the widespread public mistrust of this biotechnology (and the frequent claim that the problem arises from a bunch of lefty granola eaters). It would also be easy to envision religious opposition on these topics, given that both involve "playing God" in the sense that humans are creating things that don't commonly occur naturally.

But that's about where the good news ends. Drummond and Fischhoff found strong polarization on most of the other topics.

For stem cell research, evolution, and the Big Bang, those with a stronger general education showed greater political polarization, with conservatives more likely to reject the science. Among those with a strong science education, these topics were also polarized, as was climate change. In a bit of good news, high levels of scientific literacy removed the Big Bang from that list. Put differently, stem cell research and evolution were consistently polarized along political lines; as scientific literacy went up, climate change became politicized, too, but people became more likely to accept the evidence for the Big Bang.

Partly overlapping effects were seen when religious fundamentalism was considered, the exception being climate change, where opinion wasn't polarized along religious lines. Stem cell research, the Big Bang, and human evolution were, however.

Education vs. science

Overall, Drummond and Fischhoff found that education doesn't make much of a difference when it comes to accepting science. "Participants’ general educational attainment and science education were at best weakly related to their acceptance of the scientific consensus," they conclude. Scientific literacy helped a bit overall, as "those with higher scientific literacy scores were more likely to agree with the scientific consensus on three issues: the Big Bang, human evolution, and nanotechnology."

But that was largely due to the large effect scientific literacy had among political and religious liberals. In other ways, it hurt: those with a strong science education or high scientific literacy showed greater polarization on stem cells, evolution, and the climate, primarily because conservatives became less likely to accept the scientific consensus.

Ultimately, the thing that matters most is trust. "On all six topics," the authors write, "people who trust the scientific enterprise more are also more likely to accept its findings." The politicization of scientific issues may, in part, be the result of a long-term decline in trust in the scientific enterprise among conservatives.

As always, there are some caveats about the questions asked in the survey. It's tough to get a firm grasp on scientific literacy from a few survey questions, and the percentages seen in many surveys depend on how the questions are phrased. For example, an individual may answer a question on dinosaurs in a way that acknowledges they lived millions of years ago while answering a question on human origins by saying humans didn't evolve. And the climate question asked how "concerned" participants were about climate change. It's entirely possible for someone to accept the science of human-driven climate change while rejecting scientists' conclusions about what a 4-degrees-Celsius-warmer world would look like.

Those caveats would be more significant if it weren't for the fact that multiple other studies have seen more or less the same thing. For example, Dan Kahan at Yale found that there's no difference between liberals and conservatives when it comes to knowing what scientists have determined about the climate, and the understanding gets better as climate literacy rises. But ask the same people what they believe, and conservatives with higher climate literacy are less likely to agree with the scientists.

The question is why. Here, the authors propose two mechanisms. One involves motivated reasoning, in which people accept or reject information depending on whether it conforms to what they'd prefer to believe; those with better scientific literacy would arguably be more adept at finding reasons to reject unwelcome information. The alternative is a sort of anti-Dunning-Kruger effect, in which actual knowledge produces the confidence needed to maintain extreme views. Drummond and Fischhoff also suggest that better general education may make people more aware of which topics have become the subject of a polarizing controversy.

Unfortunately, the study doesn't identify which of these (if any) is a factor. In many ways, the most important things identified in the study may be nanotechnology and GMOs, as these are cases where polarization hasn't occurred, despite ample opportunities. If we can figure out why, it might help us keep future technologies from becoming embroiled in arguments that have little to do with the underlying tech.

PNAS, 2017. DOI: 10.1073/pnas.1704882114 (About DOIs).