In one CCP study, we found that cultural polarization over climate change is magnified by science literacy (and numeracy, too). That is, as culturally diverse (but perfectly ordinary, and not particularly partisan) members of the public become more science literate, they don't converge on the dangers that global warming poses but rather grow even more divided.

Not what you'd expect if you thought that the source of the climate change controversy was a deficit in the public's ability to comprehend science.

But the culturally polarizing effect of science literacy isn't actually that unusual. It's definitely not the case that all risk issues generate cultural polarization. But among those that do, division is often most intense among members of the public who are the most knowledgeable about science in general.

Actually, in the paper in which we reported the culturally polarizing effect of science literacy with respect to perceptions of climate change risks, we also reported data that showed the same phenomenon occurring with respect to perceptions of nuclear power risks.

Well, here are some more data that help to illustrate the relationship between science literacy and cultural polarization. They come from a survey of a nationally representative sample of 2000 persons conducted in May and June of this year (that's right--even more fresh data! Mmmmmm mmmm!)

These figures illustrate how public perceptions of different risks vary in relation to science literacy. Risk perceptions were measured with the "industrial strength measure." Science literacy was assessed with the National Science Foundation's "Science Indicators," a battery of questions commonly used to measure general factual and conceptual knowledge about science.
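For readers unfamiliar with how a battery like the NSF Indicators becomes a scale: in the simplest version, a respondent's science-literacy score is just the count of correct answers. Here is a hypothetical sketch; the item names and answer keys below are illustrative placeholders, not the actual NSF items, and real scales often use more refined (e.g., item response theory) scoring.

```python
# Hypothetical sketch of scoring a science-literacy battery.
# Item names and answer keys are illustrative, not the actual NSF items.

ANSWER_KEY = {
    "earth_orbits_sun": True,      # "The Earth goes around the Sun" (true/false)
    "electrons_smaller": True,     # "Electrons are smaller than atoms"
    "antibiotics_viruses": False,  # "Antibiotics kill viruses as well as bacteria"
    "lasers_sound": False,         # "Lasers work by focusing sound waves"
}

def science_literacy_score(responses):
    """Count of correct true/false answers; unanswered items score 0."""
    return sum(
        1 for item, correct in ANSWER_KEY.items()
        if responses.get(item) == correct
    )

respondent = {"earth_orbits_sun": True, "electrons_smaller": False,
              "antibiotics_viruses": False, "lasers_sound": False}
print(science_literacy_score(respondent))  # 3 of 4 items answered correctly
```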

For each risk, I plotted (using a locally weighted regression smoother, a great device for conveying the profile of the raw data) the relationship between risk perception and science literacy for the sample as a whole (the dashed grey line) and for the cultural groups (whose members are identified based on their scores in relation to the means on the hierarchy-egalitarianism and individualism-communitarianism worldview scales) that are most polarized on the indicated risk.
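The figures themselves can't be reproduced here, but the smoothing technique is easy to illustrate. Below is a minimal pure-NumPy sketch of the idea behind a locally weighted regression smoother: at each point, fit a weighted straight line over the nearest share of neighbors, with weights that fall off with distance. The function name and the tricube-weight choice are mine; the actual figures were presumably drawn with a standard lowess implementation in a statistics package.

```python
import numpy as np

def lowess_smooth(x, y, frac=0.5):
    """Minimal locally weighted regression ("lowess") sketch: at each
    point, fit a weighted straight line over the nearest frac-share of
    the data, using tricube weights that decay with distance."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    k = min(n, max(3, int(np.ceil(frac * n))))  # local window size
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]              # k nearest neighbors
        h = d[idx].max() or 1.0              # local bandwidth
        w = (1.0 - (d[idx] / h) ** 3) ** 3   # tricube weights
        sw = np.sqrt(w)
        # weighted least squares: y ~ a + b*x over the window
        X = np.column_stack([np.ones(k), x[idx]])
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y[idx] * sw, rcond=None)
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted
```

On exactly linear data the smoother recovers the line; on noisy data it traces the local trend without imposing a global functional form, which is what makes it useful for conveying the profile of raw survey data.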

The upper-left panel essentially reproduces the pattern we observed and reported in our Nature Climate Change study. Overall, science literacy has essentially no impact on climate-change risk perceptions. But among egalitarian communitarians and hierarchical individualists--the cultural groups who tend to disagree most strongly on environmental and technological risks--science literacy has offsetting effects with respect to climate change and fracking: it makes egalitarian communitarians credit assertions of risk more, and hierarchical individualists less.

The same basic story applies to the bottom two panels, which look at legalization of marijuana and legalization of prostitution, "social deviancy risks" of the sort that tend to divide hierarchical communitarians and egalitarian individualists.

Neither the level of concern nor the degree of cultural polarization is as intense as those associated with global warming and fracking. But cultural disagreement does intensify with increasing science literacy (it seems to abate for legalization of prostitution among those highest in science literacy, although the appearance of convergence would have to be statistically interrogated before one could conclude that it is genuine).

What to make of this? Well, again, one interpretation--one supported by the study of cultural cognition generally--is that the source of cultural polarization over risk isn't plausibly attributed to a deficit in the public's knowledge or ability to comprehend science.

Instead, it's caused by antagonistic cultural meanings that become attached to particular risks (and related facts), converting them into badges of membership in and loyalty to important affinity groups.

When that happens, the stake individuals have in maintaining their standing in their group will tend to dominate the stake they have in forming "accurate" understandings of the scientific evidence: mistakes on the latter won't increase their or anyone else's level of risk (ordinary individuals' opinions are not of sufficient consequence to increase or diminish the effects of climate change, etc.), whereas being out of line with one's group can have huge, and hugely negative, social consequences.

Ordinary individuals will thus attend to information about the risks in question (including, e.g., the positions of "expert" scientists) in patterns that enable them to persist in holding beliefs congruent with their cultural identities. Individuals who enjoy a higher-than-average capacity to understand such information won't be immune to this effect; on the contrary, they will use their higher levels of knowledge and analytic skill to ferret out identity-supportive bits of information and defend them from attack, and thus form perceptions of risk that are even more reliably aligned with those characteristic of their groups.

That was the argument we made about climate change and science comprehension in our Nature Climate Change study. And I think it generalizes to other culturally contested risks.

But not all societal risks are contested. The number characterized by culturally antagonistic meanings is, as I've stressed before, quite small; only a handful generate the intense cleavages that characterize climate change, nuclear power, gun control, the HPV vaccine, and (apparently now) fracking.

With respect to the rest, we shouldn't expect to see polarization generally. Nor should we expect to see it among those culturally diverse individuals who are highest in science literacy or in other qualities that reflect a higher capacity to comprehend quantitative information.

On the contrary, we should expect such individuals to be even more likely to converge on the best scientific evidence. They might be better able to understand such evidence themselves than people whose comprehension of science is more modest.

But more realistically, I'd say, the reason to expect more convergence among the most science literate, most numerate, and most cognitively reflective citizens is that they are more reliably able to discern who knows what about what.

The amount of decision-relevant science that it is valuable for citizens to make use of in their lives far exceeds the amount that they could hope to form a meaningful understanding of. Their ability to make use of such information, then, depends on their ability to recognize who knows what about what (even scientists need to be able to employ this form of perception and recognition to engage in the collaborative production of knowledge within their fields).

Ordinary individuals--ones without advanced degrees in science, etc.--are ordinarily able to recognize who knows what about what without difficulty, but one would expect that those who have a refined capacity to comprehend scientific information would likely do even better.

It's the degrading or disrupting effect that antagonistic meanings have on this recognition capacity--in citizens of ordinary and extraordinary science comprehension alike--that makes risks suffused with such meanings a source of persistent cultural dispute.

Okay, all of that is a matter of surmise and conjecture. How about some data on the impact of science literacy on less polarizing issues?

I have to admit that I'm not as systematic as I should be -- as I think it is important for all who are studying the "science communication problem" to be -- in studying "ordinary," "boring," nonpolarizing risks.

But consider this:

Here we see the impact of science literacy, generally and with respect to the cultural groups (this time egalitarian communitarians and hierarchical individualists) who are most "divided," on GM foods and childhood vaccination.

In fact, the division is exceedingly modest; to characterize the levels of disagreement seen here as "cultural polarization" would be extravagant. As I've emphasized before, I see little evidence that these are culturally polarizing issues in the U.S., at least for the time being--as opposed to the casual assertions of commentators who, I think, should be more careful not to mistake agitation among subsegments of the population who are disposed to dramatic, noisy gestures, but who are actually very small and quite remote from the attention of the ordinary, nonpolitical member of the public, for genuine polarization.

Moreover, with respect to both issues, science literacy tends--in general and among the cultural groups whose members are modestly divided--to reduce concern about risk (again, a little "blip" like the one at the extreme science-literacy end for "egalitarian communitarians" in the fracking graph is almost certainly just noise, statistically speaking; if we could find the one or two responsible survey respondents, they might in fact be unrepresentatively noisy on this issue).

That's not "smoking gun" evidence that science literacy tends to improve the public's use of decision-relevant science on societal risks for nonpolarizing issues.

For that, it would be useful to have more evidence of public opinion on risks that provoke even less division and on which the evidence is very, very clear (it is on vaccines; I am inclined, too, to believe that the evidence on GM foods suggests they pose exceedingly little risk and in fact offset myriad others, from ones associated with malnutrition to crop failure induced by climate change--but I feel I know less here than I do about vaccines and am less confident).

But the "picture" of how science literacy influences public opinion on vaccines and GM foods--two risk issues that aren't genuinely culturally polarizing--is strikingly different from the one we see when we look at issues like climate change, or nuclear power, or fracking, where the toxic fog of antagonistic meanings clearly does impede ordinary citizens' ability to see who knows what about what.

Science comprehension--knowledge of important scientific information but, even more important, the habits of mind that make it possible to know things in the way science knows them--is intrinsically valuable. Even if this capacity didn't make citizens better consumers of decision-relevant science, a good society would dedicate itself to propagating it as widely as possible, because the ability to think is a primary human good.

But who could possibly doubt that science comprehension--the greatest amount of it, dispersed as widely as possible among the populace--would make it more likely that the value of decision-relevant science will be realized by ordinary people in their lives as individuals and as citizens of a democracy? I certainly wouldn't question that!

The polarizing effect of science literacy on culturally contested issues like climate change is not evidence that popular science comprehension lacks value.

On the contrary, it is merely additional evidence of how damaging a polluted science-communication environment is for the welfare of the diverse citizenry of the Liberal Republic of Science.