The "politics & science" webinar the other day was a lot of fun. Unfortunately, there wasn't time to answer all the great questions that audience members had.

So here are some additional responses to some of the questions that were still in the queue:

Q1. How do you reconcile the fact that left-wing/educated individuals accept scientific evidence about climate change yet reject vaccinations?

Q2. Have you looked at GMOs or vaccines and seen similar results from the left that you've seen on the right?

I put these two together b/c my answer to the 1st is based on the 2nd.

There’s no need to “reconcile the fact that left-wing/educated individuals accept scientific evidence about climate change yet reject vaccinations” b/c it’s not true!

Same for the claim that GM foods are somehow connected to a left-leaning political orientation--or a right-leaning one, for that matter.

The media & blogosphere grossly overstate the number of risk issues on which we see the sort of polarization that we do on climate change along with a handful of other issues (e.g., fracking, nuclear power, the HPV vaccine [at least at one time; not sure anymore]).

Consider these responses from a large, nationally representative sample, surveyed last summer:

I call the survey item here the “industrial strength risk perception measure” (ISRPM). There’s lots of research showing that responses to ISRPM will correlate super highly with the responses that people give to more specific questions about the identified risk sources (e.g., “is the earth heating up?” or “are humans causing global temperatures to rise?” in the case of the “Global warming” ISRPM) and even with behavior involving personal risk-taking (at least if the putative risk source is one they are familiar with). So it’s an economical way to look at variance.
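To make the logic of the ISRPM concrete, here is a minimal simulation (not real survey data; the effect sizes are made up for illustration) of why a single global risk item correlates highly with more specific items about the same risk source: both are noisy readings of the same underlying concern.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical latent concern about some putative risk source (standardized)
latent = rng.normal(0, 1, n)

# ISRPM-style global item on a 0-7 scale: latent concern plus response noise
isrpm = np.clip(np.round(3.5 + 1.5 * latent + rng.normal(0, 0.8, n)), 0, 7)

# A more specific item about the same risk source, independently noisy
specific = np.clip(np.round(3.5 + 1.5 * latent + rng.normal(0, 0.8, n)), 0, 7)

r = np.corrcoef(isrpm, specific)[0, 1]
print(round(r, 2))  # strong correlation, driven entirely by the shared latent concern
```

Because the two items share the latent term, their correlation is high even though each answer individually is noisy — which is why the single global item is an economical stand-in for a battery of specific ones.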

You can see that climate change, fracking, and guns are pretty unusual in generating partisan divisions.

Well, here’s childhood vaccines and GM foods:

Definitely not in the class of issues—the small, weird ones, really—that polarize people.

A couple of other things.

First, to put the very tiny influence of political orientations on vaccine risks (and even smaller one on GM foods) in perspective, consider this (from a CCP report on vaccine risk perceptions):

Anyone who sees how tiny these correlations are and still wants to say that there is a meaningful connection between partisanship and either vaccine- or GM food-risk perceptions is making a ridiculous assertion.
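A quick way to see why tiny correlations can’t carry a “partisan divide” claim is to square them: r² is the share of variance in risk perception that partisanship accounts for. The r values below are illustrative placeholders, not the report’s actual figures.

```python
# r^2 = share of variance in risk perception "explained" by partisanship.
# Illustrative values only: a tiny correlation vs. a climate-change-sized one.
for r in (0.05, 0.10, 0.65):
    print(f"r = {r:.2f} -> r^2 = {r * r:.4f}")
```

With r around 0.05–0.10, partisanship accounts for well under 1% of the variance — nothing like the divisions on climate change, where correlations are an order of magnitude larger.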

Indeed, in my view, they are just piling on in an ugly, ignorant, illiberal form of status competition that degrades public science discourse.

Second, GM food's ISRPM is higher than that of many other risk sources, it’s true. But that’s consistent with noise: people are all over the map when they respond to the question, and so the average ends up around the middle.

In fact, there’s no meaningful public concern about GM food risks in the general population—for the simple reason that most people have no idea what GM foods are. Serious public opinion surveys show this over & over.

Nonserious ones ignore this & pretend that we can draw inferences from the fact that when people who don’t know what GM foods are are asked if they are worried about them, they say, “oh yes!” They also say ridiculous things like that they carefully check for GM ingredients when they shop at the supermarket, even though in fact there aren’t any general GM food labeling requirements in the US.

Some 80% of the foods in US supermarkets have GM ingredients. People don’t fear GM foods; they eat them, in prodigious amounts.

It’s worth trying to figure out why so many people have the misimpression that both GM foods and vaccines are matters of significant concern for any meaningful segment of the US population. The answer, I think, is a combination of bad reporting in the media and selective sampling on the part of those who are very interested in these issues & who immerse themselves in the internet enclaves where these issues are being actively debated.

There are serious dangers, moreover, in exaggerating the general concern over these risks and in the gross misconceptions people have about their partisan character.

Some sources to consider in that regard:

Cultural Cognition Project Lab. Vaccine Risk Perceptions and Ad Hoc Risk Communication: An Empirical Analysis. CCP Risk Studies Report No. 17.

Kahan, D.M. A risky science communication environment for vaccines. Science 342, 53-54 (2013).

Kahan, D., Braman, D., Cohen, G., Gastil, J. & Slovic, P. Who fears the HPV vaccine, who doesn’t, and why? An experimental study of the mechanisms of cultural cognition. Law Human Behav 34, 501-516 (2010).

Q3. I'd like to ask both speakers about the need for science literacy. How does increasing science literacy - that is, knowledge about the scientific process – serve to influence people’s beliefs about science issues?

Where the sorts of dynamics that generate polarization exist, greater science comprehension (measured in any variety of ways, including standard science literacy assessments, numeracy tests, and critical reasoning scales) magnifies polarization. The most science-comprehending members of the population are the most polarized on issues like climate change, fracking, guns, etc.

Consider:

Here I’ve plotted in relation to science comprehension (measured with a scale that includes basic science knowledge along with various critical reasoning dispositions) the ISRPM scores of individuals identified by political outlook.

As mentioned above, partisan polarization on risk issues is the exception, not the rule.

But where it exists, it gets worse as people become better at making sense of scientific evidence.

Why?

B/c now and again, for one reason or another, disputes that admit of scientific inquiry become entangled in antagonistic cultural meanings. When that happens, positions on them become badges of membership in and loyalty to cultural groups.

At that point, individuals’ personal stake in protecting their status in their group will exceed their personal stake in “getting the right answer.” Accordingly, they will then use their intelligence to form and persist in the positions that signify their group membership.
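The interaction described above — polarization growing with reasoning ability on identity-entangled issues — can be sketched as a toy model. Everything here (the coding of outlook, the effect sizes, the interaction term) is a hypothetical illustration, not the actual CCP data or analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4000

comprehension = rng.uniform(0, 1, n)      # science comprehension, scaled 0-1
outlook = rng.choice([-1, 1], n)          # toy coding: -1 = left, +1 = right

# Toy model of identity-protective cognition: on a culturally entangled issue,
# the effect of political outlook on perceived risk GROWS with comprehension
# (an outlook x comprehension interaction), rather than shrinking.
risk = 3.5 - 1.5 * outlook * comprehension + rng.normal(0, 0.5, n)

low = comprehension < 0.33
high = comprehension > 0.67
gaps = {}
for mask, label in ((low, "low comprehension"), (high, "high comprehension")):
    gap = risk[mask & (outlook == -1)].mean() - risk[mask & (outlook == 1)].mean()
    gaps[label] = gap
    print(f"{label}: left-right gap = {gap:.2f}")
```

In this sketch the left-right gap among high-comprehension respondents is several times the gap among low-comprehension ones — the qualitative pattern the post describes, produced by the interaction term alone.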

The entanglement of group identity in risks and other facts that admit of scientific investigation is a kind of pollution in the science communication environment. It disables the faculties that people normally use with great success to figure out what is known by science.

Improving science literacy won’t, unfortunately, clean up our science communication environment.

On the contrary, we need to clean up our science communication environment so that we can get the full value of the science literacy that our citizens possess.

Some sources:

Kahan, D.M. Climate-Science Communication and the Measurement Problem. Advances in Political Psychology 36, 1-43 (2015).

Kahan, D.M., Peters, E., Dawson, E. & Slovic, P. Motivated Numeracy and Enlightened Self-Government. Cultural Cognition Project Working Paper No. 116 (2013).

Kahan, D.M. Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making 8, 407-424 (2013).

Kahan, D.M. “Ordinary Science Intelligence”: A Science Comprehension Measure for Use in the Study of Science Communication, with Notes on 'Belief in' Evolution and Climate Change. CCP Working Paper No. 112 (2014).

Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change 2, 732-735 (2012).

Kahan, D. Why we are poles apart on climate change. Nature 488, 255 (2012).