"Can scientists and journalists learn to beat the doubt industry before our most serious problems beat us all?" This is the question asked in an interesting piece at a news site I'd never heard of - "Miller-McCune: Turning Research Into Solutions." I'm not sure about the research-into-solutions thing, but they do have some interesting comments about the "Doubt Industry" - organized interests with a strong motivation to get the public to question science: the link between smoking and cancer, the scientific status of evolution, the health hazards of beryllium (apparently a problem for workers in the atomic industry). Whenever scientific results cause problems for someone, especially someone with a strong financial incentive not to believe the science, the time-tested strategy is to question the certainty of the science. A well-known internal memo from one cigarette company in the 1960s famously claimed (PDF): "In thinking over what we might do to improve the case for cigarettes, I have looked at the problem somewhat like the marketing of a new brand...Our consumer I have defined as the mass public, [and] our product as doubt..." That line is the source of the title of a new book by a former Department of Energy scientist: Doubt Is Their Product: How Industry's Assault on Science Threatens Your Health. The Miller-McCune piece interviews the author and reviews the book.

I haven't read the book and have nothing to say about it, but the strategy of sowing doubt about science raises an interesting issue. A lot of the time the science is uncertain, and we are occasionally forced to make policy decisions based on incomplete data. But in other cases the science is solid - so how is the public supposed to know the difference? Or to make this even more personal: how are we supposed to know the difference between a settled scientific issue and one that is still debated?
Obviously none of us is an expert in every technical field, and even if we were, nobody can keep up with all of the primary literature. At some point, you have to take somebody's word for it. Personally, I'd like to listen to somebody who's likely to be right, whether it fits with my personal ideology or not. Sometimes the science is relatively simple - like the link between smoking and cancer; it's hard to find a stronger link between something and cancer than that, unless it's exposure to radioactivity. But usually we're not so lucky. Take climate change, which is fiendishly complicated, resting on multiple types of evidence from different fields, as well as computer models - and models can be great or terrible, but it's hard to know which without immersing yourself in the details. The issue gets worse when cargo-cult science gets involved: imitators who try their best to look scientific, using jargon, hosting technical conferences, and generally trying to convince the public that they are doing real science. What these imitators are missing (and it's not limited to imitators; it happens in real science as well) is, as Richard Feynman put it, "a specific, extra type of integrity that is not lying, but bending over backwards to show how you are maybe wrong, that you ought to have when acting as a scientist." That's the kind of person you want to trust. Unfortunately, that kind of attitude doesn't make the uncertainty go away. But there is hope: good scientists who speak to the public should be frank about the shortcomings and possible alternate interpretations of their work, but when they are confident about a result, they should say so forcefully and explain why alternate ideas have failed on the evidence. Beyond that, though, I have no clue how to solve the doubt problem.
I look for scientists who have the self-confidence to recognize a solid result when they see one, but who also bend over backwards to make sure they are not fooling themselves. In today's media climate, that kind of thing is hard to find.