Physicist Isidor Isaac Rabi grew up in an immigrant family in New York City in the early 20th century. When he came home from school his mother would not ask him what he learned that day, as his friends’ mothers did. She would ask him, “Did you ask any good questions today?” Apparently Rabi asked many good questions. In 1944, at age 46, he was awarded the Nobel Prize in Physics for developing nuclear magnetic resonance, a technique for probing the atomic nucleus that was later developed into the medical diagnostic technology known as MRI, magnetic resonance imaging.

Questions, not answers, are how science makes progress. Science may appear to serve up answers in its huge textbooks, volumes of encyclopedias, and now online resources. (Is there anything Wikipedia doesn’t know?) And it may seem a pretty impressive collection. But it also makes science appear as a scary, insurmountable mountain of facts, rather than the playground of inquiry it actually is.

Questions, on the other hand, go places, take you down new avenues, generate curiosity and inspiration. They are the critical ingredients to new experiments. Of course, answers are important, but too often they are treated as an end. Think about the word “conclusion.” It is an answer drawn from data, but it can denote the end of the process, of the story, of the adventure. It is at once a determination and a termination. We may hear about the conclusive results in this or that study, or the conclusions to be drawn from this work, but the last thing a scientist wants is a conclusion in the sense of, “there ain’t no more to do.” For all the talk about drawing conclusions in scientific studies, there is relatively little in science that is conclusive.

The contemporary view of science puts too much emphasis on answers. What leads to good science is uncertainty. That doesn’t mean scientists shouldn’t be certain about their findings. It means they should be comfortable that their findings are not the final answer. The poet John Keats, in an 1817 letter to his brothers, wrote that he had been struck by the ideal quality for the literary mind: “Negative Capability—that is when man is capable of being in uncertainties, Mysteries, doubts, without any irritable reaching after fact and reason.” (By the way, that capital M in Mysteries is not a typo; that’s how Keats wrote it.) He considered Shakespeare the exemplar of this state of mind, able to inhabit the thoughts and feelings of his characters because his imagination was not hindered by certainty, fact, and mundane reality (think Hamlet).


Negative Capability is just as important to the scientist, who should always find him- or herself in a state of “uncertainty without irritability.” Scientists do reach after fact and reason, but it is often when they are most uncertain that the reaching is the most imaginative, unhindered by a common-sense certainty of how something should work. In a kind of scientific version of Keats, Erwin Schrödinger, one of the great philosopher-scientists, said, “In an honest search for knowledge you quite often have to abide by ignorance for an indefinite period.” Being a scientist requires having faith in uncertainty, finding pleasure in mystery, and learning to cultivate doubt. There is no surer way to screw up an experiment than to be certain of its outcome.

But don’t scientists know a lot of things? They do. But lawyers, engineers, accountants, and electricians know a lot of things. Scientists, however, do something different with what they know. They don’t defend people, or treat people, or make money for people (or, I’m sorry to say, for themselves very often). They make new questions. Facts are not just to be accumulated. They are raw material for making improved, more sophisticated questions with new unknowns. Science, good science, creates as much ignorance as it does knowledge. Thoroughly conscious ignorance is the prelude to every real advance in science. I didn’t make that up; James Clerk Maxwell, the greatest physicist between Newton and Einstein, said it in 1877.

“A Brief Episode of Clarity”: Eve Andrée Laramée. Found laboratory glass etched with text.

Any scientist will tell you that facts are the weakest link in the scientific edifice. They shift and change, regularly. You know that too. One day grapefruit is good for you and the next it can have deadly interactions with common drugs that can cause liver failure. In his recent book, The Half-Life of Facts, Samuel Arbesman recounts how a multiple-choice question on an exam his father took as a medical student remained the same from one year to the next, but the correct answer changed. A fact lasts until the next generation of scientists with the next generation of tools comes along and re-examines the question. The lifetime of a scientific paper can be measured by how long more recent papers continue to cite it as a source. When I was a graduate student some 25 years ago, it was common to cite work from 20 to 30 years earlier in a new manuscript. Now it is considered a bit odd and dated to cite papers more than five years old, with a few exceptions for the “classics.” Facts change and revisions are made, but it all adds up to progress. In science, revision is a victory. And that process of revision has accelerated significantly in the last few decades.



For decades, ulcers were thought to be a result of anxiety and poor eating habits, and physicians and the medical establishment treated them with anti-anxiety drugs and bland diets. In the early 1980s, two Australian researchers, Robin Warren and Barry Marshall, showed that ulcers were linked to a bacterium, Helicobacter pylori, colonizing the stomachs of certain individuals, and that they could be cured with a simple regimen of antibiotics. In fact, the connection between ulcers and bacteria had been made as early as 1958, but was ignored, leading to unnecessary suffering, because it didn’t fit the commonly accepted explanation. When the Australian researchers met with resistance, Marshall drank a culture of H. pylori, made himself sick, and cured himself with antibiotics. Although an extreme strategy, it turned out to be worth it: Marshall and Warren shared the Nobel Prize in Physiology or Medicine in 2005. Sometimes the most difficult task in science is convincing overconfident researchers that they don’t know something they are sure of. Stephen Hawking has called the “greatest enemy of knowledge” not ignorance, but “the illusion of knowledge.”


This may seem disconcerting. What can we depend on? Facts change, authority is unreliable, viewpoints are modified, consensus dissipates. But it is important to recognize that new facts don’t bring down the whole edifice. Einstein’s theory of relativity didn’t undo Newton’s Principia; it extended it and made it more useful. Newton ferreted out the rules governing the behavior of mass, but it took Einstein to uncover the deeper principles that made those rules work.

I’ve seen firsthand how questions sow the seeds of progress. In my field, the neuroscience of olfactory perception—how we smell—a landmark finding was the discovery of a family of neural receptors that capture odors in the air and signal their presence to the brain. It’s great to know that we smell more than 100,000 chemicals using 500 receptors in our noses. But how can the number of receptors be less than half a percent of the number of odors we detect? That must mean that odor discrimination relies on some combination of receptors, a code of sorts, in which each odor activates its particular cohort of receptors and is perceived as the combined activity of that cohort. Five hundred receptors, taken in combinations of 10, give an impossibly large number of unique subsets (a 2 followed by about 20 zeros). So that’s a nice answer: you can smell hundreds of thousands of different odors by using combinations of a small number of receptors.
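That back-of-the-envelope number is easy to verify. Here is a minimal Python check, taking the passage’s round figures of 500 receptor types read out in groups of 10 (the real receptor repertoire and group size are, of course, not exactly these numbers):

```python
import math

# "500 choose 10": the number of distinct 10-receptor subsets
# that can be drawn from a repertoire of 500 receptor types,
# i.e. 500! / (10! * 490!)
combinations = math.comb(500, 10)

print(combinations)            # about 2.5 x 10^20
print(len(str(combinations)))  # 21 digits: a 2 followed by 20 further digits
```

(`math.comb` requires Python 3.8 or later; on older versions the same value is `math.factorial(500) // (math.factorial(10) * math.factorial(490))`.)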

But what’s really happened is that because of this simple discovery we now have much better questions to ask. What are the unique combinations? What is the minimum number of receptors required to discriminate one odor from another? How does the brain make sense of this impossibly large number of possible combinations? How about blends of odors? Coffee has over 700 compounds that contribute to its distinctive fragrance. Does this require combinations of combinations? The discovery of odor receptors has kept us hard at work on more and more interesting questions, many of which we couldn’t have even thought of before the receptors were discovered 22 years ago.

Of course, uncertainty in science can be abused and twisted to nefarious purposes. In his recent book, Golden Holocaust, Stanford historian Robert Proctor showed that tobacco companies willfully used claims of insufficient data and incomplete knowledge to block regulation of the sales of tobacco products. Indeed, most of the research showing that tobacco was harmful was paid for by the tobacco companies, with the knowledge that it would be very difficult to find a conclusive (that word again) causal link between tobacco and cancer. Scientists still don’t know exactly how tobacco products cause cancer, merely that there is an overwhelming and highly predictable correlation between the two. As Proctor showed, tobacco companies persistently strove to keep the public in a state of uncertainty with the claim that more research was necessary.

Parallels with the current debate over the effects of human activity on the world’s climate are obvious. There is little question that human activity is causing the earth’s atmosphere to warm up and that this will lead to changes in climate patterns. The precise nature of those changes, the level of warming that may be acceptable, and the ability to reverse the changes remain unsettled. There are conflicting models, but none of them suggests that anthropogenic warming is not occurring; they disagree only about what the results of this warming will be and precisely when they will take effect. This uncertainty has given some industry leaders and politicians, with their own special interests, an opening to declare that global warming is not anthropogenic. This is not only disingenuous, it is damaging in the worst way because it creates a wrongheaded notion about science in the public mind.

Unsettled science is not unsound science. Scientists tend to emphasize disagreements because this is where the work remains to be done. Why talk about what we know, when all our effort should be directed at what we don’t know? The highly accomplished Marie Curie, in a letter to her brother, noted that “one never thinks about what has been done, only what remains to be done.” Problems don’t get solved by sitting around and nodding in agreement. They are solved, indeed they are understood to be problems in the first place, by talking about them.


Today, the public wants more of a say in science than ever before, which is understandable, since science affects so much of our lives. Climate change, genetically modified food, nuclear energy, rapid spread of infectious diseases, and a host of never-before-seen possibilities—both good and bad—have been illuminated by science.

But short of becoming an expert in each of many disparate fields, unlikely for even the cleverest among us, how can we participate? Well, we can be more like scientists in one crucial area: the acceptance of uncertainty. Indeed, it is the too-well-crafted explanation, the one that explains everything, that should set off red flags, warning us that we are likely being deceived, misled, or outright duped.

I’m a neurobiologist, but I don’t know any more about quantum physics than any other non-physicist, nor about computability limits than anyone without a degree in computer science, nor about a thousand other things outside my narrow expertise. But as a scientist, I know the value of doubt and the danger of certainty. In science, dumb and ignorant are not the same thing.

To be realistically engaged with science means appreciating doubt and uncertainty as the necessary precursor to knowledge and illumination. We must learn to traffic in the unknown, be comfortable with uncertainty, take pleasure in mystery. While searching for knowledge we must abide by ignorance for an indefinite period. Above all, as Mrs. Rabi knew more than 100 years ago, we need to know how to ask a good question.





Stuart Firestein is a professor of neuroscience in the Department of Biological Sciences at Columbia University. He is a fellow of the American Association for the Advancement of Science, a Guggenheim Fellow, and serves as an advisor to the Alfred P. Sloan Foundation. This essay is adapted from his 2012 book, Ignorance: How It Drives Science (Oxford University Press).