Tomorrow, April 22nd—Earth Day—scientists, people who love science, and citizens concerned that government policies are increasingly detached from empirical reality will march in Washington, D.C., and nearly five hundred other cities around the world. Like any large group of protesters, the science marchers have struggled to craft their mission statement. “The March for Science champions robustly funded and publicly communicated science as a pillar of human freedom and prosperity,” they write, on their Web site. “We unite as a diverse, nonpartisan group to call for science that upholds the common good and for political leaders and policy makers to enact evidence based policies in the public interest.”

It’s a good mission statement, but it contains a time bomb. The phrase “science that upholds the common good” may seem innocuous. But who decides what is in the common good?

The March for Science is taking place alongside an ongoing debate in the House, stoked by Lamar Smith, the chairman of the House Science Committee. Smith, an active critic of the scientific evidence for human-induced climate change, has pushed for legislation requiring that every National Science Foundation grant be certified as serving "the national interest." He argues that the agency should fund less social-science and environmental research, which he views as of little value.

Statements about "the common good" or "the national interest" are inherently political. They should guide politicians in formulating public policy, and their interpretation should be debated openly through the electoral process. But are they appropriate for governing scientific research? If history is any guide—from the Soviet misdirection of research in genetics, to Nazi claims about "Jewish science," to the more recent effort to restrict research on climate change—claims about "the national interest" often hinder science in its pursuit of its most important goal: the unfettered and open questioning of even our most fundamental assumptions about reality.

It's true that the progress of science has brought unparalleled prosperity and wealth to a significant fraction of the world's population, and that it has created technologies that have changed everything about the way we live and work, while enhancing human health and extending lifespans well beyond what they were even a century ago. These developments have been for "the common good." But science has also brought us nuclear and chemical weapons, the dark side of the Internet, and, for some people, a sense of isolation from nature.

It’s tempting to locate the utility of science in the technology it produces, and in the way that technology improves the human condition. But the truth is that science is not equivalent to technology alone—nor is it equivalent to a set of facts, which can then be contrasted with another set of “alternative” facts to decide which set one prefers.

Science is a process for deriving facts about nature. It's a process for enhancing our understanding of the world around us, and for separating sense from nonsense via empirical investigation, logical reasoning, and constant testing. Trying to define science as an activity that upholds "the common good" or is "in the national interest" obscures the fact that science is nothing more or less than a remarkably successful empirical process for uncovering the way the world works. At its best, this process is open-ended and curiosity-driven.

Of course, there are many specific public-policy issues that cry out for scientific research, from curing diseases to safeguarding the environment. But, as the National Academies of Sciences, Engineering, and Medicine stressed in their landmark 2007 report, "Rising Above the Gathering Storm," written by a distinguished committee chaired by the former Lockheed Martin C.E.O. Norman Augustine, perhaps half of the current gross national product of the United States relies on the results of curiosity-driven fundamental research performed twenty-five to fifty years ago. The pursuit of science for its own sake led to discoveries such as penicillin and superconductivity, and it often created, almost by accident, remarkable benefits for society. Consider the World Wide Web, created at the European Organization for Nuclear Research (CERN), the laboratory that later built the Large Hadron Collider. The Web was originally designed to help scientists in collaborations across the globe share information, but it expanded to change the life of almost every citizen of the industrialized world today.

And yet, as important as these economic and technological spinoffs of science are, knowledge, in itself, is still at the center of the scientific enterprise. In this respect, perhaps the greatest benefit of science for society is how it transforms our culture. Science provides us with a new perspective on our place in the cosmos and a better understanding of ourselves as human beings. It helps us overcome our otherwise myopic preconceptions about how the world works. At a deep level, it allows us to see through some of our illusions about reality, which result from the peculiarities of space and time within which we happen to exist, and to perceive, instead, the detailed, fundamental workings of nature.

In these respects, science resembles those other human activities, like art, music, and literature, that distinguish humanity as a species. We don't—or shouldn't—ask what the utility of a play by Shakespeare is, or how a Mozart concerto or a Rolling Stones song upholds "the common good," or how a Picasso painting or a movie like "Citizen Kane" might be in "the national interest." (Perhaps it's because we insist on thinking in such terms that support for art, music, and literature is also under attack in Congress.) The free inquiry and creative activity we find in science and art reflect the best of what it means to be human.

In 1969, Robert Wilson, the first director of the Fermi National Accelerator Laboratory, near Chicago, was asked by Congress whether the huge particle accelerator being built there would contribute to “the national defense.” His response then is appropriate now:

No, sir. . . . I don’t believe so. . . . It has only to do with the respect with which we regard one another, the dignity of men, our love of culture. . . . It has to do with, are we good painters, good sculptors, great poets? I mean all the things we really venerate in our country and are patriotic about. It has nothing to do directly with defending our country except to make it worth defending.

The March for Science can meaningfully celebrate the ways in which the process of science enhances our lives, and it can usefully demand that the government pursue evidence-based public policy. It’s certainly true that Congress should use the knowledge developed by free inquiry to assist in developing policies to promote “the common good,” as the electorate conceives of it. But the standard of “the common good” should not be the one by which science is judged, because such a standard risks politicizing what is inherently apolitical. The March for Science must be clear-eyed in its defense of the scientific process as an independently valuable human activity. It should defend the core value of the scientific process: discovering more about the universe, and ourselves.