Several years ago, my student and I studied anonymous email server data from 40,000 students, faculty and staff at a large university. Our research plan was reviewed and approved by our institutional review board. Nobody got identified, and nobody got hurt. And we made several new contributions to the nascent science of networks, shedding light on the "rules" by which friendship networks evolve over time.

Then our research paper came out in the journal Science. Much to our surprise, we were instantly accused of treating innocent people like bacteria, scrutinizing behavior as if in a "digital petri dish". Even if we didn't know whose data was whose, the critics insisted that individuals were still "in there" somewhere – and that they found our research downright creepy.

If this episode sounds familiar, well, yes: the now infamous "Facebook study" also involved data collected without informed consent, was also approved under the same human-subjects regulations as ours, and was also published in a prestigious scientific journal. And just like ours, it was immediately beset with outrage.

As the authors and even the editor of Proceedings of the National Academy of Sciences have now acknowledged, the Facebook study could have been handled better, and there are certainly lessons to be learned. But it's important in the aftermath to keep in mind that transitional moments in science always breed new anxieties.

In the late 18th and early 19th centuries, the world went through a period of rapid scientific discovery. Continents like Australia and Africa, until then more myth than map, were explored for the first time, and thousands of new species as well as new peoples were discovered. New and more powerful telescopes pushed back the boundaries of the known universe, and chemistry finally broke the shackles of alchemy to yield near-miraculous breakthroughs.

Nonetheless, religious authorities fretted about the incursion of science into the realm of God. Authors like Mary Shelley raised fears that scientists would unleash forces beyond their control. And poets worried that uncovering the rules somehow depleted the world of wonder. Keats opined that Newton had "destroyed all the poetry of the rainbow, by reducing it to a prism". Wordsworth lamented: "Our meddling intellect / Misshapes the beauteous forms of things: / We murder to dissect."

Yes, the arrival of new ways to understand the world can be unsettling. But as social science starts going through the kind of revolution that astronomy and chemistry went through 200 years ago, we should resist the urge to attack the pursuit of knowledge for knowledge's sake.

Just as in the Romantic era, advances in technology are now allowing us to measure the previously unmeasurable – then distant galaxies, now networks of millions of people. Just as then, the scientific method is being promoted as an improvement over traditional practices based on intuition and personal experience. And just as then, defenders of the status quo object that data and experiments are inherently untrustworthy, or are simply incapable of capturing what really matters.

We need to have these debates, and let reasonable people disagree. But it's unreasonable to insist that the behavior of humans and societies is somehow an illegitimate subject for the scientific method. Now that the choice between ignorance and understanding is within our power to make, we should follow the lead of the Romantics and choose understanding.

Remember: the initial trigger for the outrage over the Facebook study was that it manipulated the emotions of users. But we are being manipulated without our knowledge or consent all the time – by advertisers, marketers and politicians – and we all just accept that as a part of life. The only difference between the Facebook study and everyday life is that the researchers were trying to understand the effect of that manipulation.

If that still sounds creepy, ask yourself this: would you prefer a world in which our emotions are manipulated, but the manipulators remain ignorant of the consequences of their own actions? What if the manipulators know exactly what they're doing ... but don't tell anyone about it? Is that really a world you want to live in?

If anything, we should insist that companies like Facebook – and governments for that matter – perform and publish research on the effects of the decisions they're already making on our behalf. Now that it's possible, it would be unethical not to. And it would be disastrous if a poorly informed outcry over a single study had the effect of driving them in the opposite direction – either to willful ignorance or to secrecy.

Social science is less about trendy topics like email networks and social media sites than it is about the grand problems of society – managing the impact of climate change, regulating financial markets, building more efficient organizations, designing fairer economies. Yes, all research needs to be conducted ethically, and social scientists have an obligation to earn and keep the public trust. But unless the public truly prefers a world in which nobody knows anything, more and better science is the best answer we have.