In science, the unwritten rule has always been to publish your results first and worry about the fallout later. More knowledge is always good, right? Information wants to be free.

But what if the thing you want to publish is truly frightening? Millions dead kind of frightening.

This isn’t a rhetorical question, in light of some experiments now in the pipeline for publication. H5N1 influenza viruses—a.k.a. avian flu—are efficient killers that have wiped out poultry flocks and killed a few hundred hapless people who were in close contact with the birds. (New Scientist reports that 565 people are known to have caught the bird flu, and 331 of them died.) But at Erasmus Medical Center in Rotterdam, the Netherlands, virologist Ron Fouchier has created an avian flu strain that, unlike other H5N1 strains, spreads easily between ferrets—which have so far proven a reliable model for determining transmissibility in humans. What’s more, his breakthrough, funded by the National Institutes of Health, involved relatively low-tech methods.

Are you scared yet? You have reason to be. In the December 2 issue of Science magazine, Fouchier admits that his creation “is probably one of the most dangerous viruses you can make,” while Paul Keim, a scientist who works on anthrax, adds, “I can’t think of another pathogenic organism that is as scary as this one.” (Here’s a summary; you’ll need a subscription to read the full text, even though you probably paid for it already.)

Now Fouchier hopes to publish the results of experiments—first announced in September at a meeting of flu researchers in Malta—that many scientists believe should never have been done in the first place. He and Yoshihiro Kawaoka, a virologist at the University of Wisconsin who is reportedly seeking to publish a similar study, have long pursued this line of research, hoping to determine whether H5N1 has the potential to become infectious in people, a jump that could trigger a worldwide pandemic. Knowing the specific genetic mutations that make the virus transmissible, Fouchier told Science, will help researchers respond quickly if this sort of killer virus were to emerge in nature.

This type of research is euphemistically known as “dual-use,” which means it could be used for good or evil. Publishing such work is a “risk-benefit calculation,” Donald Kennedy, then editor-in-chief of Science, told me for a story published on the first anniversary of 9/11. Science, Kennedy said, had never rejected an article out of concern that the information could be misused, although, he added, “I suppose one could conceive of a scenario in which one would decline to publish.”

“If I were a journal editor and I received an article that said how to make a bioweapon, I’d never publish it, but that would be based on self-regulation, not any government restriction,” added bioterror expert and retired Harvard professor Matt Meselson. “I’ve never heard of a case where the government has restricted publication. I don’t think it would work.”

Kawaoka, whose lab has also published methods for reconstituting a pathogenic virus from its DNA sequence, didn’t respond to Science, but when I talked to him back in 2002, he was adamant that dual-use data should be published. He argued that even recipes for nuclear weapons exist online, and that once you start censoring potentially dangerous results, you may as well ban knives and guns and even airplanes—the terrorists’ weapon of choice the previous September.

What most troubles critics of Fouchier’s experiments is the lack of any meaningful review before they were conducted. Some scientists think that any work this dangerous should be vetted by an international panel; others reject the notion, fearing that such a move would create an unacceptable bottleneck in the flow of scientific information.

Back in 2002, I also spoke with Brian Mahy, a virologist with the Centers for Disease Control and part of the team that had sequenced smallpox and several other highly dangerous pathogens in the early 1990s. Toward the end of the smallpox project, Mahy told me, the team had internal debates about whether to go public with the sequences. “My view is it was scientific evidence that needed to be in the public domain, and we’re a public institution, so we published it,” he said. “There were suggestions it be burned onto a CD-ROM and chained to [then-CDC chief] Bernadine Healy’s desk.”

But such decisions, then and now, have been left largely in the hands of the researchers. The U.S. National Science Advisory Board for Biosecurity, an NIH advisory panel, is currently reviewing the Fouchier and Kawaoka papers, according to Science. But in 2007, the board recommended against mandating prior reviews of dual-use research. Instead, it suggested that scientists alert their institutional review boards to any experiments of concern—something they were supposed to be doing already. Keim, who sits on the NSABB, told Science that any potential risks should be flagged at “the very first glimmer of an experiment…You shouldn’t wait until you have submitted a paper before you decide it’s dangerous.”

These particular experiments, it’s safe to say, were exceedingly strong candidates for scrutiny.

UPDATE (Dec. 20, 2011): US officials are asking both teams of flu researchers to withhold certain key details from their published findings. The journals in question appear willing to comply with this unprecedented request, so long as they can ensure that qualified researchers will have access to the full data.

UPDATE (Feb. 17, 2012): It now appears that the work will be published without redaction. A World Health Organization panel has reached a “strong consensus” on the topic—though not a unanimous one, as NIAID director Anthony Fauci told the New York Times. But parts of the WHO consensus document are suspect, in my opinion: “The group recognized the difficulty of rapidly creating and regulating such a mechanism in light of the complexity of international and national legislation,” it concludes. “A consensus was reached that the redaction option is not viable to deal with the two papers under discussion in view of the urgency of the above mentioned public health needs. The participants noted there may be a need for such a mechanism in the future.”

They seem to be implying that this natural virus, which has been circulating for quite a while now, is so likely to acquire the five distinctive mutations it needs to jump between mammals that we must rush to publish a recipe for it, rather than take the time to devise a system to safeguard the information. I don’t buy it. The final line above seems laughable: Nope, no need to worry about this one. But maybe some other, even more deadly plague will come along one day, requiring us to set up such a system. I’m not a public health expert, but this doesn’t pass the smell test.