In an unprecedented move, the U.S. government asked scientific journals not to publish the details of experiments on the deadly H5N1, for fear that the information could be used with malice. Is such censorship smart?

[Image: Colorized transmission electron micrograph of avian flu A (H5N1) viruses, seen in brown. Kallista / Getty Images]

H5N1 avian flu rarely infects humans, but it is deadly when it does. Since the virus first emerged in humans in Hong Kong in 1997, nearly 600 people have been infected worldwide and almost 60% have died.

The virus isn’t very transmissible, but scientists have long worried that it might mutate, perhaps through reassortment with a human flu strain, and gain the ability to pass easily from person to person like human flus, such as the H1N1/A strain that triggered a pandemic in 2009. More than a decade since its emergence in humans, however, that fear has yet to come true, and H5N1 remains only an occasional threat for the rare person who contracts it — usually from close contact with a sick bird.

If H5N1 gained the ability to spread virulently, we might face another world-changing virus like the 1918 flu, but so far, at least, we’ve been lucky.

But just because nature hasn’t figured out a way to create an easily transmissible H5N1 doesn’t mean that scientists can’t. In experiments conducted at the University of Wisconsin in Madison and Erasmus University in Rotterdam, the Netherlands, researchers engineered a strain of H5N1 that spread easily between ferrets — which means it can probably spread easily between people. (Ferrets are a commonly used animal model for studying human flu.)

It’s not clear how the scientists did it — most of the information has been coming out piecemeal in scientific presentations and interviews since September. The next logical step would be for the researchers to publish studies in major scientific journals, describing the newly created flu, including its genetic makeup. And that would mean that anyone with the proper scientific training — from another researcher to a terrorist — would likely be able to read the studies and potentially make the new H5N1 themselves.

Cognizant of that risk, on Tuesday the U.S. government did an unprecedented thing: it asked scientific journals not to publish the details of the H5N1 experiments, for fear that the information could fall into the wrong hands and be used to create a bioweapon. But while it seems likely that the two journals in question — Science and Nature — will hold off from publishing the ingredient list for the super-H5N1, they will almost certainly release papers on the studies and their conclusions. And that alone might be enough for a dedicated reader to figure out the recipe for themselves.

In a post on Tuesday, writer Laurie Garrett — whose piece in Foreign Policy on Dec. 15 was the deepest look yet into the man-made-H5N1 controversy — laid out possible options for the National Science Advisory Board for Biosecurity (NSABB), the government panel that helps oversee such research:

The NSABB faced three basic options regarding publication of papers by Ron Fouchier of Erasmus University in Rotterdam and Yoshi Kawaoka of the University of Wisconsin in Madison: 1.) Advise all credible scientific publications to decline release of the papers, essentially censoring the work; 2.) Allow full and free publication of both papers; 3.) Advise publication, but with key passages related to how the feats were performed, deleted. The NSABB essentially opted for #3, suggesting to Science, Nature and other major journals that they agree to publish the two studies, but omit some of the materials and methods sections, allowing scientists to know what was done, but not how.

Here’s how the NSABB explained its decision:

Following its review, the NSABB decided to recommend that HHS [Department of Health and Human Services] ask the authors of the reports and the editors of the journals that were considering publishing the reports to make changes in the manuscripts. Due to the importance of the findings to the public health and research communities, the NSABB recommended that the general conclusions highlighting the novel outcome be published, but that the manuscripts not include the methodological and other details that could enable replication of the experiments by those who would seek to do harm.

It might seem crazy that researchers are even carrying out such experiments, given the risk — however small — that an engineered and efficient H5N1 could escape the lab and trigger a pandemic out of a Stephen King novel. But as the NSABB acknowledged, such studies are important for helping scientists better understand a virus, which then helps them better understand how to fight it.

Publishing redacted studies could be problematic, hiding just enough data to make the research difficult for other scientists to use, while not holding back enough to keep a dedicated bioterrorist from deducing the ingredients to the superflu. Science editor Bruce Alberts said in a statement on Tuesday:

The NSABB has emphasized the need to prevent the details of the research from falling into the wrong hands. We strongly support the work of the NSABB and the importance of its mission for advancing science to serve society. At the same time, however, Science has concerns about withholding potentially important public-health information from responsible influenza researchers. Many scientists within the influenza community have a bona fide need to know the details of this research in order to protect the public, especially if they currently are working with related strains of the virus.

It’s important to understand how unprecedented this situation is. Scientific journals — indeed, scientists themselves — are dedicated to the idea of the free and open dissemination of research through a peer-reviewed publication system. It is the bedrock of modern science, and published papers, especially in high-profile journals like Science and Nature, are the coin of the academic realm. No journal would agree to censor a study without good reason, as Garrett wrote in her Foreign Policy piece:

If these scientists have indeed used the techniques that they have verbally described (but not yet published) to produce a highly contagious and virulent form of the so-called “bird flu,” the feat can at least theoretically be performed by lesser-skilled individuals with nefarious intentions. Perhaps more significantly, the evolutionary leaps might be made naturally, via flu-infected birds, pigs, even humans. In other words, the research has implications for both terrorism and a catastrophic pandemic. Moreover, several experimental antecedents involving smallpox-like viruses and polio lend credence to the idea that concocting or radically altering viruses to create more lethal or transmissible germs is becoming an easier feat and an accidental byproduct of legitimate research.

Garrett puts her finger on another issue here: even if journals do decide to hold back on publishing the full studies, there is already work being done on such viruses in labs around the world. And these are not all the ultra-safe, BSL-4 labs that the Centers for Disease Control and Prevention has — the airlocked facilities seen in films like Contagion, where space-suited researchers work on deadly microbes like smallpox or Ebola. The work on the new H5N1 so far has been done in BSL-3 enhanced labs, which have high-efficiency air filters and which require scientists to shower and change clothes when leaving the lab — in other words, safe, but perhaps not safe enough. Worse, there’s little real oversight, with safety left largely to individual researchers.

As Declan Butler writes in Nature, the risk of a virus escaping from labs — even relatively safe ones — is far from zero:

Over the past decade, severe acute respiratory syndrome (SARS) has accidentally infected staff at four high-containment labs in mainland China, Taiwan and Singapore, variously rated as BSL-3 and BSL-4. A US National Research Council report released in September detailed 395 biosafety breaches during work with select agents in the United States between 2003 and 2009 — including seven laboratory-acquired infections — that risked accidental release of dangerous pathogens from high-containment labs. And the rapid spread of an escaped flu virus would make it more dangerous than other deadly pathogens. “When SARS or BSL-4 agents get out, their potential for transmission on a global basis is quite limited,” says Michael Osterholm, who heads the University of Minnesota’s Center for Infectious Disease Research and Policy in Minneapolis, and is a member of the NSABB. “Influenza presents a very difficult challenge because if it ever were to escape, it is one that would quickly go round the world.”

On the other hand, if work on the modified H5N1 were indeed restricted to BSL-4 facilities, it would surely slow the pace of research needed to develop vaccines and other countermeasures, simply because so few labs have that level of safety. And the more we withhold information about the modified H5N1, the less useful the work itself might be in helping the flu community prepare for a naturally transmissible avian flu. It’s a trade-off — as is the decision to publish or not publish research on the new H5N1.

Thanks to technology, science has become ever more decentralized, with obvious advantages. A broader band of researchers can produce deeper work, then share those studies with colleagues faster than ever before — which, in turn, encourages more discovery. In two recent infectious-disease emergencies — SARS in 2003 and H1N1/A in 2009 — teams of scientists around the world were able to collaborate in real time to discover the new viruses, and that speed almost certainly saved lives. But the decentralization brought about by technology brings risks as well. Biosafety is only as strong as its weakest link, and an accident — or an act by a single malicious person — could have catastrophic effects.