There had to be a sinking feeling last Friday in the chest of every researcher who works in a high-containment laboratory when the U.S. Centers for Disease Control and Prevention (CDC) released its report on three worrisome incidents that raised safety questions at two well-respected government facilities. But the sensation was likely most acute for the influenza scientists who work in a controversial field known as gain-of-function research.

On Friday, CDC Director Tom Frieden revealed that someone in the agency's influenza division had accidentally contaminated a vial of a relatively mild bird flu virus with the worst one known, H5N1. The Atlanta-based CDC then shipped the vial to unsuspecting researchers at the U.S. Department of Agriculture’s Southeast Poultry Research Laboratory up the road in Athens, Ga., who used its contents to infect some unlucky chickens.

There is no suggestion the unfortunate event was anything other than human error, and no one—except the chickens—was made ill as a result of the mistake. But the fact that it happened, and could happen again, has given valuable ammunition to a group of scientists who have been arguing for the past couple of years that gain-of-function work on influenza viruses is too dangerous to undertake.

Such studies take flu viruses found in nature and, in essence, try to make them more dangerous. The aim is to see what it would take for viruses like H5N1, which currently rarely infect people, to gain the ability to easily transmit to and among us. Coughs and sneezes propel human flu viruses through populations, and scientists have found that by adding mutations and passing viruses from ferret to ferret enough times, they can push bird viruses to spread that way among the animals, which often stand in for people in flu research.

The stated scientific aim for such experiments is to speed up the detection of naturally occurring viruses that might acquire these more dangerous skills in the wild. But the end result is the formation of nasty pathogens with the potential to trigger disastrous flu pandemics if they were ever to escape the confines of the labs. After all, in its wild form H5N1 kills about 60 percent of the people it infects.

Ron Fouchier, a Dutch virologist who is one of the biggest names in gain-of-function research, believes the CDC accident is going to make life more difficult for all those working on dangerous pathogens, not just those in the gain-of-function field. “When incidents like this happen, it’s going to be bad for all of us,” says Fouchier, who is based at the Erasmus Medical Center in Rotterdam.

These studies are done in so-called biosafety level (BSL) 3-enhanced labs, which have layers of safeguards in place to keep unauthorized people out and pathogens in as well as to ensure lab workers do not become infected in the course of their work. These precautions are there to protect the lab workers, obviously. But they are also there to protect the public by making sure that the researchers and technicians do not serve as unwitting carriers who wind up spreading these germs once they leave the lab. Indeed, the Erasmus scientists who work with Fouchier on H5N1 gain-of-function work are among a very few people on Earth to have been vaccinated against the bird flu virus, although that is not true of all flu scientists doing this type of research.

Fouchier insists this work should be done and can be done safely. “I would really doubt that these things would happen in my laboratory,” he says of the CDC incident. “But of course I understand that the director of CDC would have said the same thing.” Still, he knows critics of the field will hold up this incident as proof that even the best laboratories are subject to human error; the CDC flu lab, after all, is considered among the best in the world.

He is right—critics of the gain-of-function work will use this ammunition. In fact, one of them virtually predicted the accident: Harvard School of Public Health professor Marc Lipsitch, who has become one of the leading voices in the debate against this research, recently wrote an op-ed in The New York Times warning of the possibility. The article was written last month, after the CDC reported an earlier lab accident that resulted in dozens of workers being exposed to deadly anthrax. “Unlike experiments with anthrax, creating such flu strains in the lab presents a danger that affects us all because once it is out, such a strain would be extremely hard to control. The researchers involved note that their labs are very safe, and they are. But ‘very safe’ does not mean the risk is zero,” Lipsitch wrote in the article.

It’s not just a theoretical risk. In 1977 a flu virus swept the world in an event that became known as the Russian flu. It was caused by a strain of flu that, later genetic tests showed, looked remarkably like those that had circulated in 1950. The belief is that the virus escaped from a laboratory or was used in a vaccine project that went awry. “It’s made me look really prophetic,” Lipsitch said in an interview about the CDC’s H5N1 incident, but he insists the prediction was an easy one to make. The CDC, which oversees laboratory safety in the U.S., reported in 2011 that lab errors involving select agents—the most dangerous pathogens—occur on average twice a week in U.S. labs. “It’s not hard to predict rain in England and it’s not hard to predict select agent mistakes in the U.S. It’s just been a bad week for government labs. Or a bad few weeks,” he said.

The “bad few weeks” remark refers to the fact that a trifecta of embarrassing mistakes made by premier U.S. health institutions has been revealed in the past month. First there was the anthrax event. Then long-forgotten vials of what turned out to be viable smallpox virus were found cached in a refrigerator in a U.S. Food and Drug Administration lab on the National Institutes of Health campus in Bethesda, Md. (Only two laboratories, the CDC and one in Novosibirsk, Russia, are authorized to store samples of the highly contagious virus, which was declared eradicated in 1980. All others were supposed to have been rounded up and destroyed decades ago.) Then came word of the H5N1 incident, which went unreported to CDC and presumably USDA leadership for several weeks. “The common feature of these three problems is that they involved human error. And no matter how many palm readers and ventilation systems and redundant whatever you have, you can’t stop human error,” Lipsitch says. “And so it’s directly relevant for the gain-of-function debate because the claim is these labs are somehow exceptionally safe. They may be exceptionally secure, but they’re not exceptionally safe if there are people working in them.”

Lipsitch and like-minded scientists have been having a hard time gaining traction for their arguments, but this incident may change that. There is already talk of more oversight for labs working on dangerous pathogens; on Friday CDC Director Frieden spoke of “coning down” on risky research, saying fewer labs should be working on bad bugs and those that do should use the least dangerous ways to do the work.

There is already some oversight of gain-of-function work—but only if it is done with U.S. government funding. Such work must first be approved by the NIH. Work done in private labs, or in other countries without U.S. funding, does not have to jump through that hoop. (Chinese scientists are also doing flu gain-of-function research.) Moreover, a group that was asked to do prepublication scrutiny of two of the first gain-of-function studies, the National Science Advisory Board for Biosecurity (NSABB), has not met since 2012 when, under pressure, it backed off its insistence that the papers should be published in redacted form.

Michael Osterholm, director of the University of Minnesota’s Center for Infectious Disease Research and Policy, was one of the NSABB members who felt scientists should not be publishing instructions for how to endow flu viruses with pandemic potential. Two years after that impasse, gain-of-function studies have been published at a steady clip. “One of the things that’s concerning here is that we continue to neglect the fact that by doing this work in some labs and publishing it, you then do empower many labs to potentially do the work they couldn’t do before,” Osterholm says. “And if we’ve got lab problems in our best labs, does anyone really expect that the other labs of the world are going to do any better in terms of lab safety?”

Osterholm believes in the wake of the CDC incidents Congress will move to restrict gain-of-function work, which, despite his earlier position, he actually thinks would be a bad idea. “When you start running your science by political vote, then you really are in trouble. And scientists have gotten themselves in this position.... We’ve had a certain hubris: ‘Leave us alone, we know what we’re doing.’ And look where that’s got us.”

Interestingly, Lipsitch shares Osterholm’s concern, saying he fears that efforts to rein in flu research will have an impact on other work as well. “Things like work on SARS…[which] is clearly more scientifically justified than work on invented flu strains—especially because there is no vaccine—could get curtailed.” Lipsitch thinks the answer could lie with the people who write the checks for research. “I think if funding dried up—even from one or two major funders—people would find other ways to spend their time.” Current funders include the U.S. government (through the NIH and other agencies), the European Union and the Chinese government.

Update (July 17, 2014): The NSABB is about to be revived. But the 11 remaining members (of a total of 23) who were on the board when it dealt with the gain-of-function controversy have been informed they are being replaced; several have noted publicly that this effectively eliminates the NSABB's institutional memory on this key issue.
