Scientific publishers are backing an initiative to encourage authors of high-profile research papers to get their results replicated by independent labs. Validation studies will earn authors a certificate and a second publication, and will save other researchers from basing their work on faulty results.

The problem of irreproducible results has gained prominence in recent months. In March, a cancer researcher at the pharmaceutical company Amgen, based in Thousand Oaks, California, reported that the company's scientists had repeated experiments from 53 'landmark' papers but managed to confirm the findings of only six1.

And last year, an internal survey at Bayer HealthCare, headquartered in Leverkusen, Germany, found that inconsistencies between published findings and the company’s own results caused delays or cancellations in about two-thirds of projects2.

Elizabeth Iorns, chief executive of Science Exchange — a commercial online portal that matches scientists with experimental service providers — noticed that a number of drug companies were employing researchers to validate published results. This prompted her to develop the Reproducibility Initiative, a mechanism for replicating research results, with a particular focus on preclinical biological studies.

“There has been a lot of negative press around the reproducibility problem,” says Iorns. “I think this is the first time that anyone has tried to do something positive.”

The Reproducibility Initiative will work through Science Exchange, which is based in Palo Alto, California. At first, Iorns says, authors will pay for validation studies themselves. However, she hopes that funding agencies will eventually support replication.

Blinded checks

Authors will submit studies to an advisory board, which will select the experiments that are crucial to the findings. These experiments will be replicated by experimental providers chosen by the board; to keep the checks independent, the authors of the original studies will not be told the identity of the providers, says Iorns.

Once the validation studies are complete, the original authors will have the option of publishing the results in PLoS ONE, linked to the original publication. Authors can also deposit primary data in the open-access repository Figshare.

Journal publishers including Nature Publishing Group in London and Rockefeller University Press in New York have expressed support for the initiative and would be happy to see papers they publish being validated.

Muin Khoury, who directs the US National Office of Public Health Genomics at the Centers for Disease Control and Prevention in Atlanta, Georgia, welcomes the project. "The Reproducibility Initiative will begin to address huge gaps in the first of many translational steps from scientific discoveries to improving health," he says.

John Ioannidis, an epidemiologist at Stanford University in California, is on the initiative's scientific advisory board. He expects only authors of high-profile papers to submit their work to extra scrutiny, and says that the project could help the scientific community to recognize experimental design flaws. “A pilot like this could tell us what we could do better,” he adds.

Getting confirmatory data into the public domain will help both industry and academia to advance exciting results, says Christopher Haskell, head of Bayer HealthCare's US innovation centre, the Science Hub in San Francisco, California. But he warns that the project will work best when people understand that science is rarely straightforward. Even if a validation study does not replicate the original findings, he says, that does not necessarily mean that the original paper is wrong. “It may be right, but just hard to reproduce.”

Iorns says that although the average reproducibility rate for published studies is very low, she expects the studies voluntarily submitted to the initiative to fare much better, with around 80% checking out.

So what if the results don't reproduce? "That's going to be very interesting," Iorns says. "We won't force anyone to publish [validation study] results."



