I don’t recall this ever happening before. It probably has, but if so, it’s rare enough that I have never heard about it. The strange odyssey of the paper, Long term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize, by Seralini et al., just got stranger.

The paper was published in 2012 in Food and Chemical Toxicology. It was greeted with intense criticism from the scientific community for its many shortcomings, culminating in the journal retracting the study. Now the study has been republished by a new open access journal, Environmental Sciences Europe.

The GMOSeralini website is celebrating the republication, writing:

“Now the study has passed a third peer review arranged by the journal that is republishing the study, Environmental Sciences Europe.”

They also claim that the article was retracted because of pressure from pro-GMO lobbying.

Nature, however, reports that the journal’s editor-in-chief, Henner Hollert, communicated to them:

“We were Springer Publishing’s first open access journal on the environment, and are a platform for discussion on science and regulation at a European and regional level.” ESEU conducted no scientific peer review, he adds, “because this had already been conducted by Food and Chemical Toxicology, and had concluded there had been no fraud nor misrepresentation.” The role of the three reviewers hired by ESEU was to check that there had been no change in the scientific content of the paper, Hollert adds.

So the paper was not re-peer-reviewed, despite the claim on the GMOSeralini website. It was simply republished, with the addition of more raw data and commentary by Seralini.

I blogged about the paper when it was first published. The paper claims that exposure to GM corn resistant to Roundup, or to Roundup itself, increased tumors in a rat model. Seralini and his coauthors got off to a terrible start by sending a press release to reporters but not allowing them to seek outside comment from independent experts. The paper itself was quickly shredded by expert review. Here is a summary of the criticism:

– The population of rats used has a high propensity for tumors. This creates a great deal of background noise and would likely favor a false positive result.

– There were only 20 rats in the control group but 80 in the exposure groups – an atypical asymmetry.

– The paper reports that “some” of the test groups had a higher tumor incidence while others did not – which sounds suspiciously like cherry-picking the data.

– The statistical analysis done by the team was atypical, characterized by nutrition researcher Tom Sanders as “a statistical fishing trip,” while a more standard analysis was excluded.

– Exposure to GM corn or to the herbicide Roundup reportedly had the same negative effects. It is inherently implausible (admittedly not impossible) that such distinct mechanisms would have the same effect.

– There was no dose response at all – which is a critical component of demonstrating a toxic effect.

– The researchers did not control for the total amount of food consumed or for fungal contaminants, both of which increase tumors in this population of rats.
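None of this requires fraud – a noisy system plus many uncorrected comparisons will generate “hits” on its own. A minimal Monte Carlo sketch illustrates the point (the numbers here are illustrative assumptions, not the study’s actual design: a high background tumor rate, a small shared control group, and several treatment groups that all share the exact same true tumor rate, so any “significant” group is by construction a false positive):

```python
import math
import random

random.seed(1)

def two_prop_p(x1, n1, x2, n2):
    """Two-sided p-value for a difference in proportions (normal approximation)."""
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    if se == 0:
        return 1.0
    z = (x1 / n1 - x2 / n2) / se
    # standard normal CDF via math.erf
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def one_experiment(base_rate=0.5, n_control=20, n_groups=9, n_per_group=20):
    """Every group has the same true tumor rate, so any 'hit' is spurious."""
    control = sum(random.random() < base_rate for _ in range(n_control))
    hits = 0
    for _ in range(n_groups):
        treated = sum(random.random() < base_rate for _ in range(n_per_group))
        if two_prop_p(treated, n_per_group, control, n_control) < 0.05:
            hits += 1
    return hits

trials = 2000
at_least_one = sum(one_experiment() > 0 for _ in range(trials))
print(f"Experiments with >=1 spurious 'significant' group: {at_least_one / trials:.0%}")
```

With nine comparisons each tested at the 0.05 level, the chance of at least one nominally significant result in a null experiment lands far above 5 percent – which is why reporting only the groups that “showed an effect” is so misleading.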

A careful review of the pathology presented in the study was published as a comment, and concluded:

The ESTP comes to the conclusion that the pathology data presented in this paper are questionable and not correctly interpreted and displayed because they don’t concur with the established protocols for interpreting rodent carcinogenicity studies and their relevance for human risk assessment. The pathology description and conclusion of this study are unprofessional. There are misinterpretations of tumors and related biological processes, misuse of diagnostic terminology; pictures are not informative and presented changes do not correspond to the narrative. We would like to finish our commentary with a question: what is the scientific rationale that led the journal reviewers and the editorial board of Food and Chemical Toxicology to accept this article for publication?

It was stinging commentary such as this that likely led to the paper’s retraction. My own summary would be – this was a small study with sloppy methods using a noisy system and reeking of p-hacking.

Of course, this one small study is getting so much attention because of the political nature of the question – the safety of GMOs. Some in the anti-GMO crowd still use the study to support their position, and they are likely happy with the republication.

Retraction Watch discussed this issue, giving a good quote from journalist Ivan Oransky:

“The ratio of politics to science when it comes to discussions of GMOs [genetically modified organisms] is so high that I think it often ceases to be useful. This is a good example of what happens when people with hardened beliefs manipulate a system for the result they want. Science should be about following the evidence, appropriately changing your mind if the evidence warrants it. But in this case people seem to reject the evidence that doesn’t suit their needs.”

Conclusion

The Seralini GMO rat study is now infamous for its poor quality and overstated conclusions. The republication of the paper extends the saga, but does nothing to correct the many failings of the study.

Defenders of the republication cite the virtues of access to data and open communication, while decrying the “censorship” of the retraction. This completely misses the point, however. The peer-reviewed literature is not just about open communication of data; it also puts into place a heavy filter for quality. A study needs to reach a certain minimum level of quality before it is worthy of consideration as part of the peer-reviewed literature.

If anything, the peer-review process (especially if you consider it across all such journals) should be tightened, not loosened.

There is a further problem with publishing preliminary or exploratory research. Such studies are meant only as an indicator for future confirmatory research, not as a basis for conclusions or recommendations. However, preliminary research is often treated by the press, and therefore the public, as if it were confirmatory (a misreading often encouraged by authors overstating their data).

This problem is exacerbated when the topic is controversial, like GMOs. I would argue that the threshold for publication should be higher for controversial topics; otherwise unreliable data is likely to confuse the public discourse.

Still, scientists need preliminary data to guide later research. The compromise I have suggested is that preliminary research be published with an editorial warning label – this is preliminary research meant only for professionals to guide later research and should not be used as a basis for recommendations, policy, or scientific conclusions. Publishing this study is not a statement by the editors that the results are likely to be true.