A study has tallied up the costs to a major US research funder of misconduct that led to retractions, and the price paid by the scientists involved. There are significant indirect costs not covered by this figure, however, and the system itself may be encouraging misconduct, according to the senior author.

Papers retracted due to misconduct accounted for approximately $58 million (£35 million) in direct funding by the US National Institutes of Health (NIH) between 1992 and 2012, far less than 1% of the total budget over this time, the study found. The impact on those sanctioned was also tallied, using information available from the Office of Research Integrity (ORI). This office has authority over research supported by the Department of Health and Human Services in the US.

Direct and indirect costs

Each journal article retracted cost an average of $392,582 in direct costs. Ferric Fang, senior author and clinical microbiologist at the University of Washington, US, says that the retractions were primarily for data fabrication and data falsification. He admits that he expected the total cost of misconduct to be larger.

Perhaps the biggest cost is to the credibility of science

Even multiplying the total cost of misconduct 100-fold would only account for 1–2% of the NIH budget over the last 20 years. ‘We conclude that the direct fiscal cost of fraudulent research is relatively small, as far as fraudulent use of government funds is concerned,’ says Fang.

That is not to downplay the impact of misconduct. ‘The major costs cannot be measured in this way, costs you cannot put a dollar sign on,’ Fang emphasises. These include bad decisions made due to fraudulent research in the literature, other scientists wasting time trying to build on falsified findings and public policy being misdirected. ‘But perhaps the biggest cost is to the credibility of science, as it can undermine public support for research and that could be hazardous for society,’ Fang concludes.

‘Articles such as this underscore our need to be vigilant,’ comments Arthur Michalek, an epidemiologist at the University at Buffalo, US, who has looked at the consequences of research misconduct. ‘We can also rest somewhat easy in that the vast majority of research is conducted in the utmost ethical manner.’

Bias in science

Daniele Fanelli, visiting professor at the University of Montreal, Canada, who recently looked at how many scientists falsify data, sees another important message in the findings. Amid the enthusiasm for finally dealing with this issue, we risk focusing excessively on dishonesty in science, which is probably not the biggest source of distortion and false results, he says.

‘The risk is you put all your efforts in trying to make scientists more honest or punish dishonest scientists, but that is not going to address the largest sources of bias in science,’ says Fanelli, referring to a culture of playing the system and trying to squeeze statistical significance out of data.

‘There are much more subtle forms of bias, which escape the awareness of scientists themselves, and that is where a lot of false positives come from,’ he adds. He believes an investment here would save more in funds, resources and human and animal suffering than attempting to catch dishonest scientists.

The study also totted up the number of publications per year for senior authors before and after a finding of misconduct. Researchers suffered a median 91.8% fall in publication output. Of 44 authors with at least one publication in the three years before an ORI finding, 24 had not published anything three years later. Funding for these scientists fell from around $23 million in the five years before the ORI report to less than $7 million over the following five years.

‘In the majority of cases a finding of fraud resulted in the end of the academic career, particularly at the senior level,’ says Fang. There were exceptions where authors continued to publish as much or more, suggesting a finding of misconduct need not necessarily end a research career. However, this was unusual, and most of these cases related to ethical violations other than data fabrication or falsification.

Big picture

Fang says misconduct is on the rise, but whether this is due to increasing dishonesty or better detection is hard to say; personally, he believes it is both. ‘I think more people can expect that if they commit misconduct they are going to be caught.’

He points the finger of blame at the system. Research on how to foster creativity suggests you need to give people an environment where they have the security to pursue ideas and go in different directions, even though some are going to be low yield, he says. This characterised science in the early post-war period when money was plentiful, the pressure to publish was not as great and the scientific community was relatively small. Young scientists had a reasonable chance of forging a research career too. The situation today is dramatically different, Fang notes.

‘The number of unsuccessful grants has gone up several hundred per cent and the imbalance between available positions and available trainees continues to worsen,’ says Fang. ‘The hyper-competition means people worry about having a career in science. This is a very unhealthy situation if you want the output to be robust, reliable, carefully conducted science that you are going to be able to build future work on. It is a recipe for disaster,’ he warns.