One shocking result from the new N.I.S.T. study is that labs analyzing the same evidence calculated vastly different statistics. Among the 108 crime labs in the study, the match statistics varied over 100 trillion-fold. That’s like the difference between soda change and the United States’ gross domestic product. These statistics matter because juries use them to weigh whether a DNA match is more than mere coincidence.

I first learned about the results of this study in 2014, at a talk by one of its authors. It was clear that crime labs were making mistakes, and I expected the results to be published quickly. Peer-reviewed publication is important, because most judges won’t let you cite someone’s PowerPoint slide in your testimony.

But years went by before the study was published, preventing lawyers from using the findings in court and academics from citing the results in journal articles. If some of us had not complained publicly, it might never have been published at all.

While this lapse in publication is troubling, more disturbing is that the authors try to mute the impact of their own excellent work. Neither the paper’s title nor its abstract mentions the shocking findings. And the paper contains an amazing number of disclaimers.

In fact, the conclusion begins with a stark disclaimer apparently intended to block courtroom use:

The results described in this article provide only a brief snapshot of DNA mixture interpretation as practiced by participating laboratories in 2005 and 2013. Any overall performance assessment is limited to participating laboratories addressing specific questions with provided data based on their knowledge at the time. Given the adversarial nature of the legal system, and the possibility that some might attempt to misuse this article in legal arguments, we wish to emphasize that variation observed in DNA mixture interpretation cannot support any broad claims about “poor performance” across all laboratories involving all DNA mixtures examined in the past.

People serving time behind bars based on shoddy DNA methods may disagree. It is uncomfortable to read the study’s authors praising labs for their careful work when they get things right, but offering sophomoric excuses for them when they get things wrong. Scientists in crime labs need clear feedback to change entrenched, error-prone methods, and they should be strongly encouraged to re-examine old cases where such methods were used.

The good news is that there are methods to reanalyze old DNA mixture data using computer programs that can help analysts correct errors, without any new lab testing. In fact, one lesson from the study is that of the 108 labs, only seven properly excluded the innocent profile, and one of those seven used such a program (TrueAllele by Cybergenetics). Many crime labs now have access to these programs and use them on current cases. But they could, and should, easily go back and re-examine old DNA mixtures to correct tragic mistakes.

In fact, we have shown that this is possible. Working with Cybergenetics analysts and Innocence Network organizations in four states, our Boise State University laboratory has re-examined a few select cases and has already persuaded courts to overturn a conviction in New Mexico, two in Indiana and two in Montana. We have also helped identify a new suspect in a 23-year-old murder.