It can be very challenging for the public to evaluate evidence when faced with two ostensible experts who stake out opposing positions on a topic. Since a 1993 ruling by the Supreme Court in Daubert v. Merrell Dow Pharmaceuticals, that burden has been largely lifted from those serving on juries; instead, judges are now expected to act as gatekeepers, determining what evidence is scientific and keeping the rest out of the courtroom. The US National Academy of Sciences (NAS) has played a role in getting the judiciary up to snuff on scientific knowledge, but two new reports from the group highlight the challenges of putting the legal system on a firm scientific foundation.

The first is a progress report on revisions to the Reference Manual on Scientific Evidence, a document used to train the judiciary on scientific matters. The report highlights some areas where judges are still having trouble coming to terms with scientific approaches and terminology, especially when it comes to statistics. The urgency of that training is underscored by a draft report on the use of forensic evidence in the US, which was leaked to The New York Times. That draft suggests that many of the forensic "experts" testifying in criminal cases don't know statistics either, and have no sense of the potential problems with their techniques.

We'll do the good news first: the judiciary is rapidly coming to grips with scientific evidence. The Reference Manual on Scientific Evidence was produced by the NAS in the wake of Daubert, and plans are underway for a third edition. The progress report notes that the second edition is now used in law schools and a variety of judicial training programs, and is widely regarded as essential for a number of them. The fact that judges rely on it has forced many lawyers to adopt it as well, and over 100,000 copies have now been sold.

That said, the committee has identified a number of problems with the current edition. Chief among them is the treatment of statistics; judges apparently have trouble following the field's basic terminology, as well as how it is applied in specialties like epidemiology. A section on engineering issues, added in the wake of subsequent court rulings, was apparently considered superficial; various judges recommended that it either be significantly expanded or dropped entirely. In light of this feedback, the report recommends largely retaining the current edition's structure and focus, but adding material on some areas of science that are showing up in the courtroom with greater regularity: forensic sciences, genetics, pharmacology, and computer science.

The need to get judges up to speed on forensics was highlighted by the story that appeared in The Times, which obtained a copy of a report on the field, also prepared by the NAS, that will be released later this month. The report is described as a "sweeping critique" of nearly all aspects of forensic work in the US, which suffers from a lack of standards for both training and practice. Absent national standards, many of those practicing forensics have little background in science, statistics, or empirical methods and, as a result, rely on techniques based on out-of-date information and untested principles. Because they don't know any better, these same individuals testify as forensic experts in court, describing their methods with an exaggerated sense of their robustness.

The Times quotes one expert as saying an arm of the Justice Department actually tried to block the review from going forward, and only funding by Congress ensured its completion.

There have been any number of high-profile cases in which individuals were exonerated after forensic evidence, from blood typing to fingerprinting, was used to arrest and/or convict them, only to be found less than compelling on reexamination. Although there are some highly qualified individuals working in the field, the news makes it clear that a fair number of practitioners operate at a level well below what their job responsibilities would seem to require. And, based on the NAS review, it appears that a lack of statistical training has left the judiciary in no shape to enforce any sort of robust standards.