In both TV crime dramas and real-life courtrooms, fingerprints are often the linchpin connecting a criminal to a crime. Many studies have demonstrated that the loops, whorls, and arches on an individual’s “friction ridge skin” are unique enough to be admissible as evidence, but few have investigated whether they remain the same over time. It turns out that fingerprints do evolve, but only slightly: A statistical analysis published today in the Proceedings of the National Academy of Sciences found that fingerprints change over time, but not enough to affect forensic analyses.

The study followed 15,597 subjects, whose prints were taken at least five times over a minimum of five years. The results show that longer intervals between printings reduced the odds of correctly matching a print to a finger in the database, but only by an operationally inconsequential amount. Further, the scenario in which an innocent defendant would be wrongfully convicted (the machine finding a match even though there isn’t one) was even less likely, with a probability close to zero regardless of the time between printings. Overall, the best predictor of mistakes was the quality of the image: Poor images yielded more errors, leading the team to conclude that image quality explains more of the variation than elapsed time does.