Late one afternoon in the spring of 1998, a police detective named Shirley McKie stood by the sea on the southern coast of Scotland and thought about ending her life. A promising young officer, the thirty-five-year-old McKie had become an outcast among her colleagues on the Strathclyde police force. A year earlier, she had been assigned to a murder case in which an old woman was stabbed through the right eye with a pair of sewing scissors. Within hours of the killing, a team of forensic specialists had begun working their way through the victim’s house. Along with blood, hair, and fibres, the detectives found some unexpected evidence: one of the prints lifted from the room where the murder took place apparently matched the left thumb of Detective McKie.

Crime scenes are often contaminated by fingerprints belonging to police officers, and investigators quickly learn to eliminate them from the pool of suspects. But McKie said that she had never entered the house. Four experts from the Scottish Criminal Record Office—the agency that stores and identifies fingerprints for Scotland’s police—insisted, however, that the print was hers. Though McKie held to her story, even her father doubted her. “I love my daughter very much,” Iain McKie, who served as a police officer in Scotland for more than thirty years, told me earlier this year. “But when they said the print was Shirley’s I have to admit I assumed the worst. My entire career I had heard that fingerprints never lie.”

Nobody actually suspected McKie of murder, and in fact the victim’s handyman, David Asbury, was charged with the crime. The sole physical evidence against him consisted of two fingerprints—one of his, lifted from an unopened Christmas gift inside the house, and one of the victim’s, found on a biscuit tin in Asbury’s home. The last thing prosecutors needed was for their own witness to raise questions in court about the quality of the evidence. Yet McKie did just that—repeating under oath that she had never entered the house. Asbury was convicted anyway, but Scottish prosecutors were enraged by McKie’s testimony. As far as they were concerned, McKie had not only lied; she had challenged one of the evidentiary pillars of the entire legal system. Despite their victory in the murder trial, they charged McKie with perjury.

Desperate, she went to the public library and searched the Internet for somebody who might help her. Among the names she came upon was that of Allan Bayle, a senior forensic official at New Scotland Yard and perhaps the United Kingdom’s foremost fingerprint expert. (It was Bayle’s expertise and supporting evidence that helped convict one of the principal Libyan suspects in the 1988 bombing of Pan Am Flight 103, over Lockerbie, Scotland.) He agreed to review the prints, and what he saw astonished him. “It was obvious the fingerprint was not Shirley’s,” Bayle told me recently. “It wasn’t even a close call. She was identified on the left thumb, but that’s not the hand the print was from. It’s the right forefinger. But how can you admit you are wrong about Shirley’s print without opening yourself to doubt about the murder suspect, too?” Bayle posted a comment on Onin.com, a Web site trafficked regularly by the world’s fingerprint community. “I have looked at the McKie case,” he wrote. “The mark is not identical. I have shown this mark to many experts in the UK and they have come to the same conclusions.”

Bayle’s assertion caused a furor. He was threatened with disciplinary action, shunned by his colleagues, and, after a quarter century with the Metropolitan Police, driven from his job. But in the end McKie was acquitted, and Bayle’s statement helped challenge a system that had, until then, simply been taken for granted.

For more than a century, the fingerprint has been regarded as an unassailable symbol of truth, particularly in the courtroom. When a trained expert tells a judge and jury that prints found at a crime scene match those of the accused, his testimony often decides the case. The Federal Bureau of Investigation’s basic text on the subject is entitled “The Science of Fingerprints,” and a science is what F.B.I. officials believe fingerprinting to be; their Web site states that “fingerprints offer an infallible means of personal identification.” The Bureau maintains a database that includes the fingerprints of more than forty-three million Americans; it can be searched from precinct houses and properly equipped police cruisers across the country. Fingerprints are regularly used to resolve disputes, prevent forgery, and certify the remains of the dead; they have helped send countless people to prison. Until this year, fingerprint evidence had never successfully been challenged in any American courtroom.

Then, on January 7th, U.S. District Court Judge Louis H. Pollak—a former dean of the law schools at Yale and at the University of Pennsylvania—issued a ruling that limited the use of fingerprint evidence in a drug-related murder case now under way in Philadelphia. He decided that there were not enough data showing that methods used by fingerprint analysts would pass the tests of scientific rigor required by the Supreme Court, and noted the “alarmingly high” error rates on periodic proficiency exams. Although Judge Pollak later decided to permit F.B.I. fingerprint experts to testify in this particular case, students of forensic science felt his skepticism was justified. “We have seen forensic disciplines which focus on bite marks, hair analysis, and handwriting increasingly questioned in the courts,” Robert Epstein, who had argued for the exclusion of fingerprint testimony in the case, told me. “But we have accepted fingerprinting uncritically for a hundred years.”

Epstein, an assistant federal public defender in Philadelphia, was responsible for the first major court challenge to the discipline, in 1999, in U.S. v. Byron Mitchell. In that case, Epstein showed that standards for examiners vary widely, and that errors on proficiency tests—which are given irregularly and in a variety of forms—are far from rare. The critical evidence consisted of two fingerprint marks lifted from a car used in a robbery. To prepare for the trial, F.B.I. officials had sent the prints to agencies in all fifty states; roughly twenty per cent failed to identify them correctly. “After all this time, we still have no idea how well fingerprinting really works,” Epstein said. “The F.B.I. calls it a science. By what definition is it a science? Where are the data? Where are the studies? We know that fingerprint examiners are not always right. But are they usually right or are they sometimes right? That, I am afraid, we don’t know. Are there a few people in prison who shouldn’t be? Are there many? Nobody has ever bothered to try and find out. Look closely at the great discipline of fingerprinting. It’s not only not a science—it should not even be admitted as evidence in an American court of law.”

Fingerprints have been a source of fascination for thousands of years. They were used as seals on legal contracts in ancient Babylonia, and have been found embossed on six-thousand-year-old Chinese earthenware and pressed onto walls in the tomb of Tutankhamun. Hundreds of years ago, the outline of a hand with etchings representing the ridge patterns on fingertips was scratched into slate rock beside Kejimkujik Lake, in Nova Scotia.

For most of human history, using fingerprints to establish a person’s identity was unnecessary. Until the nineteenth century, people rarely left the villages in which they were born, and it was possible to live for years without setting eyes on a stranger. With the rise of the Industrial Revolution, cities throughout Europe and America filled with migrants whose names and backgrounds could not be easily verified by employers or landlords. As the sociologist Simon Cole made clear in “Suspect Identities,” a recent history of fingerprinting, felons quickly learned to lie about their names, and the soaring rate of urban crime forced police to search for a more exacting way to determine and keep track of identities. The first such system was devised in 1883 by a Parisian police clerk named Alphonse Bertillon. His method, called anthropometry, relied on an elaborate set of anatomical measurements—such as head size, length of the left middle finger, face height—and features like scars and hair and eye color to distinguish one person from another. Anthropometry proved useful, but fingerprinting, which was then coming into use in Britain, held more promise. By the eighteen-sixties, Sir William J. Herschel, a British civil servant in India, had begun to keep records of fingerprints and use them to resolve common contract disputes and petty frauds.

Fingerprinting did not become indispensable, however, until 1869, when Britain stopped exiling criminals to Australia, and Parliament passed the Habitual Criminals Act. This law required judges to take past offenses into account when determining the severity of a sentence. But in order to include prior offenses in an evaluation one would need to know whether the convict had a previous record, and many criminals simply used a different alias each time they were arrested. The discovery that no two people had exactly the same pattern of ridge characteristics on their fingertips seemed to offer a solution. In 1880, Dr. Henry Faulds published the first comments, in the scientific journal Nature, on the use of fingerprints to solve crimes. Soon afterward, Charles Darwin’s misanthropic cousin, Sir Francis Galton, an anthropologist and the founder of eugenics, devised a system for classifying the distinctive ridge characteristics on the tips of fingers—now known as Galton points—which is still in use throughout the world. (Ultimately, though, he saw fingerprints as a way to classify people by race.)

Nobody is sure exactly how Mark Twain learned about fingerprints, but his novel “Pudd’nhead Wilson,” published in 1894, planted them in the American imagination. The main character in the book, a lawyer, earned the nickname Pudd’nhead in part because he spent so much time collecting “finger-marks”—which was regarded as proof of his foolishness until he astounded his fellow-citizens by using the marks to solve a murder. If you were to walk into a courtroom today and listen to the testimony of a typical forensic expert, you might hear a recitation much like Pudd’nhead Wilson’s:

Every human being carries with him from his cradle to his grave certain physical marks which do not change their character, and by which he can always be identified—and that without shade of doubt or question. These marks are his signature, his physiological autograph, so to speak, and this autograph cannot be counterfeited, nor can he disguise it or hide it away, nor can it become illegible by the wear and the mutations of time. . . . This signature is each man’s very own. There is no duplicate of it among the swarming populations of the globe!

Some things have changed since Pudd’nhead Wilson, of course. A few weeks ago, I visited the headquarters of the Integrated Automated Fingerprint Identification System, the F.B.I.’s billion-dollar data center, just outside Clarksburg, West Virginia—a citadel of the American forensic community. After driving past a series of shacks and double-wides and Bob Evans restaurants, you come upon a forest with a vast, futuristic complex looming above the trees. (I.A.F.I.S. moved from more crowded quarters in the Hoover Building in 1995, thanks to the influence of the state’s senior senator, Robert C. Byrd.)

Clarksburg is home to the world’s largest collection of fingerprints; on an average day, forty thousand are fed into the system. The I.A.F.I.S. computers, which can process three thousand searches a second, sort through the database in a variety of ways. For example, they compare complete sets of fingerprints in the files with new arrivals—as when a suspect is held in custody and the police send his “ten-prints” to I.A.F.I.S. The computer hunts for shared characteristics, and then attempts to match the prints to a record on file. “We identify about eight thousand fugitives per month here,” Billy P. Martin, the acting chief of the Identification and Investigative Services Section, told me. Martin said that eleven per cent of job applicants whose fingerprints are entered into the system—they could be day-care workers, casino staff, federal employees—turn out to have criminal records; as many as sixty per cent of the matches are repeat offenders.

The center looks like a NASA control room, with dozens of people monitoring the encrypted network of fingerprint machines sending in data from police stations throughout the country. The main computer floor is the size of two football fields and contains sixty-two purple-and-gray “jukeboxes,” each filled with two hundred compact disks containing fingerprints. (There are three thousand sets on each CD.) When someone is arrested, his prints are initially searched against a state’s computer files. If the search finds nothing, the information is forwarded to the federal database in Clarksburg. To make a match, the I.A.F.I.S. computer analyzes the many points on the ridges of every fingerprint it receives, starting with the thumb and working toward the pinkie; only when the data produce prints that match (or several prints that seem similar) is the original print forwarded to an analyst for comparison.
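The logic of that winnowing is simple enough to sketch. What follows is a toy illustration in Python, not the Bureau's software: the actual I.A.F.I.S. matching algorithms are not public, and every name, score, and threshold below is invented, with the search simplified to a single print rather than a full set of ten. The shape, though, is the one described above: a state search first, then a federal one, with anything promising handed to a human examiner rather than declared a match.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Minutia:
    """One ridge characteristic (a Galton point): where it sits, which way the ridge runs."""
    x: float
    y: float
    angle: float  # ridge direction, in degrees


def similarity(probe: set[Minutia], candidate: set[Minutia]) -> float:
    """Crude score: the fraction of the probe's minutiae with an exact counterpart
    on file. Real matchers tolerate rotation, translation, and skin distortion."""
    if not probe:
        return 0.0
    return len(probe & candidate) / len(probe)


def search(probe: set[Minutia],
           state_db: dict[str, set[Minutia]],
           federal_db: dict[str, set[Minutia]],
           threshold: float = 0.8) -> list[str]:
    """Check the state file first; only on a miss go on to the federal database.
    Returns candidate record IDs, never a verdict: the final identification
    belongs to a trained examiner."""
    for database in (state_db, federal_db):
        candidates = [record_id
                      for record_id, stored in database.items()
                      if similarity(probe, stored) >= threshold]
        if candidates:
            return candidates
    return []
```

The detail worth noticing is the last one: the computer's job ends with a pool of look-alikes, which is why the paragraph above ends with an analyst, not an algorithm.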

“We used to go to a file cabinet, pull out paper cards. If it was all loops—which is the most common type of print—you could spend an hour,” Martin said. “Now a computer algorithm does it in seconds. The system searches the electronic image against the database and pulls up the image onto the screen. The accuracy rate on first run is 99.97 per cent.” Still, this would mean that the I.A.F.I.S. computers make three hundred mistakes in every million searches. That is where trained examiners come in. The patterns on fingertips are more like topographical maps or handwriting than, say, bar codes. They can be so similar that even the most sophisticated computer program can’t tell them apart; it takes a trained human eye to detect the subtle differences.
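The arithmetic behind that error figure is easy to check; here it is in a few lines of Python, where the accuracy rate is Martin's and the code merely verifies the count.

```python
# Back-of-the-envelope check: a 99.97-per-cent first-run accuracy
# rate leaves 0.03 per cent of searches wrong.
searches = 1_000_000
accuracy = 0.9997
mistakes = round(searches * (1 - accuracy))
print(mistakes)  # 300 -- three hundred mistakes per million searches
```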