I. In 1997, the UK Department of Health launched a study to determine whether a popular cardiovascular drug, atorvastatin, could reduce the number of heart attacks and strokes in diabetic patients. The trial, known as the Collaborative Atorvastatin Diabetes Study (Cards), took seven years to complete. Money had to be raised, doctors had to be recruited, and then 2,838 patients had to be monitored weekly. Half of the diabetics were given the drug. The other half received a placebo.

In early 2004, a few months before the results of the trial were released, the American Diabetes Association asked a physician and mathematician named David Eddy to run his own Cards trial. He would do it, though, without human test subjects, instead using a computer model he had designed called Archimedes. The program was a kind of SimHealth: a vast compendium of medical knowledge drawn from epidemiological data, clinical trials, and physician interviews, which Eddy had laboriously translated into differential equations over the past decade. Those equations, Eddy hoped, would successfully reproduce the complex workings of human biology — down to the individual chambers of a simulated person's virtual heart.

Because the results of the real Cards trial were still secret, Eddy knew only the broadest facts about its participants, such as their average age and blood pressure. So Eddy and his team created a simulated population with the same overall parameters. Each person "developed" medical problems as they aged, dictated by the model's equations and their individual risk profiles. These doubles behaved just like people: Some, for example, forgot to take their pills every once in a while.
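The mechanics of such a virtual trial can be sketched in a few lines. Everything numeric below is invented for illustration (the 2.5 percent annual event risk, the 90 percent adherence rate, the 37 percent drug effect); only the cohort size of 2,838 comes from the article, and none of it reflects Archimedes' actual equations:

```python
import random

def simulate_patient(years, annual_risk, adherence=0.9, drug_effect=0.0):
    """Follow one virtual diabetic year by year; return True if a
    cardiovascular event occurs during follow-up."""
    for _ in range(years):
        # Patients who forget their pills lose the drug's protection that year.
        took_drug = random.random() < adherence
        risk = annual_risk * (1 - drug_effect) if took_drug else annual_risk
        if random.random() < risk:
            return True
    return False

def run_trial(n=2838, years=4, annual_risk=0.025, drug_effect=0.37, seed=1):
    """Split the cohort in half: one arm on the drug, one on placebo.
    Returns the event counts in each arm."""
    random.seed(seed)
    treated = sum(simulate_patient(years, annual_risk, drug_effect=drug_effect)
                  for _ in range(n // 2))
    placebo = sum(simulate_patient(years, annual_risk)
                  for _ in range(n // 2))
    return treated, placebo

treated_events, placebo_events = run_trial()
print(treated_events, placebo_events)  # the treated arm should see fewer events
```

A run like this finishes in a fraction of a second; the hard part, as Eddy found, is not the simulation loop but the decade of work behind the equations it would call.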

It took Eddy and his team roughly two months to construct the virtual trial, but once they hit Return, the program completed the study in just one hour. When he got the results, Eddy sent them to the ADA. He also mailed a copy to the Cards investigators. Months later, when the official results were made public, it became clear that Eddy had come remarkably close to predicting exactly how everything would turn out. Of the four principal findings of the study, Archimedes had predicted two exactly right, a third within the margin of error, and the fourth just below that. Rather than seven years, Eddy's experiment had taken just a couple of months. And the whole project had cost just a few hundred thousand dollars, which Eddy estimates to be a 200th of the cost of the real trial. The results seemed to vindicate his vision for the future of medicine: faster, cheaper, broader clinical trials — all happening inside a machine.

Eddy had built his model following a revelation. He had spent long years promoting clinical trials as the "gold standard" of medical research. But although it is better for doctors to make decisions based on clinical trial data than on instinct, trials are expensive and time-consuming. They're also constrained by the requirement that medications be tested on carefully selected, highly specific populations (for example, diabetics who are overweight but not obese, with no past history of heart disease). Because of this, Eddy realized, trials could never fully meet the needs of policymakers who have to make sweeping decisions about patient care. Ideally, before recommending treatment protocols, you'd like to test different combinations of therapies in a variety of patients. And so Eddy hit on the idea of a soup-to-nuts model that would capture everything known by modern medicine, from the evolution of disease in different people — as shaped by factors like race, genetic risk, and number of hours spent doing yoga — to specific physiological details, such as the amount of heart muscle that dies in the hours after a heart attack and the degree to which medications like aspirin can limit that damage. Tests could be run in hours instead of years, and the model could be constantly updated with the latest research. It would spit out data on both health outcomes and finances, calculating the costs and benefits of particular treatments to the penny, taking into account data ranging from emergency room visits and blood tests to the savings generated by therapies that kept a patient's illness from getting worse.
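The financial half of that vision, weighing drug spending against averted hospital bills, reduces to simple arithmetic once a model supplies the event counts. A toy sketch, with every dollar figure and patient count invented for illustration:

```python
def cost_per_event_averted(n_patients, events_untreated, events_treated,
                           drug_cost_per_patient, cost_per_event):
    """Compare total spending with and without a preventive therapy.
    Returns (net cost of the program, cost per event averted).
    A negative net cost means the therapy saves money overall."""
    averted = events_untreated - events_treated
    extra_drug_spend = n_patients * drug_cost_per_patient
    hospital_savings = averted * cost_per_event
    net_cost = extra_drug_spend - hospital_savings
    return net_cost, net_cost / averted

# Illustrative numbers only: 10,000 patients, a $400-per-year therapy,
# and $40,000 of hospital costs per heart attack.
net, per_event = cost_per_event_averted(10_000, 800, 550, 400, 40_000)
print(net, per_event)  # -6000000 -24000.0: the therapy pays for itself
```

The real model, of course, tracks far more line items, from office visits to blood tests, but the ledger logic is the same.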

And because the mathematics captured the complex interactions of human physiology, Eddy argued, they allowed Archimedes to reproduce even aspects of disease that weren't readily measurable — like the amount of arterial plaque a person had accumulated by a certain age. In other words, Eddy believed that his simulation had a strange and remarkable advantage: the ability to see even deeper into human physiology than actual researchers could.

This massively ambitious project, originally funded by Kaiser Permanente, now makes up the core of a private company, also called Archimedes, that Eddy operates from the 29th floor of a San Francisco office tower. According to Mark Roberts, chief of the Section of Decision Sciences and head of the Clinical Systems Modeling program at the University of Pittsburgh, Eddy has built a system that can replicate the intricate web of human physiology and disease with amazing fidelity. "I've spent probably 25 years of my academic life trying to understand how to make these kinds of decision models be more clinically realistic," Roberts says. "But David is so far ahead of anyone else in the field — it really is amazing." Eddy has described his program as "Einstein meets Hippocrates."

But others maintain that Eddy may just have built the world's most expensive ant farm: a self-contained system of great complexity that has little bearing on reality.

The problem is not that every model is, by definition, an approximation. At the heart of the debate is whether Eddy's approximation is accurate — whether it truly represents the human body. Marc Lipsitch, an epidemiologist at Harvard, isn't convinced, and he frets that Eddy may be using math to define connections that even medicine doesn't understand. "I'm not aware of anybody who knows the exact quantitative relationship between diet and blood glucose and diabetes," Lipsitch says.

"Clinicians tend not to trust models," Eddy says. "Which is understandable, since many models are junk and it can be difficult to tell which is which."

II. David Eddy's obsession with calculation began early. As a resident in cardiac surgery at Stanford in the 1970s, he worked briefly on an early artificial heart, which was then being tested in dogs. Intrigued by the heart's timing mechanism, Eddy taught himself calculus, and within a year he had abandoned his residency to enroll in the school's Engineering-Economic Systems Department. When he graduated several years later, Eddy won the Lanchester Prize in operations research for his dissertation. Around the same time he was asked by the American Cancer Society to help write new national guidelines for cancer screening. When his work was published in 1980, it made headlines. He recommended that women be screened for cervical cancer every three years rather than annually, which would save about $1 billion in medical costs.

A rising star, Eddy soon moved to Duke University, where he ran a center on health policy and began tipping more of medicine's sacred cows. The template for Eddy's approach was empirical evidence, primarily the clinical trial, in which every variable is held constant except for those few under study. By subjecting a treatment to the rigor of a double-blind, years-long investigation, Eddy believed, any assumption could be tested, any canard uncovered, and any good practice replaced with a better one — a conviction that he pushed in academic articles.

But administration wore on Eddy, and after five years in North Carolina the work had grown stale. "I was getting old, fat, and burned out," he says. "At first I thought, 'Well, that's age; you've just got to learn to adjust.' Then one day I'm at the airport, and I see a poster in an ice cream shop of the Grand Teton. And I say, 'Damn it, I'm not going to adjust!' So I left Duke, built a house in Wyoming with a straight shot at the Grand, and started mountain climbing. I decided I was too young to die."

Collage: Dan Winters

After settling in Wyoming in 1987, Eddy began working as a consultant for Kaiser Permanente — at the time a midsize insurer and health care provider. He also continued a consulting gig he had with Blue Cross Blue Shield, helping the insurer set up a program to determine when the evidence of a treatment's effectiveness was sufficient to justify coverage. The work often pitted Eddy against doctors who believed he was interfering with their ability to treat patients. In one particularly contentious case, Eddy argued against covering a complicated, potentially dangerous treatment involving high-dose chemotherapy followed by a bone marrow transplant that was believed to help breast cancer patients. It cost up to $150,000, and there was no evidence that the transplant actually made patients live longer, so Blue Cross decided not to cover the treatment. But Eddy's argument didn't fly with oncologists and advocacy groups, who pilloried him for denying patients access to a potentially lifesaving treatment and accused Blue Cross of sacrificing women's health for its bottom line.

The dispute ignited the national media. "The country was in a furor," Eddy says. "There was hate mail and that kind of thing." A cover story in Time magazine included a photo of a doctor gagged with a surgical mask.

Several years later, clinical trials confirmed that the high-dose chemotherapy and bone marrow transplant had no effect on patient survival. "We held the line and stopped a dangerous treatment," he says. "The first rule is 'Thou shalt have evidence.'"

That victory and others reinforced Eddy's confidence and cemented his status as a champion of scientific data and evidence over medical judgment. At Stanford, he'd been appalled to discover that doctors still relied largely on anecdotal studies and conventional wisdom in determining treatment. "I saw women who had terrible edema — a painful swelling caused by water retention — in their arms because their lymph nodes had been taken out during a radical mastectomy," he says. "So I asked the surgeons, 'Why do we have to take out the lymph nodes?' And they all said, 'You just have to.' So then I started searching the medical literature, trying to find out what studies had been done showing that more women survived if their lymph nodes were removed. And there weren't any!"

Worse, Eddy found, many of the doctors' assumptions were wrong. In 1989, Eddy published a comment in the Annals of Internal Medicine suggesting that annual chest x-rays, widely used for two decades as a strategy for detecting lung cancer, did nothing to save lives — even screening smokers did not reduce lung cancer mortality. Over time, Eddy codified his insistence on data over tradition as the practice of "evidence-based medicine" and outlined the practical guidelines required for choosing one treatment over another. Twenty years later, evidence-based medicine has become the norm in medical schools and hospitals. (Even Medicare and Medicaid now insist that evidence accompany any new procedure or drug before they'll authorize payment.) "He really revolutionized medical practice," says Richard Kahn, former chief scientific and medical officer of the American Diabetes Association. "Before David forced doctors to examine their decisions, a lot of sloppy stuff was going on."

But even as Eddy was lobbying to expand the mandate for clinical trials, he came to realize their limitations. Besides being expensive and slow, clinical trials were inherently unable to explore the many possible variations in treatment — like whether a patient on one diabetes drug would benefit from a second, or how their prognosis would change if they exercised for an hour a day instead of half an hour. Theoretically, a mathematical model would enable researchers to answer those questions.

In practice, building Archimedes created a whole new set of challenges. What Eddy proposed to do was codify medicine, breaking the multitude of choices, trade-offs, and increments of clinical practice into a discrete and neat set of data. Doing that, however, required a massive amount of firsthand information-gathering. "We would go to the emergency room again and again," says Len Schlessinger, a physicist who helped build Archimedes, "trying to figure out how a doctor decides whether to treat a patient with chest pains or send them home, what tests they order — all that. We probably have an inch-thick stack of flowcharts describing the heart attack procedure alone."

Untangling and re-creating the elaborate knot of rules and referrals that drove the health care system was particularly grueling. The model currently includes more than 100 kinds of medical appointments — simple check-ups, complex visits, follow-ups — each with a different cost. Then there was the confusing fact that patients often evaluate their symptoms differently. Though most people begin to experience chest pains when their coronary arteries are 70 percent blocked, some will feel pain at 50 percent while others reach 90 percent before noticing a problem. To plot that distribution, Eddy and Schlessinger looked at published studies of angiograms that revealed a patient's actual level of arterial blockage when chest pains first appeared.
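That kind of patient-to-patient variation is naturally expressed as a sampled threshold. A hedged sketch (the normal distribution and its parameters here are stand-ins, not the published angiogram fit Eddy and Schlessinger actually used):

```python
import random

def pain_threshold(mean=70.0, sd=10.0, lo=40.0, hi=99.0):
    """Draw one virtual patient's personal threshold: the percent of
    coronary blockage at which chest pain first appears."""
    return min(hi, max(lo, random.gauss(mean, sd)))

random.seed(0)
thresholds = [pain_threshold() for _ in range(10_000)]
# Most virtual patients notice pain near 70 percent blockage, but the
# tails run from roughly 50 to 90 percent, as the angiogram data suggest.
share_symptomatic_at_70 = sum(t <= 70 for t in thresholds) / len(thresholds)
print(round(share_symptomatic_at_70, 2))
```

Each simulated patient carries a threshold like this for life, so two people with identical arteries can walk into the virtual emergency room years apart.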

Simulating the biology underlying those symptoms was even trickier. In general, Schlessinger points out, the deeper operations of a disease — the underlying cascade of molecular and genetic interactions — remain poorly understood. "We couldn't really do this model at the molecular level," he says. "It would be hopeless. Even now it's very complicated."

Instead, Eddy and Schlessinger focused on clinical variables, like the relationship between heart disease and cholesterol level. In practice, these factors exist in a complex web: Heart disease is a function of the amount of arterial plaque, which itself is a function of cholesterol, blood pressure, and weight, plus dozens of other factors. As much as possible, Eddy and Schlessinger tried to replicate the details of that interdependence — following the threads of a biological web to connect, for instance, how a change in blood pressure affected other physiological measures.

Figuring out how to relate all those factors in simultaneous equations required long months of continual adjustment and long stretches of processor time. "It takes a phenomenal amount of preparation to write equations for the working physiologies of individual people," Eddy says. Then he shrugs. "That's what we spent the last 10 years doing."
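The interdependence Eddy describes can be suggested with a toy time-stepped model in which each variable is updated from the others. Every coefficient below is invented for the sketch; Archimedes' actual equations are far larger and proprietary:

```python
def advance_one_year(state):
    """One illustrative time step of a coupled model: cholesterol and
    blood pressure feed plaque, and plaque feeds event risk, so a
    change anywhere propagates through the web."""
    plaque = state["plaque"] + 0.01 * state["ldl"] + 0.005 * state["sbp"]
    return {"ldl": state["ldl"], "sbp": state["sbp"],
            "plaque": plaque,
            # Annual event risk rises with accumulated plaque.
            "risk": min(1.0, 0.001 * plaque)}

untreated = {"ldl": 160.0, "sbp": 140.0, "plaque": 20.0, "risk": 0.0}
on_statin = dict(untreated, ldl=160.0 * 0.7)  # say a statin lowers LDL 30%
for _ in range(10):
    untreated = advance_one_year(untreated)
    on_statin = advance_one_year(on_statin)
print(untreated["risk"] > on_statin["risk"])  # True: less LDL, less plaque, less risk
```

Even in this four-variable toy, a change to one equation shifts every downstream number, which is precisely why tuning thousands of coupled equations took years.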

Once the underlying rules had been established, the next hurdle was to prove that the model actually worked. Without real patients for comparison, there was no way for Eddy to check the reliability of his predictions. Medical decisions are among the most intensely personal choices that we make. What would convince people to trust a machine?

Data: Archimedes; Design: Erik Carnes

Around 1998, determined to prove the model's merits, Eddy hit upon the idea of running a series of validation exercises, simulations designed to test Archimedes' accuracy against the results from real-world clinical trials.

He published the results of 74 different exercises in 2003. In all but three cases, his model churned out results within the range of uncertainty of the actual trials. Overall, the correlation coefficient was 0.99 — a score that prompted Eddy to boast that the equations "really do seem to represent what Mother Nature is doing."
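A correlation coefficient of 0.99 is worth unpacking: it measures how tightly the model's predicted outcomes track the trials' observed ones. Here is the standard Pearson calculation applied to a handful of made-up predicted-versus-observed event rates (not Eddy's actual 74 validation points):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between model predictions and trial results."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical predicted vs. observed event rates from five validation
# runs; values this close to the diagonal yield r near 1.
predicted = [0.058, 0.090, 0.125, 0.210, 0.310]
observed  = [0.060, 0.088, 0.130, 0.205, 0.315]
print(round(pearson_r(predicted, observed), 3))
```

One caveat critics raise applies here too: a high r across trials whose outcomes span a wide range says little about accuracy on a genuinely surprising trial.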

Encouraged by these numbers, a slew of government agencies and pharmaceutical companies rushed to get on board. A few years ago, the Centers for Disease Control commissioned Eddy to determine whether a cardiovascular polypill — a combination of aspirin, a statin to lower cholesterol, three blood pressure medications, and folic acid in a single pill — should be given to all patients over age 55.

Driving the CDC's interest was a 2003 report by two British epidemiologists who argued that the polypill had the potential to dramatically reduce heart attacks and strokes in older patients and people with cardiovascular disease. But the study left many questions unanswered, such as whether patients should start taking the pill at 55 or 65 and whether the multi-ingredient formula selected by the authors was the best one. There was also the question of how much the polypill would cost compared with current treatments. The calculation had to weigh the expense of a prescription medication, with its associated office visits and lab tests, against the money that would be saved by preventing strokes and heart attacks, with their associated hospital costs and long-term complications.

All of those variations were critical, Eddy noted, yet none could be easily studied with a conventional clinical trial. "To answer each of those questions with clinical trials would require at least five years and half a billion dollars," he says. "You just can't do it. Yet they are things we need to know!"

For two months, an Archimedes team tinkered with the particulars of the simulation, which tracked 50,000 virtual patients over a simulated span of 30 years to gauge how many heart attacks the polypill could prevent in the US. This wasn't obvious: While the authors of the British study had just combined the effects of each individual component given separately to get a total improvement of 80 percent, Eddy wanted to investigate — among other things — the interaction between the drugs. For example, would one ingredient potentially limit the effectiveness of another? Though Eddy won't divulge the results, pending publication of the study next year, Lynn Etheredge, founder of the Health Insurance Reform Project at George Washington University, predicts that the simulation, if consistent with early studies, could change the standard of treatment for millions of people worldwide. "Fewer heart attacks, millions of dollars saved ... Whether Eddy will get a Nobel Prize, I'm not sure," he says. "But he's certainly changing how we think about health care."
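The shortcut Eddy questioned, stacking up each component's separate benefit, corresponds to multiplying the components' survival factors under an assumption of independence. A sketch with invented per-component numbers (not the figures from the published polypill paper):

```python
def combined_reduction(reductions):
    """If each component independently cuts risk by r_i, the fractions
    of risk that survive each drug compound: total = 1 - prod(1 - r_i).
    This deliberately ignores drug interactions -- the very
    simplification Eddy wanted Archimedes to probe."""
    remaining = 1.0
    for r in reductions:
        remaining *= (1 - r)
    return 1 - remaining

# Illustrative per-component relative risk reductions:
# statin, blood pressure drugs, aspirin, folic acid.
print(round(combined_reduction([0.45, 0.35, 0.25, 0.10]), 3))  # 0.759
```

If one ingredient blunts another, the true combined benefit falls short of this product, which is exactly the kind of interaction a simulated trial can explore and a four-way factorial clinical trial almost never can.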

III.

Others agree that Eddy is changing things, just not for the better. So far, some epidemiologists have argued, Archimedes has predicted only trials that turned out as everyone expected them to — ones where all Eddy did, in effect, was to predict the fifth point on a line where four points were already known. That list includes the Cards trial. Zeke Emanuel, health policy adviser for the Obama administration and a leading proponent of evidence-based medicine, calls Eddy's creation "sophisticated but speculative." Emanuel says that "when you see the demo, there's a certain 'wow' factor. And the fact that it has been able to predict some clinical trials is intriguing. But most of us would then want to say, 'OK — let's try it on this problem, which isn't one that you picked personally.' Like any good presenter, presumably the results that Eddy shows are selective."

Eddy's secretive habits are also troubling, according to David Nathan, director of the Diabetes Center at Massachusetts General Hospital. "If you listen to David, he has 10,000 variables and differential equations describing everything from blood sugar to office furniture," Nathan says. "But it's never quite clear what they are or how they interact. All the calculations happen inside a black box. And that's a problem because there's no way to tell whether the model's underlying assumptions are right."

Eddy tends to answer such criticism by citing his record of success — specifically the 74 validations, which he maintains were chosen for their difficulty. "The trial validations show that the model reflects the reality of how diseases progress," he says. "Whether or not we know exactly what drives them."

This argument has so far failed to convince many clinicians. Several epidemiologists, who didn't want to be named, insist that Eddy has remained frustratingly tight-lipped about the details of his creation. "In the end I think he just wants to guard his property," one said. It's a particularly ironic critique, given his history as a champion of hard data and clear evidence.

In the meantime, Eddy has continued to add new levels of medical detail. To improve the model of coronary atherosclerosis, for example, he recently updated his equations to take into account advances in our understanding of unstable plaque. That capacity to absorb new knowledge as it becomes available is critical, according to Eddy, because it allows the model to grow. Some, however, argue that such changes constitute implicit admissions of failure: an acknowledgment that Archimedes, by overlooking many of the body's underlying molecular processes, may be missing crucial details — with potentially disastrous consequences.

There's some evidence of this. A few years ago, Archimedes predicted that a new drug, torcetrapib, which worked by raising "good cholesterol" levels, would prevent about 15 percent of heart attacks in older people. In fact, the opposite proved true. A clinical trial evaluating the drug had to be halted when it became apparent that patients taking it were actually more likely to die from a heart attack than those in the control group.

Eddy doesn't dispute this. "The model will make mistakes," he says. "Mother Nature always has surprises up her sleeve." For the same reason, Eddy notes, Archimedes will never replace clinical trials when it comes to evaluating the safety and effectiveness of untested medications. "The model is a living thing. As medical knowledge and data expand, the model is updated, just as medical experts update their own knowledge," he says bluntly.

Theoretically, Eddy argues, Archimedes' mistakes could actually become virtues, by enabling researchers to identify where their understanding of human biology is faulty. Etheredge agrees. "When Archimedes is wrong, that's when it's going to get really interesting," he says. "We've put everything we know about physiology into Archimedes — and it gave us the wrong answer! What does that mean? It means we're missing something. So it can actually help us figure out where we have a gap in our knowledge."

That assumes, of course, that the model itself is not the source of the mistake. Eddy has argued that Archimedes is simply too complex for most researchers to grasp — relying as it does on a string of equations that are unintelligible to the average clinician. But the stacks and stacks of unexplained equations are precisely what worry critics. Because every piece in Archimedes is linked to every other through a series of simultaneous equations, changing one variable has the potential to cause cascading complications. A new piece of information about the way arteries work could throw off an estimation about the blood flow in another part of the model. At best, this could make Archimedes less nimble. But some modelers worry that it could also turn Archimedes into a mathematical kludge: a piece of software built on the shifting sands of medical knowledge and kept running through complex rewiring — to the point where even the original architects can't follow all the links and assumptions.

Which raises a question: Is Archimedes like medical knowledge itself, growing richer and closer to reality with every added layer of detail, or is it more like an ambitious Rube Goldberg contraption, functioning in spite of itself but only as long as all the parts can be kept in balance?

Asked exactly that, Jonathan Brown, who models diabetes at Kaiser's Center for Health Research, pauses for so long that it seems the phone line has gone dead. "Real physiology is incredibly complex," he says finally. "And our understanding of disease is changing almost week by week." He sighs, then adds, "The risk with a model like Archimedes is that it may just end up codifying our ignorance."

IV.

The issue of whether Archimedes codifies our ignorance — or saves us from it — may matter less than it seems. Since all of Archimedes' experiments are run in silico, people aren't going to get hurt in trials that go awry. The only real risk would be if doctors or medical policymakers came to rely on the model too heavily when making decisions. And to avoid that risk, Brown argues, any model aspiring to change medical treatment should be tested the way a new medication would be: through independent trials that can evaluate its predictive accuracy against a wide range of real, ongoing trials selected by outsiders.

For now, Archimedes just continues to grow. Eddy recently expanded the model, which can currently simulate 16,500 person-years a minute, to include several new diseases, among them breast and lung cancer, and will soon launch ARCHeS, an online interface that will allow physicians, policymakers, and researchers to access Archimedes and design their own trials. Last year, his company began a project that will enable doctors to create Archimedes doubles of individuals and then show patients the effects of different treatments. The program is currently being tested at several of a major health care provider's clinics, with results expected next year.

As hospitals begin the switch to electronic records, moreover, demand for Archimedes' services is likely to skyrocket, with doctors clamoring to run virtual trials using information from actual patients. (Fortuitously, Kaiser currently has the country's largest repository of electronic medical records.)

In the meantime, business is booming, with the company's computing hours booked well into 2010. "The model will learn," Eddy says cheerfully. "And so will the people using it."

Contributing editor Jennifer Kahn (jennifer_kahn@wired.com) wrote about geneticist French Anderson in issue 15.10.