We need more doctors. On a global scale, the shortage is staggering: The World Health Organization says we need 15 percent more doctors. In the United States, the Association of American Medical Colleges estimates the current deficit at almost 60,000 and forecasts a worrisome 130,600-doctor shortfall by 2025. There’s one simple solution: We have to find ways to manufacture doctors faster and cheaper.

An American physician spends an average of 14 years training for the job: four years of college, four years of medical school, and residencies and fellowships that last between three and eight years. This medical education system wasn’t handed down to us by God or Galen—it was the result of a reform movement that began in the late 19th century and was largely finished more than 100 years ago. That was the last time we seriously considered the structure of medical education in the United States.

The circumstances were vastly different at that time. Until the Civil War, private, for-profit medical schools with virtually no admissions requirements subjected farm boys to two four-month sessions of lectures and sent them off to treat the sick. (The second session was an exact duplicate of the first.) The system produced too many doctors with not enough training. Abraham Flexner, the education reformer who wrote an influential report on medical education in 1910, put a fine point on the problem: “There has been an enormous over-production of uneducated and ill trained medical practitioners,” he wrote. (Emphasis added.) “Taking the United States as a whole, physicians are four or five times as numerous in proportion to population as in older countries like Germany.”

In other words, our current medical education system was originally designed to reduce the total number of people entering the profession. The academic medical schools that sprang up around the country—such as the Johns Hopkins School of Medicine, which opened in 1893—made a college education a prerequisite. Medical school expanded from eight months to three years and solidified at four years in the 1890s. Postgraduate training programs were implemented, beginning with a one-year internship. These were brilliant reforms at the time.

Over the past century, there have been additions to, but few subtractions from, the training process. Residency and fellowship programs became longer and longer … and longer. The path to some specialties is now almost comically arduous. Many hand surgeons, for example, complete five years in general surgery, followed by three years in plastic surgery, followed by another year of specialized hand surgery training. Candidates for competitive hand surgery fellowships are also strongly advised to spend two additional years on research at some point during the process.

The current system has costs beyond making doctors expensive and rare. The long process doesn’t just weed out the incompetent and the lazy from the potential pool of physicians—it deters students who can’t pay for so many years of education or who need to make money quickly to support their families. That introduces a significant class bias into the physician population, depriving a large proportion of the population of doctors who understand their background, values, and challenges.

One solution is to simply lop off a few years from the process. Writing in the Journal of the American Medical Association in 2012, bioethicist Ezekiel Emanuel (one of those Emanuels) and economist Victor Fuchs recommended shortening each stage by about 30 percent. Four years of premedical training shouldn’t be a requirement for those who don’t want it or can’t afford it, they argued. The fourth year of medical school is largely a breeze, and a few progressive medical schools are now offering three-year programs to reflect that reality.

As for postgraduate training, Emanuel and Fuchs attacked the increasingly common requirement that residents and fellows complete laboratory or clinical research projects. They don’t buy the popular ideal that every doctor must be a “physician-scientist.” Referring specifically to surgeons, they wrote, “The most important factor in becoming a competent surgeon is high volume—performing specific procedures many times over. A research year does not add to surgical volume and skills building.”

Shortening the training process would entail costs. Kenneth Ludmerer, a professor of medical history and author of two books on this topic, argues that research isn’t merely about scientific discovery, but learning to approach diagnosis and treatment like a scientist. He points out that even Abraham Flexner, writing more than 100 years ago, noted: “The practicing physician and the ‘theoretical’ scientist are thus engaged in doing the same sort of thing even while one is seeking to correct Mr. Smith’s digestive aberration and the other to localize the cerebral functions of the frog.”

“There is an inevitable tension in medical education between preparation and practice,” Ludmerer says. “It is a perpetual dilemma that has become more severe, because there is now more to know.”

Another solution, perhaps more elegant, is the outcomes movement. American medical schools and residency programs have traditionally relied on the “tea steeping” method: They expose students to information for a prescribed amount of time, and assume they’re ready at the end of it. Years can be added if a student demonstrates gross incompetence in exams, but there’s no opportunity for exceptional students to accelerate the process. Offering that chance makes educators uncomfortable—both because it relies heavily on imperfect examinations and because it partially undermines the traditional process—but it’s time to experiment.

“Experiment” is the key word. The fundamental problem here is that the argument between traditionalists and reformers is essentially theoretical—we are in an evidence vacuum. It’s ironic, because in virtually every other aspect of medicine, tradition and intuition were discarded decades ago. Researchers rigorously test when to start someone infected with HIV on antiretrovirals or a patient with high cholesterol on statins. But doctors have very rarely examined their own training. When Emanuel and Fuchs published their proposal two years ago, they could find just a single study comparing the competence of physicians from the traditional four-plus-four medical education system with that of doctors from shortened programs.

There is no reason not to do this important research. More than a dozen medical schools now offer high school graduates the chance to earn a medical degree in six or seven years. Fellowship programs also vary in length. It’s time to compare the medical board scores, patient mortality rates, and other metrics for doctors with different lengths of training. The studies won’t be easy—students entering shortened programs may differ in a number of ways, for example, biasing the outcome. But assiduous matching of the test and control groups, paired with honest statistical analysis, will partially address that problem.

The rank-and-file physician may herself be an impediment to reform. Every generation of doctors seems to be convinced that the next is inadequately trained, because the younger doctors didn’t live in the hospital or spend enough sleepless nights there. Many warn that shortening premedical education will inevitably produce awkward automatons who can’t relate to patients (as though the current system is flawless in that regard).

In recent years, however, studies have shown that reductions in working hours during residency have harmed neither patients nor doctors-in-training. We need to subject assumptions about duration of training to the most rigorous scientific assessment possible. It’s time for doctors to turn the microscopes on themselves and their own training, and accept that the system that produced them may be imperfect. It’s nothing against you, Doctor, it’s just a scientific inquiry.