If you ask most laypeople what medical schools do, they will tell you that their primary mission is to train people to be great doctors. While this is an understandable answer, for most of recent history it was wrong.

In the 1800s, there were over 400 U.S. medical schools; most were small proprietary schools that operated for profit and were unaffiliated with universities. These schools granted degrees based on rote recitation rather than an understanding of human anatomy or physiology. The instructors were poorly trained, part-time local physicians.

In 1910, the Flexner Report made sweeping recommendations to transform U.S. medical schools. Abraham Flexner proposed that admission to medical school should require prior college or university study devoted to basic science, and that physicians be trained in a scientific manner. He also proposed that medical schools appoint full-time professors who would be barred from all but charity practice but would control clinical instruction in hospitals, and that physicians be licensed by state boards. But, Flexner recognized one major flaw in his model: revenues generated by tuition could never sustain medical schools financially.

Implementation of the Flexner model led to the widespread closure of substandard proprietary medical schools. The surviving schools maximized their tuition revenues by selecting medical students (almost always men) from well-to-do families who could afford the large sums needed to sustain a full-time faculty. From 1904 to 1935, the number of U.S. medical schools plummeted from 160 to 66, and the number of U.S. physicians dropped precipitously. Once the profession held this monopolistic grip on healthcare, the salaries and social standing of medical doctors skyrocketed.

Given the limited support that tuition could provide, medical schools needed other sources of revenue. Financial salvation arrived when -- in the 1950s, 1960s, and 1970s -- Congress began allocating vast funds to the National Institutes of Health to promote advances in biomedical research. In response, medical schools reconfigured the Flexner model and required faculty to obtain governmental research grants to support themselves. The formula was simple: if showered with respect, talented full-time academicians would accept fixed salaries that were a fraction of what they might earn in the private sector.

As a result, during the 1950s, 1960s, and 1970s, the overarching business model and mission of medical schools was the growth and management of a biomedical research enterprise. Research faculty kept medical schools afloat with the overhead on their grants. With the emergence of subspecialties, the clinical faculty focused their efforts on training postgraduate fellows (rather than medical students), because fellows magnified the faculty's ability to generate a revenue stream. During these years, medical schools proudly proclaimed that they had a tripartite mission -- research, teaching, and patient care. The ideal physician was expected to excel in all three, the so-called "triple threat." In truth, the halcyon days of truly triply-talented faculty were short.

As Flexner expected, tuition represented a tiny fraction of revenues, unless (as in the case of state-supported schools) medical schools were paid handsomely to enrich their enrollment of in-state applicants. Faculty members who had the talent to obtain large research grants or attract large numbers of patients were freed of any requirement to spend meaningful time with students. Although teaching was enjoyable and honorable, the responsibility for it fell largely on instructors who could not support themselves as researchers or clinicians.

As the funding of healthcare expanded in the last 30 years -- particularly procedural revenues -- medical schools thought that they could reinvent their business models by incentivizing their clinical faculty to grow their practices and referral bases. But, doing so led to the creation and growth of massive health systems, giving administrators enormous financial power that did not depend on traditional academic structures or missions. Now, even at the most "academic" of medical schools, the clinical faculty are often paid directly by health systems; their employment and salaries depend on their ability to generate fees, not research papers; and their time is managed by healthcare and service-line administrators rather than department chairs or division chiefs.

With health systems in charge, medical schools have become financially starved and have lost control of their faculty. The vestiges of academic structures are still in place, meaning there are still deans, departmental and divisional leaders, people with educational titles, and students. But, unless they also have authority over the health system or its service lines, the men and women in leadership positions are often only figureheads. They are relics of an obsolete structure that has no funding and little decision-making capacity. Unless you can direct the expenditure of state funds or philanthropic money, you serve at the pleasure of an administrator who often has little sympathy for time spent on research or teaching. Leadership positions at medical schools are increasingly filled by those whose primary qualification is demonstrated loyalty to the healthcare system, rather than traditional "academic" credentials. All too often, these positions are filled by insiders who can be recruited with minimal resources, and whose personal allegiances are well-established. If academic leaders pursue a parallel "academic mission," they do so at their own peril.

As a result, many deans at modern medical schools serve a ceremonial function. Like the English monarchy, they preside at events and say inspiring things to those in attendance, but they have no power. But, unlike the Queen of England, they typically have no money.

Whatever happened to the research mission? In the past, students were often forced to sit through basic science lectures merely because the basic science faculty needed access to state-provided student fees. Now, health systems have little interest in basic research, and they support innovation in clinical medicine only if it is likely to lead to substantial growth in revenues. "Academic" hospitals (e.g., Vanderbilt) are divorcing themselves from their own universities.

So, what is "academic medicine" these days? I am not really sure. In many cases, the mission of medical schools is now focused on ... actually teaching medical students!

But, medical education no longer depends on an academic faculty. Many schools provide streaming video of canned lectures, which can be accessed at any time and do not require the physical presence of students or a professor in a lecture hall. When lectures are given in person, attendance is typically sparse. Indeed, the Larner College of Medicine at the University of Vermont has phased out lectures entirely in favor of "active learning." With active learning, medical students absorb the necessary information in their own way and on their own time. Discussions are facilitated by moderators, and most of the academic faculty is not involved. The disconnect between medical schools and the core of "academic medicine" is now complete.

If you think that medical schools are bustling and productive environments for teaching, innovation, and patient care, you are living in the wrong century. Traditional medical schools represent a money-losing relic of an honorable past. Even when first proposed in 1910, the Flexner model was never financially viable. It lasted for 100 years simply because schools could tap into resources that Flexner never envisioned.

Now, "academic medicine" is dominated by a corporate model that cares little about any intellectual mission. Medical schools often propagate with little relationship to any academic faculty. The ceremonial and organizational vestiges of an interconnected bygone era are still evident, but they no longer mean what they used to.