For most of America’s history, the idea that people over the age of 65 would voluntarily herd themselves into special communities built around their needs would have seemed absurd, even dystopian. Yet a largely voluntary movement toward segregating people by age has reached extreme levels in recent years — and without receiving much attention at all. The coronavirus outbreak could put an end to it.

In 1850, nearly 70 percent of individuals age 65 or older lived with their adult children. Most of the rest tended to live in close geographical proximity. As a consequence, older people were more or less evenly distributed throughout the country.

This arrangement was highly functional: The elderly needed help as they aged, and children and grandchildren provided it. In return, the elderly took care of young children, and otherwise pulled their weight around the house.

Home was not the only place where people of different ages mixed together in ways that are all too rare today. Prior to the 20th century, it was entirely normal to have a one-room schoolhouse catering to both teens and toddlers. When rural communities held quilting parties, everyone from young girls to elderly matrons participated side by side. Farmworkers of all ages toiled together, and armies in the U.S. Civil War threw together young boys, older men and everyone in between.

This was a world with very limited “age consciousness.” Almost no one drew attention to their age, even on their birthdays, a ritual that took off in the 20th century. As countries like the United States industrialized, new institutions began sorting citizens into different age buckets. Most importantly, schools began catering to discrete age cohorts — elementary, junior high, senior high.

Much of this shift coincided with the invention of new terms to define and distinguish different age groups. The idea of “middle age,” for example, was a product of this shift, as was the invention of “pediatrics” as a field of medicine. It was perhaps inevitable that the elderly would get lumped into their own cohort, with a new field of medicine — geriatrics — invented to tend to their needs.

Several developments fueled this trend. The first was a growing belief that older people couldn’t keep up in the fast-paced, modern world of work. Mandatory retirement ages — often coupled with increasingly generous pension benefits — helped push workers out the door at a certain age. When Congress passed the Social Security Act in 1935, it elevated a new threshold to almost totemic significance: 65.

All of this took place against a very gradual decline in the share of old people living with their children. By the 1930s, the percentage of elderly whites living with their children had declined to just under 40 percent; by century’s end, it had fallen all the way to 13 percent.

The mix of public and private retirement programs enabled some of the elderly to live on their own, but there’s evidence that in many cases, children moved away from their parents to pursue economic opportunities, effectively abandoning the older generation.

So the elderly, particularly those with retirement savings, embraced a new trend that burst onto the scene after World War II: the retirement community, which offered a model for all the big retirement communities to come. These gated communities deliberately excluded younger people — which also meant residents had no need to pay taxes for schools.

This movement exploded in the succeeding decades. But not everyone was wealthy enough to afford such amenities. Others weren’t well enough. In 1965, Congress created Medicare and Medicaid, helping finance the creation of low-budget, state-run “nursing homes” that increasingly warehoused the elderly.

These developments led to older generations living apart from everyone else. Though this took place in other developed nations, the U.S. was particularly committed to the effort.

Ultimately, the U.S. became one of the most age-segregated nations in the world. Recent research indicates that a third of Americans over the age of 55 live exclusively among people in the same age cohort.

There are many reasons why this trend is problematic. A growing body of research suggests segregating people by age isn’t healthy for anyone, young or old, and that it has helped fuel divisions in the nation’s politics: When generations live apart, political polarization follows, as the 2016 election made plain. But these concerns, rarely articulated, haven’t come close to raising societal alarms.

The pandemic may change that. The most vulnerable members of society are concentrated in communities and institutions that, once infected, can easily turn into catastrophes. These are places where people live in close quarters, sharing meals, socializing and otherwise living in ways that are apt to facilitate the spread of the virus.

Indeed, while much of our attention is now focused on large-scale outbreaks in cities like New York, the next wave may be dominated by smaller, but proportionally more lethal, outbreaks throughout the nation’s elderly enclaves. There are signs this is already happening.

Such a disaster may finally make us question why we ever thought it was a good idea to so thoroughly segregate ourselves by age. If so, something good may come of this pandemic yet.

Stephen Mihm is an associate professor of history at the University of Georgia.