This story was originally published by Food and Environment Reporting Network.

Antibiotic-resistant infections—everything from gastrointestinal illnesses to recurring urinary tract infections and staph—are among the most menacing issues in public health today, sickening 2 million people a year and killing at least 23,000, according to the Centers for Disease Control and Prevention. So perhaps it’s not surprising that government has begun to take steps to limit antibiotics in animal agriculture, where many of these infections arise, before they wreak further havoc in humans.

Ever since antibiotics were invented, their remarkable ability to conquer disease has overshadowed the fear that overuse and misuse could produce infections resistant to the powerful drugs. By the 1950s, McKenna tells us, meat processors dipped chicken carcasses in antibiotic slurries, a process known as “acronizing,” to keep meat “fresh” for weeks. This was marketing-speak for sanitizing tainted meat from sick birds. At the same time, the widespread use of antibiotics in livestock feed was propelled by the discovery that animals would grow much faster on the drugs, boosting efficiency and lowering costs.

As early as 1956, however, antibiotic-resistant infections in humans were linked to farming practices. McKenna tells the long-forgotten story of a physician at the Seattle Department of Public Health who was asked to look into an epidemic of staph infections among blue-collar workers. The resistant infections appeared random until he discovered that all the men worked at the same poultry processing plant, where “poor-quality” chickens riddled with abscesses and pus from infections were dipped in an antibiotic solution. The bath killed all bacteria except the resistant strains of staph that had evolved on farms, where well-meaning farmers had fed their poultry antibiotics. Those bacteria then infected the workers, causing severe lesions that no antibiotic could treat.

Two decades later in Massachusetts, an experiment (funded by the meat industry) definitively proved that chickens given antibiotics produced resistant bacteria—within two weeks. Those bacteria were transferred to the people taking care of the chickens, in this case a family with a backyard farm. But McKenna points out that this carefully constructed study never got the attention it deserved because the family did not get sick; they simply carried the resistant strains in their bodies. Industry said, in effect, “No problem!”

But in other places sickness lurked and spread, notably in the United Kingdom, where a spate of deadly resistant infections led to government action. In 1977, FDA commissioner Donald Kennedy decided to follow the UK’s lead and ban growth-promoting antibiotics from American agriculture because of the health implications. Animal feed manufacturers and livestock producers responded by pressuring Congress, which in turn threatened to cut the agency’s budget. Kennedy backed down. One determined lawmaker even inserted a clause into an appropriations bill that blocked any restrictions on the use of antibiotics in livestock until he was convinced of the risks to public health. The lawmaker, Rep. Jamie Whitten of Mississippi, never was convinced, and he renewed the rider every year until he retired in 1995. As McKenna writes, “Whitten’s obstinance on behalf of agriculture cemented the security of farm antibiotic use in the United States.”

Even during this period of resistance from lawmakers and others, the evidence of health problems with antibiotic use continued to mount. In 1984, a study appeared that linked antibiotic-resistant infections and deadly food-borne illnesses to antibiotic use on farms. The infections passed from animals to people via meat, and then from person to person. Other earlier experiments had shown how snippets of DNA migrated among these organisms, so that bacteria that were resistant to one drug could suddenly develop resistance to multiple drugs. Doctors became nearly powerless in the fight against these resistant infections, as one drug after another became impotent. Yet even as the body of science expanded, it wasn’t until 2000 that the FDA took its first tentative steps to withdraw one type of antibiotic from use by livestock producers. Big Pharma promptly sued the agency.

Fast-forward to the Obama administration, when the FDA finally moved to restrict antibiotics through voluntary measures. Coming after years, really decades, of concern, the momentous decision felt a little anticlimactic. And in a way, it was. By then, McKenna explains, the most forward-thinking producers, like Perdue and Bell & Evans, were already moving away from antibiotics. With much improved farming practices, such as cleaner barns and better care of chicks even in industrial facilities, the drugs no longer provided the efficiency bump they once did. In short, the main rationale for using them, faster weight gain, was gone. European producers had discovered the same thing.

While much of the livestock industry still uses growth-promoting antibiotics, the trend in wealthier nations is evident. Consumers, and the corporations that serve them, want meat produced without antibiotics. So industry has begun to change its model.

But the cautionary tale here is that government was consistently behind the curve, never ahead of it. The lack of restrictions on antibiotic use in livestock meant that agencies like the CDC had to work heroically to identify and contain deadly infections that were regularly sickening and killing many thousands of people. But these were Band-Aids applied in moments of crisis. Industry consistently undermined any big policy changes, which left the CDC disease detectives as the last line of defense. It raises the question of whether expectations for government-led action on this or any other public health issue—let alone climate change—are misplaced. Perhaps consumer demand and industry innovation are the most potent levers after all.