Science supported this idea. Beginning in the 1980s, a number of studies of marathon and ultramarathon runners had found that many of them reported developing colds in the days and weeks immediately after their race. Their incidence of illness was much higher than among their nonrunning family members or the general population.

With those findings as a backdrop, other scientists began to look at the workings of the immune systems of athletes during and after draining events. Their research showed that changes occurred, some of them drastic. During an event such as a marathon, for instance, immune cells would begin to flood the bloodstreams of the athletes, apparently flushed there from other parts of the body as heart rates rose and blood sluiced more forcefully through various tissues.

By the time the race ended, the runners’ bloodstreams would teem with extra immune cells.

But within a few hours, the numbers of many such immune cells in the bloodstream would crash, researchers found, typically falling to levels far lower than before the event.

The scientists interpreted these findings to mean that the runners’ physical exertions had killed large numbers of their immune cells and created what some researchers dubbed an “open window” of immune suppression that could allow opportunistic germs to creep in, unopposed.

That idea became established doctrine in exercise science and sports.

But recently, health researchers at the University of Bath in England grew skeptical. From an evolutionary standpoint, they reasoned, immune suppression after strenuous exercise made little sense. Early humans often had to chase prey or flee predators, opening themselves to injury. If they experienced a weakened immune response at the same time, they would have been in serious jeopardy.