The U.S. campaign against the use of inhumane weapons began not with the interwar debates about the prohibition of chemical weapons after the Somme and Verdun, nor with the U.S. Senate's refusal to ratify the 1925 Geneva Protocol on chemical weapons, but with the arrival of the atomic age. After Hiroshima and Nagasaki, Americans viewed the bomb as the fruit of their own ingenuity, its use justified as the only means to end World War II at a minimal cost to American lives. In a poll conducted by Fortune magazine four months after the bombings, more than four out of five respondents approved of them, with nearly one in four wishing more bombs had fallen. (One woman explained: There “may be innocent women and children, but they only in my opinion breed more of the same kind of soldiers.”)

John Hersey’s issue-length account of six individuals struggling to survive in the ruins of Hiroshima, published in The New Yorker just over a year after the bombing, sought to subvert that callous narrative. Hersey set part of his piece in the city’s Red Cross hospital, where the city’s shattered residents shuffled like zombies after the bombing. He concluded with a quotation from the diary of 10-year-old survivor Toshio Nakamura: “Next day I went to Taiko Bridge and met my girlfriends Kikuki and Murakami. They were looking for their mothers. But Kikuki’s mother was wounded and Murakami’s mother, alas, was dead.”

By humanizing the victims, Hersey helped inaugurate a reappraisal of total war’s morality; gradually, American approval of the atomic bombings declined. When former Secretary of War Henry Stimson told Harper’s magazine the following February that he and President Harry Truman had believed that dropping the bomb on Japan would provide “an effective shock [that] would save many times the number of lives, both American and Japanese, than it would cost,” he implicitly conceded Hersey’s basic point: human lives, regardless of nationality, were the only measure by which to judge such instruments of death.

American attitudes toward nuclear weapons would shift, however, after plans to place atomic energy under international control fell apart with the onset of the Cold War. When the Soviet Union entered the nuclear club in 1949, Truman refused to heed the counsel of the Atomic Energy Commission’s General Advisory Committee not to build the hydrogen bomb, which the body had likened to “a weapon of genocide.” Phrases such as “massive retaliation,” “brinkmanship,” and “megadeaths” entered the strategic lexicon as the nuclear arsenal mushroomed from 1,000 to 24,000 warheads under Dwight Eisenhower. The muscular grammar of deterrence, credibility, and risk-taking initiated a contest to see which politician could talk the toughest.

But Eisenhower saw the dangers that lay ahead. In his 1953 “Chance for Peace” speech, he called military spending “a theft,” complaining that “[t]he cost of one modern heavy bomber is this: a modern brick school in more than 30 cities.” In his farewell address, he warned that “we cannot mortgage the material assets of our grandchildren” to feed the country’s growing military-industrial complex. Ironically, Eisenhower’s messages of caution, delivered in the language of domestic protection, laid the groundwork for what was to come.