“I am literally willing to put hundreds of thousands of people at risk, knowing that millions and millions of people will be at risk if we don’t,” Graham said.

A decade and a half ago, the Bush administration built a much more public and concerted case for war with Iraq than the Trump administration so far has for war with North Korea. But it marshaled some of the same arguments. U.S. officials claimed (largely without basis, it turned out) that Saddam was amassing chemical and biological weapons and seeking nuclear weapons—and that his exceptionally reckless and hostile regime could either use those weapons, provide them to terrorists, or exploit them as “blackmail.” Iraq’s arms buildup was presented as a gathering storm that years of diplomacy had failed to dispel; the Bush administration cautioned that as the skies grew darker, America’s options for defending itself and the world narrowed as well.

“We are now acting because the risks of inaction would be far greater,” Bush asserted 48 hours before invading Iraq. “In one year, or five years, the power of Iraq to inflict harm on all free nations would be multiplied many times over. … In the 20th century, some chose to appease murderous dictators whose threats were allowed to grow into genocide and global war. In this century, when evil men plot chemical, biological, and nuclear terror, a policy of appeasement could bring destruction of a kind never before seen on this earth. Terrorists and terrorist states do not reveal these threats with fair notice in formal declarations. And responding to such enemies only after they have struck first is not self-defense. It is suicide.”

There are numerous differences between the grounds for war with Iraq then and North Korea now; North Korea, for instance, actually has a robust stockpile of nuclear, chemical, and biological weapons. But the common theme is a concept that was forged in the trauma of the September 11 attacks and clearly retains some appeal: that there are circumstances in which the United States must take military action not to respond to an attack or pre-empt an imminent one, but to prevent a threat from materializing in the future. For decades after World War II, U.S. policymakers shunned this sort of “preventive” war because of its associations with the aggression of Nazi Germany and Imperial Japan, as my colleague Peter Beinart has written. But that shifted with the end of the Cold War, as America grew more confident in its military strength, and especially after 9/11.

Following those attacks, Bush was determined to prevent anything like them from happening in the future. “On 11 September, 2001, America felt its vulnerability,” Bush observed months before ordering the assault on Iraq. “We resolved then, and we are resolved today, to confront every threat, from any source, that could bring sudden terror and suffering to America. … Facing clear evidence of peril, we cannot wait for the final proof—the smoking gun—that could come in the form of a mushroom cloud.” This was the intellectual framework for a war unlike any the United States had fought in the past. The mere prospect of its enemies striking first was now sufficient justification for America to take the opening shot instead.