
Lyric of the week:

No one likes us

I don’t know why

We may not be perfect

But heaven knows we try

But all around

Even our old friends put us down

Let’s drop the big one

And see what happens

— Randy Newman, “Political Science”

We are rightly focused on the consequences of having hard-to-imagine destructive force at the voice command and finger-tweeting powers of any single individual. This circumstance is impossibly absurd. We’ve always known this, and now we understand it all too well.

Legislation has been proposed by Congressmen Adam Smith and Ted Lieu, Senator Ed Markey and others to engage other responsible figures in the most consequential, time-constrained decision making that any human being could possibly face. Requiring the addition of two more individuals to the decision-making process for the first use of nuclear weapons ought to be an easy sell. It’s essential in my view and in the view of Richard Betts and Matthew Waxman, Bruce Blair and others. This isn’t a partisan issue. Nor is it a deterrence issue; we’re talking, after all, about first use, not about retaliation. It’s an issue that begs us to apply plain and simple common sense.

Herb York has written powerfully on this subject, extending the first use dilemma to second use — a dimension of the problem that most of us tend to avoid. But even second use raises deeply disturbing moral dilemmas about vengeance and escalation control, which helps to explain why we rely so heavily on deterrence: that way, we don’t have to dwell on either the first or second use dilemmas. It would be better for more than one human being to be in this loop, and it would be worse to leave this decision to machines that are also prone to error.

Herb’s book, Race to Oblivion: A Participant’s View of the Arms Race (1970), offers sound warnings and advice. When the answers to a terribly hard problem are unacceptable, Herb tells us that we are probably investigating the wrong problem. If the prospects of first and second nuclear use appear to be unacceptably bad, it stands to reason that addressing the conditions for “no use” requires greater attention.

These requirements are political as well as military. Indeed, without the “political” piece, fulfilling military requirements will be insufficient, expensive, and dangerous. The political piece used to be called nuclear arms control, terminology that is now out of fashion and is in the process of being trashed, thanks to the decisions by George W. Bush, Vladimir Putin and Donald Trump. The task of filling the void left by treaty trashing lies before us.

Here’s what Herb had to say about the first and second use dilemma:

“The steady advance of arms technology may be leading us not to the ultimate weapon but rather to the ultimate absurdity: a completely automatic system for deciding whether or not doomsday has arrived.

“To me, such an approach to the problem [of launch on warning] is politically and morally unacceptable, and if it really is the only approach, then clearly we have been considering the wrong problem. Instead of asking how Minuteman can be protected, we should be asking what the alternatives to Minuteman are. And, much more important, instead of blithely moving toward a balance of terror of a still more gruesome and precarious sort, we should be asking what the alternatives to a balance of terror might be.

“The theoretical alternative is to require that a human decision-maker at the level of ‘the highest authorities’ be introduced into the decision-making loop. But is this really satisfactory? We would be asking that a human being make in just a few minutes a decision to utterly destroy another country. (After all, there is no point in firing at their empty silos.) If, for any reason whatever, he were responding to a false alarm or to some kind of smaller, perhaps ‘accidental’ attack, he would be ensuring that a massive deliberate attack on us would take place moments later. Considering the shortness of time, the complexity of the information and the awesomeness of the moment, the President would himself have to be properly preprogrammed in order to make such a decision.

“Those who argue that the command and control system is perfect or perfectible forget that human beings are not.”