“This study concluded that DoD must accelerate its exploitation of autonomy—both to realize the potential military value and to remain ahead of adversaries who also will exploit its operational benefits,” the DSB study says. Machines and computers can process far more data, far more quickly, than humans can, “enabling the U.S. to act inside an adversary’s operations cycle.” That, the study argues, is why autonomy is “vital if the U.S. is to sustain military advantage.” Ruth David, of the National Science Foundation and coauthor of three books on signal processing algorithms, and retired Air Force Maj. Gen. Paul Nielsen co-authored the study.

Autonomy and human-machine assistance are, of course, core elements of the Pentagon’s Third Offset Strategy.

Gen. Paul Selva, vice chairman of the Joint Chiefs of Staff, repeated his cautious embrace of autonomous weapons today at the Center for Strategic and International Studies. Breaking D readers will remember his use of the wonderful term “Terminator Conundrum” to describe the ethical issues the military faces as it allows weapons to make battlefield decisions without a human in the loop. When I asked him today whether the US should pursue a treaty or other international restrictions on such weapons, he didn’t address the question directly.

Selva appeared to agree with Frank Kendall, the Pentagon’s acquisition chief, who worries that enemies will not care as much about the ethical niceties of allowing a robot to kill human beings. Selva said “there will be violators” of any agreement, just as there are with chemical weapons and other banned arms. Both Syria and Daesh (known to some as ISIL) have used chemical weapons this month.