Gradually, perhaps imperceptibly, automated systems will function so much more efficiently that humans will become mere bystanders. The soldier will become the slowest element in an engagement, or will simply become irrelevant. Adherence to the rules of war will become less relevant as well.

A separate set of ethical questions is raised by the technologies of human “enhancement” and augmentation, which include improving physical strength, stamina and pain tolerance, as well as using neurological implants and stimulation to restore brain function and enhance learning.

Can soldiers under the influence of behavior-modifying drugs or electronics be held to account for their actions? If a soldier is using drugs to enhance his cognition or reduce his fear, what is the role of free will? Might a soldier who fears nothing needlessly place himself, his unit or innocent bystanders at risk? What about the impact of memory-altering drugs on the soldier’s sense of guilt, which might be important in judgments about superfluous injury and unnecessary suffering?

These are important decisions in war, and they form the basis for many of the tenets of “just war” theory. Gen. Paul Selva, the vice chairman of the Joint Chiefs of Staff, supports “keeping the ethical rules of war in place lest we unleash on humanity a set of robots that we don’t know how to control.”

The work of revising and recasting these conventions should be taking place at the highest levels of government. So far, it hasn’t. The White House’s Select Committee on Artificial Intelligence, formed in May, has not even acknowledged the major ethical issues surrounding A.I. that have been very publicly raised by a growing number of scientists and technology experts like Elon Musk, Bill Gates and Stephen Hawking.

While it is important that leaders openly recognize the critical nature of these issues, the Department of Defense needs to follow up on its 2012 directive on autonomy with guidelines for researchers and commanders. It should require that both researchers and military commanders ask — throughout the development process and long before the systems are ready for deployment — how the systems will be used and whether that use might violate any of the laws of armed conflict or international humanitarian law.

Historically, the United States has led the world in technology development, and so our use of questionable weapons or methods will be closely noted by others. Sadly, in the period after the Sept. 11 attacks, the United States resorted to torture of enemy detainees. While most senior leaders have denounced the practice, the fact remains that the nation crossed an important moral threshold. Knowing that, future enemies — even civilized ones — may be less inhibited in employing the same methods against us. The same will be true for advanced technologies.

With new warfare technologies, we now have an opportunity to again demonstrate our leadership in human rights by ensuring that our young soldiers know how to use the new weapons in ethical and humane ways.