The future of war could be drastically different from the wars of today, and we'd have artificial intelligence (AI) to "thank" for it. I put "thank" in quotes because the potential of AI war machines is truly scary, and it makes me ill to think about. Closing my eyes, I can envision future news stories of AI-controlled drones and similar equipment taking many innocent lives.

I might sound crazy, but I have Stephen Hawking, Elon Musk, and Steve Wozniak, among myriad other brilliant scientists and researchers, to back me up. Decades ago, AI-controlled war machines were restricted to the movies. Over the years, countless jokes have been made about "Skynet" being created, and here we are, at a time when it's actually possible.

Future of Life, a group of AI and robotics researchers, has been at the forefront of explaining to the world what could happen if we allow autonomous combat. You might recall us mentioning Future of Life back in January, when Elon Musk donated $10 million in the form of grants to help get some serious research going.

In an open letter, the group explains that autonomous weapons could become the "third revolution" in warfare, after gunpowder and nuclear arms. Unlike nuclear arms, the components required to build them are not difficult to source. If Amazon can create drones smart enough to deliver packages, it's not outside the realm of possibility that governments would create drones to mow people down. Heck, even regular citizens have affixed handguns to four-rotor drones.

The letter goes on to say that just as chemists and biologists have no interest in seeing their work used to kill, the same goes for AI researchers. None of them want to see their field tarnished as a result, a backlash that could also stifle the development of AI for beneficial purposes.

An important point raised in the letter is that if one major country begins developing autonomous weapons, it will inevitably trigger an arms race, and the result will be far from pretty.

"In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control."

You can read the full open letter at the URL below, and if you happen to be involved in any type of scientific work or education, you can add your own name to the list of signatories.