BRANDT AYERS

THE ANNISTON STAR

Having recently joined an outward-looking stock advisory group, last night I dreamt my advisers had whisked me past the twilight zone, past Tier Zero into the realm of Deep Learning.

“Where am I?” I asked a nearby algorithm with a friendly face. He sized me up and with a tired patience replied, “You have entered the kingdom of artificial intelligence.”

Then I woke up, happy to be in a warm bed in the familiar surroundings of my own bedroom. If you, like me, were challenged by high school algebra, you will understand how bewildered I’ve been, wandering on the far-out planet of AI.

From what I can make of the place, it looks fascinating — revolutionary, about to change everything we think and do in warfare, commerce, manufacturing and even sports.

As I understand it, Tier I is the smartphone; Tier Zero is the future; and AI is where machines are taught to be much smarter than human beings — not wiser or more empathetic, just smarter.

Deep Learning is similar to the process of teaching a child to identify a car, then a truck, only quicker and vastly more complicated. Computer chips have been engineered to accept and process billions of images of trucks.

Artificial intelligence was used in Desert Storm to track the movements of Saddam Hussein’s elite Republican Guard. Recently, in urban fighting against ISIS, an Air Force fighter dropped a smart bomb on a rooftop, destroying the building but leaving the structures on either side unharmed.

In Arab countries where fighters’ faces are covered, AI can be used to distinguish the eyes of enemies from those of friends. Imagine how AI could be used in aerial combat to anticipate the movement of enemy aircraft, or of ships at sea.

Walmart already uses a simple robot to tell when the supply of hair tonic has run out and to order replacements. Several weeks ago, I checked Amazon to see how my book was selling. I was surprised to learn the company was urging potential buyers to order immediately because only one copy remained in stock. My publisher informed me that was a marketing ploy; some additional copies were in the pipeline.

Of course, there are ethical questions raised by AI, especially in combat and law enforcement disciplines, and there are scary scenarios.

The basic ethical question for the military and law enforcement is: Should machines be allowed to kill? Machines already do; drones kill, but only on the orders of a human.

Should they be allowed to fire deadly weapons on their own? In certain defensive situations, the answer might be, yes. For instance, a sophisticated marine robot could scout the location of enemy forces, programmed to return fire if fired upon.

Similarly, a law enforcement robot might hunt a maniacal killer and fire if it were threatened. It is conceivable that in certain highly dangerous situations, a robot could be programmed to initiate hostile fire.

Only a foolish commander would risk such a valuable asset in any but the most dire circumstances.

Society will have to be alert to the potentially evil applications of artificial intelligence. For instance, ironclad protocols will have to be devised to prevent the engineering of super-intelligent children and the rise of superhuman hostile nations.

The technology will also have to be kept out of the hands of clever enemies such as ISIS, and the world will have to enforce those prohibitions.

In benign peacetime situations, the uses of AI could be infinite. Think of how an imaginative educator could use AI in classroom settings. And imagine an AI refrigerator that prints out a grocery list of every item, from eggs to mushrooms, that is running low.

In medicine, robots are already performing surgeries, and in time they might be able to cure cancer and other diseases by telling good germs from bad ones and destroying only the bad.

It would not surprise me to learn that Alabama coach Nick Saban had used AI to analyze dozens of Florida State game films, revealing every tendency of the Seminoles in pass-defense and offensive situations. Professor Alan Fern at plucky Oregon State has already been using football-focused AI.

Our educational system will have to adapt, retraining workers replaced by machines for the expanding new industry of the care and maintenance of robots.

A bewildering future is crowding in on us, one that will bring dramatic changes, but it will be humans, with a sense of what the law and humanity can tolerate, who will determine whether we use it for good or evil.

H. Brandt Ayers is the publisher of The Anniston Star and chairman of Consolidated Publishing Co. Write to him at P.O. Box 189, Anniston, AL 36202-0189.