DETROIT: Uber, eat your heart out. While the private sector struggles to perfect self-driving cars that can operate safely on well-mapped and well-maintained roads, the Army’s Ground Vehicle Systems Center here has developed a standard set of software and sensors that can turn both wheeled and tracked vehicles into off-road robots.

“Our best day is their worst day,” said Bernard Theisen, a senior GVSC engineer, “because we’re going to go in assuming no comms, potentially no GPS, and the terrain that was there yesterday may not be there today.” A dirt road may be washed out by flooding in a disaster relief operation, for example, or cratered by roadside bombs in a war zone. And that’s assuming there ever was a road to where you need to go, never a safe assumption in the Army.

Sure, Theisen told me in an interview ahead of the AUSA Autonomy & Artificial Intelligence Symposium here, “we are looking to leverage” civilian innovation wherever possible. The auto industry – including the Big Three giants just a short drive from GVSC’s main campus – has invested billions in self-driving cars and generated lots of technology the Army can adopt and adapt for its purposes, from collision-avoidance algorithms to cheap, miniaturized sensors.

“Pretty much all the driver warning/driver assist features… we’re basically buying from the commercial automotive industry and adapting them to the military,” Theisen told me. “That’s one of the nice things about being in Detroit.”

But civilian vehicles, whether truly self-driving or simply equipped with driver-warning features like lane departure warning, can rely on marked lanes, paving and curbs, and access to GPS and mapping applications. In fact, one of the big attractions of 5G networks is that their much greater bandwidth might finally make self-driving cars safe on the open road.

By contrast, a military vehicle, whether it’s a supply truck or a main battle tank, might not even have a map to rely on. In fact, the software GVSC has developed includes an exploration mode where the vehicle starts with no map data at all and slowly, cautiously explores the area and builds its own map – without a human being involved. This mode isn’t enabled on all vehicles, but it’s shown promise in experiments.

“Right now, we’re not giving the systems a lot of a priori data. Basically we’re letting them figure out their environments on their own, because … we assume we might not have GPS,” Theisen said. “We could drop a robot in the middle of nowhere with no information [and it would] start building the map.”
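To illustrate the idea, here is a rough sketch of "frontier" exploration on a grid map: the robot senses its surroundings, drives to the nearest known-free cell that borders unexplored space, and repeats until nothing unknown remains. This is my own toy example, not GVSC's software; the grid world and all names are invented.

```python
from collections import deque

UNKNOWN, FREE, WALL = "?", ".", "#"

def sense(world, grid, pos, radius=1):
    """Reveal cells within sensor range of the robot (a stand-in for lidar)."""
    r, c = pos
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(world) and 0 <= cc < len(world[0]):
                grid[rr][cc] = world[rr][cc]

def nearest_frontier(grid, pos):
    """BFS through known-free cells to the closest cell bordering unknown space."""
    q, seen = deque([(pos, [pos])]), {pos}
    while q:
        (r, c), path = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if not (0 <= rr < len(grid) and 0 <= cc < len(grid[0])):
                continue
            if grid[rr][cc] == UNKNOWN:
                return path          # current cell sits on the frontier
            if grid[rr][cc] == FREE and (rr, cc) not in seen:
                seen.add((rr, cc))
                q.append(((rr, cc), path + [(rr, cc)]))
    return None                      # no frontier left: the map is complete

def explore(world, start):
    """Drop the robot in with no map data and let it build its own map."""
    grid = [[UNKNOWN] * len(world[0]) for _ in world]
    pos = start
    sense(world, grid, pos)
    while (path := nearest_frontier(grid, pos)) is not None:
        pos = path[-1]               # drive to the frontier...
        sense(world, grid, pos)      # ...and scan again
    return grid

world = ["....#",
         "..#.#",
         "....."]
mapped = explore([list(row) for row in world], (0, 0))
```

Each loop iteration reveals at least one previously unknown cell, so the robot is guaranteed to finish with a complete map of everything its sensors can reach.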

The GVSC has even done experiments where one unmanned vehicle explores the area and shares its map data with others, he said, although that’s still very early research. But there’s long-term potential to (for example) let loose a swarm of expendable scout robots in an urban area to carefully map the roads, alleys, even interiors and underground tunnels, before human troops have to enter these potential ambush zones.

Levels of Autonomy

Even when a vehicle is following a human-designated route over mapped terrain, Theisen told me, it still needs to be able to adapt to the unplanned. Faced with an obstacle blocking their path, the robots know how to back up and try another way around. If they get truly stuck – which happens – they can call a human to take over by remote control. And if they’ve lost their link to a human controller, depending on how much freedom you give the software, they can proceed on their mission or return to the last location where they had a connection.
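That fallback logic (back up at an obstacle, call a human when stuck, head back toward the last connection when the link drops) amounts to a small state machine. A minimal sketch, mine rather than GVSC's: the mode names and the `may_continue_offline` flag are invented stand-ins for "how much freedom you give the software."

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()      # proceeding along the planned route
    RETRY = auto()           # backing up to try another way around an obstacle
    TELEOP = auto()          # a human has taken over by remote control
    RETURN_TO_LINK = auto()  # lost comms: head back to the last known connection

def next_mode(*, obstacle, stuck, have_link, may_continue_offline):
    """Pick the vehicle's next behavior from its current situation."""
    if stuck:
        # Truly wedged: ask a human to take over, if anyone can hear us.
        return Mode.TELEOP if have_link else Mode.RETURN_TO_LINK
    if not have_link and not may_continue_offline:
        # Not trusted to press on alone: backtrack toward comms coverage.
        return Mode.RETURN_TO_LINK
    if obstacle:
        return Mode.RETRY    # back up and try another way around
    return Mode.AUTONOMOUS
```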

The exploratory/mapmaking mode is probably the closest to true autonomy the Army has developed. Most other modes require some level of human supervision:

The most basic mode is driver warning, which is found on many civilian cars today: A human is in the vehicle and in control, but the computer interprets data from sensors mounted around the vehicle and warns the human when they’re about to hit something. This is particularly helpful for driving cumbersome supply trucks, laden forklifts, and tracked vehicles around motor pools and supply depots, where traditionally a “ground guide” has to walk in front of the vehicle giving directions.

The lowest level at which the computer can actually take control of the vehicle – temporarily – is driver assistance: When the sensors and collision-avoidance software detect an imminent impact, instead of just warning the driver, they automatically hit the brakes. Again, this is a feature found on many civilian vehicles, but it’s particularly attractive to the Army, which relies on sleep-deprived 18-year-olds to drive multi-ton vehicles full of fuel and explosives in the dark.

The simplest form of unmanned operation, without a human aboard, is teleoperation. Essentially, this is remote control, where a human is controlling every aspect of the vehicle’s operation, but they’re not physically inside it. Instead, they’re receiving sensor data and sending commands over a wireless link. That link can be anything from line-of-sight radio, with a range of a few kilometers, Theisen told me, to a satellite link that lets operators at the GVSC in Detroit drive robots in Australia. Since looking at screens doesn’t give you as much sensory information as physically being in a vehicle, and network lag slows your reactions, teleoperation works best with driver warning and assistance enabled.

The vehicle becomes truly self-driving when it uses waypoint navigation. In this mode, a human being sets the destination, and may add intermediate points along the route as well. The computer figures out its own route from point to point to point, using much the same kind of map application you’d find on your smartphone. Then the vehicle drives itself along that route, avoiding pre-designated no-go zones, maneuvering around unexpected obstacles, and even trying alternative routes if needed. In the civilian world or on a military base, an early use for waypoint navigation might be a shuttle bus that travels the same path over and over. Even in a war zone, it could be used to send a truck down a secure supply route, send a damaged vehicle back to a repair point, or transport injured soldiers to an aid station when there are no able-bodied troops to spare to take them back.

One of the most sophisticated modes is a hybrid of human and computer control called leader-follower. First, a human controller selects vehicles to form a convoy: The designated robots can maneuver out of their parking spaces and line up in a column or other formation on their own. Then a human gives orders to the lead vehicle – either by physically getting in it and driving, or teleoperating it by remote control, or even by giving it waypoints to follow autonomously – and the rest of the convoy follows. If the convoy runs into an obstacle, the human controller can order the whole formation to reverse, backing up precisely in their own tire tracks, or have them all do U-turns (autonomously) and switch control to whichever vehicle is now in front.
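The leader-follower behavior, trailing the vehicle ahead and reversing precisely in one's own tire tracks, can be pictured as a breadcrumb trail that each vehicle records and its follower replays. A toy illustration of that idea, my own sketch and not the GVSC implementation:

```python
class ConvoyVehicle:
    """Toy leader-follower: each vehicle records the track it drives,
    and its follower replays that track, breadcrumb by breadcrumb."""

    def __init__(self, name, position):
        self.name = name
        self.position = position
        self.track = [position]      # "tire tracks": every point driven over

    def drive_to(self, point):
        self.position = point
        self.track.append(point)

    def follow(self, leader, gap=2):
        """Stay `gap` breadcrumbs behind the leader, on the leader's own track."""
        target = max(0, len(leader.track) - 1 - gap)
        self.position = leader.track[target]
        # Extend our own track so the vehicle behind us can trail us the same way.
        self.track.append(self.position)

    def reverse_in_tracks(self):
        """Back up precisely along our own recorded tire tracks."""
        if len(self.track) > 1:
            self.track.pop()
            self.position = self.track[-1]

lead = ConvoyVehicle("lead", (0, 0))
trail = ConvoyVehicle("trail", (0, 0))
for x in range(1, 6):
    lead.drive_to((x, 0))
    trail.follow(lead, gap=2)
```

Because every vehicle keeps its own breadcrumb trail, the "reverse the whole formation in its own tire tracks" order is just each vehicle popping points off its trail in turn.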

Trucks & Tracks

So what can the Army make self-driving with this tech? Pretty much anything with wheels or tracks, it turns out, from the ubiquitous Humvee, to cargo handlers like forklifts, to the British Army’s German-made HX60 trucks, to the venerable M113, a Vietnam-era tracked armored transport now being used as an experimental surrogate for the future Robotic Combat Vehicle. (Human controllers follow the M113s in modified M2 Bradleys.) All told, Theisen said, the GVSC has installed versions of its autonomy software and sensors on over 20 different types of vehicles.

Now, each of these installations requires some unique hardware connections to let the computer actually control the vehicle, Theisen said, which involves installing what’s known as drive-by-wire electronics instead of traditional mechanical and hydraulic controls. (One day drive-by-wire may come standard on cars and trucks, but for now no mass-produced vehicle has it).

The British Army lorries were particularly tricky, Theisen said, requiring some mechanical adaptations as well as electronic ones. “The steering wheel’s on the opposite side,” he noted, “[and] the UK trucks were made by MAN, which is a German company, so we ran into some new interesting braking systems we never really saw before, and some new interlocks with the transmission.”

Different kinds of vehicles and different missions also require different types of sensors installed at different locations to ensure 360-degree coverage: The M113, for instance, needs extra sensors because it’s so boxy. And not all vehicles have all modes enabled. The M113s can only do teleoperation or waypoint navigation right now, Theisen told me — but they use the same “navigation box” that’s installed on Oshkosh’s 10-wheeled Palletized Load System (PLS) heavy trucks.

“We run the same software at GVSC across all our programs,” he said. “The same types of sensors, the computers, algorithms.”

How can you get the same software to drive a 4×4 Humvee, a 10-wheeler, and a tracked M113? You need a different sensor to measure the speed of a wheeled vehicle than a tracked one, Theisen told me. You also need to tell the software how big the vehicle is – say, those boxy M113s — and how fast it can go. But the obstacle detection and navigation systems are the same.
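In other words: one codebase, many vehicle profiles. A hedged sketch of what that parameterization might look like, with field names and numbers that are purely illustrative (the actual GVSC configuration isn't public; the dimensions below only approximate published specs):

```python
from dataclasses import dataclass

@dataclass
class VehicleProfile:
    """Per-platform parameters handed to the shared autonomy software."""
    name: str
    length_m: float
    width_m: float
    max_speed_mps: float
    tracked: bool        # tracked vehicles measure speed differently than wheeled ones

    def clearance_needed(self, margin_m=0.5):
        """How wide a gap must be for this hull to fit through, with safety margin."""
        return self.width_m + 2 * margin_m

def path_is_passable(gap_m, profile):
    # One obstacle-avoidance rule, parameterized by vehicle size:
    # the planner code is identical, only the profile changes.
    return gap_m >= profile.clearance_needed()

humvee = VehicleProfile("HMMWV", length_m=4.6, width_m=2.2,
                        max_speed_mps=31.0, tracked=False)
m113 = VehicleProfile("M113", length_m=4.9, width_m=2.7,
                      max_speed_mps=18.0, tracked=True)
```

A 3.5-meter gap, for instance, would pass the Humvee's clearance check but fail the boxier M113's, even though both run through the exact same `path_is_passable` logic.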

Now, Theisen cautioned, just because the Army’s using the GVSC system on the M113s to try out unmanned tactics, that doesn’t mean the same system will go on the future Robotic Combat Vehicles when they enter service in the late 2020s. Even on the M113s today, he said, the navigation software and sensors are a totally separate system from the weapons controls, which require human approval to open fire.

The Army is already fielding supply trucks with leader-follower capability to operational units, however. Recently, 30 modified PLS trucks were issued to a transport unit at Fort Polk, La. A second unit at Fort Sill, Okla., will start getting its 30 trucks in January.

The soldiers are still driving the PLS trucks manually, with only the driver-assistance features enabled. But the Army’s working on official safety certifications to allow them to try out the leader-follower mode in January – with a human on each vehicle to take over, just in case – and then ramp up to fully unmanned operations by the end of 2020. No less a figure than then-Army Secretary Mark Esper, now Secretary of Defense, has praised the program as “critical.”

How hard is it to learn to operate these high-tech systems? The soldiers from Fort Sill spent about two weeks trying out the PLS trucks at Camp Grayling, Mich., where the Army’s also testing the Robotic Combat Vehicles, Theisen said. “It’s not a formal training process,” he said, “but we got the soldiers there Monday morning, and they did three hours’ worth of classes and three hours on the vehicle, training, and they were able to operate all these behaviors.”