Andrew Moseman

The autonomous car (and plane, and swarm of flying robots) is coming. That much is clear. But what might be less clear to onlookers watching the rise of the unmanned vehicle is that there are very different types of autonomy in machines, according to three PopMech Breakthrough Award winners.

This afternoon at PM's home in the Hearst Tower, our Senior News Editor Joe Pappalardo sat down with Joseph Chody, chief engineer for the X-47B unmanned combat air system; John Capp, director of global active safety electronics and innovation at General Motors; and Vijay Kumar, a University of Pennsylvania mechanical engineer who designs fleets of autonomous quadrotors that work together. Here are a few things we learned:

You won't get the Jetsons car just yet.

Capp is a member of the team behind Super Cruise, a Breakthrough Award-winning suite of GM features including lane centering and collision avoidance. These are early steps, he says, but crucial ones. "It's a first generation of a future generation of autonomous systems. It's going to let people take their hands off the wheel on certain freeways. It's going to let them take their feet off the pedals," he said.

But, he says, we're nowhere near the Jetsons' car experience, in which the occupant simply pushes a button and the car takes them to work. In reality, the self-driving car will arrive a few pieces at a time. Current Cadillacs already carry a driver-assist suite with radar, cameras, and ultrasonic sensors to monitor or "see" the area around the vehicle. In that sense, the rise of autonomous systems isn't much different from the rise of earlier car technologies such as anti-lock brakes. "They intervene and help you drive safely and you don't even know it," he said.

The human factor means different things to different industries.

Because self-driving or self-flying vehicles will come in stages, they'll have to learn to play nice with humans. ("It's always harder as an engineer when you have to deal with humans," Chody said.) And that means something different for each of our experts.

Chody's X-47B made history this year when the unmanned jet landed on an aircraft carrier. "We are integrating ourselves into the deck of an aircraft carrier, which is one of the most dangerous environments in the world." Chody says the X-47B and future generations of autonomous planes must be able to work within the structure of normal Navy operations. "[These] young men and women have to have a structure," he said. "It's not our place to tell them to do it differently."

Kumar's cooperative fleet of small flying bots could go into a hazardous area, such as a building where a biochemical spill has occurred, before first responders arrive. And he doesn't want the human operator to be worried about which bot takes the lead—he wants the person to simply give the bots a command and let them work out how they execute it. "We're really looking for superhuman performance," he says. "We want a paradigm where the human doesn't know which robots are responding to which request."

When it comes to cars, Capp says, the key is to have autonomous systems not replace humans (at least not in the short-term future), but rather to take over in those instances when humans really don't want to drive, such as in traffic jams or monotonous commutes. "The fun isn't going away," he said.

Many moral and legal issues are yet to be ironed out.

Kumar knows his UAVs creep out some people. "We just love our robots. I don't know why people think they're scary. We made the James Bond video to dispel that myth that they're scary creatures."

He thinks part of the objection to drones is that the laws are not settled regarding integrating them into our airspace. But beyond that, there's a bigger, more philosophical question about how people feel about autonomous aircraft. Satellites take airborne pictures of our homes all the time, Kumar said. To many people, though, there's something more sinister about low-flying drones that might have cameras attached. "As a society we have to get used to that."

What about liability for a self-driving car—who's responsible when it crashes? Capp isn't too worried about it yet, saying that the legal system now in place for determining whether the driver or the car was responsible for an accident should still work. "If the computer's really doing everything, maybe then the law changes," he said. But expect some gray areas until that time. Capp says that until computers are ready to tackle all the challenges of the road, they might have to "tap the driver on the shoulder"—that is, warn the human occupant when a situation is too chaotic for its sensors to understand and ask the human to take over. What if the human driver doesn't?
