The robot car revolution hit a speed bump on Tuesday as senators and tech experts sounded stern warnings about the potentially fatal risks of self-driving cars. “There is no question that someone is going to die in this technology,” said Duke University roboticist Missy Cummings in testimony before the US Senate committee on commerce, science and transportation. “The question is when and what can we do to minimize that.”

Automotive executives and lawmakers sniped at each other over whether universal standards were necessary for self-driving cars, with the private sector saying that standards would slow progress and legislators replying that they’d heard the same objections over updated seatbelt standards in 1998.


Senators Ed Markey and Richard Blumenthal, who have cosponsored legislation that proposes minimum testing standards for automated drivers, told equivocating industry representatives to fall in line.

“If I asked somebody: ‘Do you think that that red light means stop?’ and they came back to me and said: ‘We have great respect for stoplights,’ I would say: ‘The answer is yes,’” Blumenthal told General Motors’ Michael Ableson. “The credibility of this technology is exceedingly fragile if people can’t trust standards – not necessarily for you, but for all the other actors that may come into this space at this point.”
Markey, in conversation with Delphi Automotive’s Glen DeVos, cut the executive off when he tried to answer a question about whether the industry would support a minimum legal standard by pointing to his company’s own high standards.

“I know you do, but not all the bad companies do,” said Markey, smiling. “We don’t pass murder statutes for our mothers. We do it for all the people who might commit murders.”
The standards are already becoming morally complex. Chris Urmson, Google X’s director of self-driving cars, said the company was trying to work through some difficult problems. Where should a car turn – toward the child playing in the road, or over the side of the overpass?

Google has come up with its own Laws of Robotics for cars: “We try to say, ‘Let’s try hardest to avoid vulnerable road users, and beyond that try hardest to avoid other vehicles, and then beyond that try to avoid things that don’t move in the world,’ and then to be transparent with the user that that’s the way it works,” Urmson said.

Cummings said the industry was by no means ready to start road-testing cars on public byways and that the rush to market would be very dangerous, in part because of technical limitations but also because of malice. “[W]e know that people, including bicyclists, pedestrians and other drivers, could and will attempt to game self-driving cars, in effect trying to elicit or prevent various behaviors in attempts to get ahead of the cars or simply to have fun,” she said.


The roboticist also decried what she characterized as the industry’s attempt to substitute public demonstrations for rigorous testing without making its own testing protocols available for public scrutiny.

“We know that many of the sensors on self-driving cars are not reliable in bad weather, in urban canyons, or places where the map databases are out of date,” said Cummings. “We know gesture recognition is a serious problem, especially in real world settings. We know humans will get in the back seat while they think their cars are on ‘autopilot’. We know people will try to hack into these systems.”