Why We Should Be Even More Nervous About AVs

Most Americans worry about sharing the road with still-untested autonomous cars — but even perfecting the tech may not make AVs safe enough.

A staggering 85 percent of Americans aren’t comfortable with the idea of a driverless car pulling up next to them at a light or passing them in a bike lane — but 68 percent also think better vehicle-safety standards will make autonomous vehicles safe enough for our streets, a new survey shows.

Advocates aren’t so sure about that — and they’re trying to convince lawmakers that we can’t afford to be hands-off about the hands-free-driving revolution.

The survey was released by the Advocates for Highway and Auto Safety as part of an effort to get Congress to impose tighter regulations on the budding autonomous-car industry. The group’s president, Cathy Chase, reported its findings to a congressional committee hearing on Tuesday.

The survey didn’t ask why, exactly, Americans are anxious about AVs, but its results suggest some possibilities.

Questionable safety exemptions

AV makers have gotten exemptions on some federal vehicle-safety standards imposed on regular cars; 63 percent of survey respondents said they didn’t want to see those exemptions expanded any further.

For example, driverless delivery vehicle maker Nuro was recently granted a temporary exemption to deploy a car with a hard plastic face instead of a windshield made of laminated safety glass, which is designed to yield on impact; advocates argue that the company hasn’t adequately tested whether the design would be more lethal for a pedestrian in the event of a crash.

AV manufacturers also aren’t federally required to include event-data recorders in autonomous vehicles, as Jeffrey Tumlin, director of transportation for San Francisco’s Municipal Transportation Agency, told the panel. The “black boxes” record information from vehicle sensors immediately before a crash, and they’re almost standard on non-autonomous cars.

Untested tech — and unwitting guinea pigs

The technology that drives AVs remains substantially untested, even as 29 states have allowed autonomous vehicles on the road. Of those 29, 12 states have permitted the full deployment of driverless cars without a human being in the “driver’s” seat as a failsafe, and another eight have permitted on-road tests.

“Driverless cars are being developed and tested on public roads without sufficient safeguards to protect both those within the AVs and everyone sharing the roadways with them, and without express consent,” Chase said at the hearing. “[Our organization] is very concerned that an artificial rush to pass legislation, fueled by AV manufacturers wanting to be the first to market and recoup their substantial investments, already surpassing $100 billion, could significantly undermine safety as well as public acceptance and the ultimate success of these vehicles.”

The top Republican on the hearing panel, Cathy McMorris Rodgers, argued that pushing untested AVs onto the road was simply the price Americans must pay for progress — and that, if we didn’t, the industry might lose market share to China, which is aggressively testing driverless cars as we speak. But others said that a “space race” on AV development wasn’t worth it — especially if it costs human lives.

“Congress has been so excited about helping to deploy new automated vehicle technology that they have forgotten that rushing AVs to market without building public trust and support is more likely to jeopardize the viability of this industry,” said Transportation for America President Beth Osborne.

No vision tests for a ‘driver’ without eyes

One simple test that AVs have yet to pass is the robotic-car equivalent of the vision test that’s mandatory for licensing human drivers.

Automated cars don’t need to be able to “see” the road so much as they need to be able to detect, recognize and respond appropriately to objects and people in their path, under all possible road conditions. Yet it takes a long time to teach a computer how to recognize even the most common roadway hazards, much less the infinity of possible objects that might drift into a car’s path. Uber famously deployed a driverless car in Arizona that couldn’t recognize the shape of a pedestrian unless he or she appeared in a crosswalk; it struck and killed a woman, Elaine Herzberg, who attempted to cross mid-block because the closest intersection was over 360 feet away.

Americans seem to think that pedestrian-detection systems should be held to the same standard as human beings; 76 percent of the survey’s respondents agreed that an AV should be able to pass the machine equivalent of the vision tests that drivers take.

But even if an AV did ace such a test, that wouldn’t necessarily mean it’d be safe on the road — because a computer will simply never be able to see the world as clearly as a human being, some advocates argue.

“AVs will have their own issues, because they’re stupid machines that can only react and branch off decision trees based on how they’re programmed, and the chaos and uncertainty of the real world can throw any number of baffling situations to them that a human wouldn’t even worry about for a moment,” Jason Torchinsky, author of a book on autonomous vehicles, wrote recently in Jalopnik. “Dirty sensors, odd reflected light, confusing billboards, clouds of smoke or dust, unpredictable animal or human behaviors, a bunch of paper blowing in the wind, any of these things can completely lock up even the most advanced AV humans have built.”

Autonomous cars are still cars

Even if AVs somehow gained flawless pedestrian-detection technology, fast-reacting brake systems and impact-absorbing windshields, they still wouldn’t be good for road safety.

AVs are still 2,500-pound steel machines that can kill or maim human beings, especially at high speeds. Even if an AV were so perfect that it would hit someone only when he or she sprinted into its path, making it physically impossible for the car to stop in time, that still wouldn’t be good enough. Moving through public space in a way that a motorized, 2,500-pound computer can’t predict should not be a crime punishable by death.

If our goal is to end pedestrian fatalities, our answer cannot be more cars with better — or even perfect — safety features. We should be looking to undo the dependency on automobiles that got us into this mess in the first place. Which will take a whole lot more than a robot on wheels.