Whether it’s the person sitting in the vehicle or the one who designed it, where responsibility lies when a robot car commits a traffic offence is not always clear

When a California cop pulled over a Google self-driving car for holding up traffic this week, he knew he couldn’t send its robot driver to jail. But exactly where the responsibility lies for traffic problems caused by autonomous vehicles is not always so clear.


For minor offences like speeding or parking tickets, the person sitting behind the wheel is almost certainly going to face the music, says Bryant Walker Smith, an assistant professor in the School of Law at the University of South Carolina and an expert on self-driving car law: “Under existing law, someone who is most immediately and obviously the operator of the vehicle would likely be treated as the driver.”

The US states that have explicitly regulated autonomous vehicles so far – California, Nevada and Michigan – all require a responsible human safety driver ready and able to take over immediately if something should go wrong. But that is not the case in Texas, where Google’s prototype self-driving vehicles have also taken to the road.

If Google decided to start picking up random people on the street in Austin and giving them rides (there’s no evidence that this is happening), those passengers probably would not have any legal liability for everyday traffic tickets. “In that case, I’d expect some enterprising manager at the state highway patrol to send the ticket direct to Google,” says Smith.

The legal situation would be murkier if a self-driving car were, for example, to veer wildly into oncoming traffic and cause a serious or fatal accident. Criminal charges like vehicular manslaughter require negligence or malice on the part of the driver. “If the person in the driver’s seat was doing everything right, monitoring and observing properly, they would not have the mental culpability for manslaughter,” says Smith.

The maker of the robo-car itself would also be unlikely to face criminal charges, thinks Smith. In the US, corporations are rarely prosecuted in criminal courts. In the most famous example, Ford was indicted in 1978 on reckless homicide charges for not fixing a known problem with its Pinto vehicle that resulted in dozens of deaths; the company was acquitted.

But that doesn’t mean a deadly self-driving car would get off scot-free. The driver and the owner of the car would almost certainly face huge lawsuits and, says Smith, “in the case of these autonomous vehicles, most certainly the developer or manufacturer could be liable”. Ford ended up paying tens of millions of dollars in damages, and recalling the Pinto.

So far, however, self-driving cars have shown themselves to be exemplary drivers. Google’s cars have travelled 1.2m miles without so much as a traffic ticket, although they do seem prone to fender-benders. In response to yesterday’s traffic stop, Google wrote: “We’ve capped the speed of our prototype vehicles at 25mph for safety reasons. We want them to feel friendly and approachable, rather than zooming scarily through neighborhood streets.”

That explanation probably isn’t the whole truth. Before its koala-faced prototypes, Google operated a fleet of self-driving modified Prius and Lexus vehicles at higher speeds around Mountain View and farther afield, including on motorways.

In fact, the 25mph speed cap is more likely down to Google wanting to test its prototype on public roads without undergoing the expensive and time-consuming crash and safety tests required of most new cars. With a 25mph limit and an all-electric powertrain, the prototype qualifies as a neighborhood electric vehicle (NEV), a category designed to allow golf carts and low-emission runabouts on public roads. California law does, however, limit such vehicles to roads with speed limits of 35mph and below, to avoid just the kind of tailback the prototype caused yesterday.