There is little doubt that the technology behind driverless cars is nearly advanced enough for mainstream use. Google plans to make its biggest public display yet of its cars on Tuesday, when it takes reporters on spins around Mountain View, Calif. Carmakers like BMW and Toyota are also preparing to sell cars that drive themselves.

The bigger question about driverless cars, then, is a legal one. Who is responsible when something goes wrong?

Driverless cars are supposed to be much safer than cars driven by people because they don’t make human errors. But accidents seem inevitable. What happens when a driverless car kills someone? Or, less drastically, who pays the ticket when the car fails to notice a no-parking sign, or when an error in Google Maps sends it the wrong way down a one-way street?

As robots become mainstream, lawmakers will have to grapple with how to govern machines and hold software accountable. Only four states and the District of Columbia have passed laws specific to driverless cars, some just allowing manufacturers to test cars and none answering every legal question that might come up.