The brave new world of self-driving cars means addressing some thorny issues—such as, should people be allowed to ride in autonomous cars after they've had a few drinks?

An Australian government advisory board just released a report on how regulations may need to be updated for self-driving cars, including how drunk-driving laws may need to be changed. The National Transport Commission (NTC) wants to develop a full set of regulations covering all aspects of autonomous-car operation by 2020.

Letting a drunk person ride in a fully autonomous car doesn't seem like much of an issue, but under the current rules, merely entering a vehicle, even a self-driving one, could lead to drunk-driving penalties. In some Australian states, simply starting a car with the intent to put it in motion while intoxicated is a punishable offense. The rules may need to be rewritten to allow people to start a fully autonomous car, the NTC argues.

The government should want people to take advantage of fully autonomous cars instead of endangering themselves and others by getting behind the wheel after drinking, the NTC notes. But people may be reluctant to use self-driving cars if they can still be penalized for drunk "driving."

Another issue is liability. It's expected that the majority of self-driving cars will be used by ride-sharing services, and the operators of those services could be made liable in the event of a crash. But the picture is less clear for privately-owned self-driving cars. If one of these cars crashes and the person who owns it wasn't driving, are they responsible?

The liability question is probably one of the biggest regulatory issues surrounding self-driving cars in any country. Without a human driver, it becomes less clear who is to blame when an autonomous car crashes.

It's also worth noting that not every car will be fully autonomous. It's one thing to let an intoxicated person get into a car capable of driving itself 100 percent of the time. But cars with lower levels of autonomy will still rely on human drivers: an SAE Level 3 car requires the driver to take over when the system requests it, and even a Level 4 car only drives itself within certain conditions or areas. That needs to be made clear to both the public and regulators in order to avoid some potentially dangerous ambiguity.