The Department of Transportation says that it plans to issue a series of rules for self-driving cars that will potentially preempt state laws and regulations. This comes after lobbying by Google, which was disappointed when a state law it had supported led the California Department of Motor Vehicles to issue rules forbidding the use of cars that did not allow human drivers to override the computer. Since Google was planning cars that had no steering wheels or other controls for drivers to use, the state rule conflicted with Google’s goals.

The Antiplanner has argued against federal regulation, fearing that the feds would be as likely to get it wrong as the states, whereas if the states were left to regulate, at least a few would get it right and the others could emulate their examples. Federal regulation wouldn’t be bad if the rules were perfect, but how likely is that?

For a more detailed free-marketeer’s view of the Department of Transportation’s proposal, see Marc Scribner’s analysis. Here, I want to focus on one thing: the debate over fully autonomous vs. semi-autonomous vehicles.

The projected federal rules appear to strongly encourage Google’s vision of cars that completely take over driving from humans. However, they could discourage semi-autonomous cars that assist drivers without eliminating the option of human control. According to the White House, the rules will require the recall of any semi-autonomous cars that fail to “adequately account for the possibility that a distracted or inattentive driver-occupant might fail to retake control of the vehicle in a safety-critical situation”–in other words, the circumstances of the fatal Tesla accident.

The administration is thus taking sides in a debate that has been going on for a while in the self-driving research community. On one side are those who say that semi-autonomous vehicles are a necessary stepping stone toward fully autonomous ones, both because they will help manufacturers solve the remaining problems of self-driving cars and because they will help travelers get used to the idea of letting a computer take over the driving. On the other side are those who say that semi-autonomous vehicles will be dangerous because, when the computer encounters a situation it can’t handle, it cannot alert a human driver, who may be distracted by other things, quickly enough for that driver to take over.

While both sides have valid arguments, most agree that even semi-autonomous vehicles could be safer than human-driven cars. Even if there is a Tesla-like accident now and then, the total number of accidents and fatalities will be lower. (To be fair, Tesla’s car wasn’t semi-autonomous; it merely had some driver-assist features that were never supposed to let the driver stop paying attention to the road. Tesla says it will issue a software upgrade any day now.)

The safety part of the debate might be moot if Ford can keep its promise to have fully autonomous cars on the road by 2021, only a year after other manufacturers promised semi-autonomous cars. But it remains an open question whether Americans will readily buy fully autonomous cars that have no steering wheels or other controls without first having some experience with the semi-autonomous kind.

I suspect Ford’s fully autonomous cars will work only on roads that have been thoroughly mapped and entered into the vehicle’s software. Without human controls, those cars won’t be able to leave the mapped areas. This means the main market for such cars will be Uber-like companies that want to offer driverless taxi services.

The bottom line is that the administration seems eager to promote fully autonomous vehicles, but if it discourages the semi-autonomous intermediate step, it might actually delay the adoption of the fully autonomous sort. We’ll know more when the department releases the proposed rules themselves. Until then, never fear: even if self-driving cars are delayed, Walmart is planning self-driving shopping carts.