The federal government says that when it comes to regulating self-driving cars, computers and software systems can be considered the "driver" of the vehicle. It's a major moment in the effort to introduce autonomous driving to America's roads, which is hampered as much by regulatory questions as by technological hurdles. But really, it's just a starting point for figuring out how to update the arcane labyrinth of rules that govern how our cars work now.

The news came in a letter from the branch of the Department of Transportation that handles car regulations, the National Highway Traffic Safety Administration (NHTSA), to Google, which had asked for clarification on rules that pertain to the self-driving car it's developing, which lacks a steering wheel and pedals. "NHTSA will interpret 'driver' in the context of Google's described motor vehicle design as referring to the SDS [self-driving system], and not to any of the vehicle occupants," the feds' letter to Google says. "If no human occupant of the vehicle can actually drive the vehicle, it is more reasonable to identify the 'driver' as whatever (as opposed to whoever) is doing the driving."

This is undeniably a big moment in the fight to take the human out of the driver's seat—or in Google's case, to obliterate the very notion of a driver's seat altogether. But it immediately raises questions that are tougher to answer.

"This letter will one day be in a museum of technology history—or at least legal history," says Bryant Walker Smith, an assistant professor at the University of South Carolina School of Law and affiliate scholar at the Center for Internet and Society, who studies self-driving vehicles. But "where technology meets law, the devil is in the details."

And boy, are there a lot of details. Title 49, Subtitle B, Chapter V, Part 571 of the Electronic Code of Federal Regulations, “Federal Motor Vehicle Safety Standards,” is a thousand-page monster that lays out, with excruciating precision, the standards manufacturers must follow for any passenger car (or bus, or motorcycle) they intend to sell. Those rules are "arguably anachronistic," Walker Smith says, but that doesn't mean they're easy to update.

Defining the car's operating system as its driver brings up two big problems. The first is that many of NHTSA's regulations explicitly refer to human anatomy. The rule regarding the car's braking system, for example, says it "shall be activated by means of a foot control." The rules around headlights and turn signals refer to hands. NHTSA can easily change how it interprets those rules, but there's no reasonable way to define Google's software—capable as it is—as having body parts.

All of which means, the feds "would need to commence a rulemaking to consider how FMVSS No. 135 [the rule governing braking] might be amended in response to 'changed circumstances,'" the letter says. Getting an exemption to one of these rules is a long and difficult process, Walker Smith says. But "the regular rulemaking process is even more onerous."

The second problem is that the feds don't have the tools they need to test new systems. The rule concerning rear visibility, for example, requires that the vehicle "display a rearview image (of a specified area of certain dimensions behind the vehicle) to the vehicle operator." NHTSA's happy to say that the operator is in fact a pile of software, and there's no talk of eyeballs here.

The problem is that the feds don't have an established way to verify that that image is making its way to the nonhuman operator. It "would be unable to conduct confirmatory testing," the letter says. Even if NHTSA wants to certify Google's car for use by the public, it will first have to create a new process for making sure the vehicle works as mandated.

The good news in both cases is that the feds are being surprisingly flexible in how they handle the emergence of autonomous driving. NHTSA's letter to Google repeatedly says the tech giant may want to ask for exemptions to its rules (though it also says, in footnotes, that Google may "wish to reconsider its view" on eliminating human-operated controls). Last month, Secretary of Transportation Anthony Foxx said NHTSA will be more willing to grant those exemptions, including ones that completely remove the role of the human. Foxx also announced he's told his colleagues at the DOT that they have six months to draft comprehensive rules governing how autonomous cars should be tested and regulated. (It's a quick timeframe, likely linked to the fact that Foxx is expected to leave office when Obama's second term ends.)

That's not to say this will all be settled so easily. Those rules from the DOT will really be model legislation. The feds regulate how cars are made, but it's the states that get to decide how they behave, via traffic laws. Then you still have to figure out how to decide who's liable in case of an accident, which is a whole other issue.

So yeah, this is gonna take a while. But the fact that the feds are so open to the idea—and that they're willing to count computers as humans—is a big step forward.