Research shows driverless cars could eliminate more than 90% of the 35,000 road deaths in America each year. Yet the Florida accident poses the two biggest questions about driverless cars, formally known as autonomous vehicles.

● How will the public react to deaths in driverless cars as opposed to road deaths caused by humans, even if there are far fewer of the former? Put another way, will emotions outweigh statistics?

● What’s the best road to automotive autonomy, “L3” vehicles that can mostly drive themselves, or fully driverless “L4” cars in which steering wheels, brake pedals and other human controls are superfluous?

Tesla expressed sorrow at Brown’s death, but defended its technology. “The data is unequivocal that Autopilot reduces driver workload,” it declared in a corporate blog after the accident, “and results in a statistically significant improvement in safety when compared to purely manual driving.” Not in this case.

On May 7, Brown, a former Navy SEAL from Ohio, was entering an intersection with his car in Autopilot mode when a tractor-trailer turned left across his path. Neither the Autopilot system nor Brown detected “the white side of the tractor trailer against a brightly lit sky,” the Tesla blog explained.

Brown’s car passed under the trailer, the bottom of which smashed the windshield and killed Brown; the truck’s driver barely noticed anything. Reports quickly appeared that Brown was watching a Harry Potter movie on a DVD player at the time -- an act of startling irresponsibility, if true. The investigation continues.

The accident shows automakers need to slow their push toward automotive autonomy, Mary “Missy” Cummings, head of Duke University’s Humans and Autonomy Laboratory, told The Detroit News. She had delivered the same message to a U.S. Senate committee hearing last March.

A similar view came from Harald Kruger, chief executive officer of BMW. “Today (driverless) technologies are not ready for serious production,” he said, according to The New York Times. BMW’s first driverless system won’t be deployed until 2021, Kruger added. The Los Angeles Times, in an editorial, opined: “It’s time to tap the brakes on self-driving cars.”

Consumer Reports, with a circulation of 8 million, has called on Tesla to stop calling the system Autopilot and to disable it until the program is updated to require drivers to keep their hands on the wheel. In response, Tesla said it is constantly improving Autopilot so that drivers are safer with it than without it. A U.S. Senate committee has asked Tesla to testify about its system.

Autopilot lets drivers switch in and out of driverless mode with the flick of a switch. It is “still in a public beta phase,” Tesla explained after the accident, though the name might be taken to imply more than that. The company added that drivers are warned, in an acknowledgement box that must be checked before activating the software, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times.”

Lawyers will decide what that warning means for Tesla’s legal liability, if any. Meanwhile, the accident fuels the “L3 versus L4” debate among driverless-car experts.

The American government classifies just about everything -- strength of hurricanes, grades of frozen carrots -- and driverless cars are no exception. L0 cars, the lowest level, have no self-driving features. Levels 1 and 2 add driver aids such as cruise control and more advanced versions that keep a safe following distance. L3 cars can mostly drive themselves, but require human drivers to take over in emergencies. L4 vehicles are fully autonomous. They don’t need steering wheels, brake pedals or any human controls.

Tesla’s Autopilot isn’t a full L3 system, but it’s as close as anything on the road. Mercedes-Benz, Audi and BMW -- Kruger’s “not ready” statement notwithstanding -- also have systems that allow partial self-driving, for example at low speeds in traffic jams.

The L3 argument is that L4 autonomy isn’t yet fully reliable, so human drivers must remain ready to intervene in emergencies. That is certainly the view of Tesla and most other automakers.

L4 advocates retort that it isn’t realistic to expect human drivers to respond instantly in emergencies after long stretches of inactivity or inattentiveness -- which L3 systems encourage. Some YouTube clips show people pushing Autopilot too far or having, well, adjustment issues (watch the frightened granny in “Tesla on Autopilot”).

Even if drivers behave responsibly, L4 advocates argue, humans simply can’t re-engage quickly.

Google is “definitely an L4 company,” one executive told me. That’s also the position of some car companies, including Volvo and Ford, whose global research vice president is a former senior aerospace executive. If professional airline pilots can’t instantly re-engage with their aircraft in an emergency, he explained when I interviewed him, how can ordinary drivers?

The L3 answer to that is DSS, or Driver State Sensing: sensors inside the car to detect when the driver’s attention is waning and activate a buzzer or bell. But being spied on inside your own car might strike people as freaky. Could DSS really mean Driver Spy System?

L3 versus L4 isn’t the entire debate. Toyota espouses a third approach it calls “Guardian Angel”: people would drive, but the car would take control when sensors indicate an accident is imminent.

Raising another issue, Stanford University legal fellow Jerry Kaplan wrote in The Wall Street Journal that driverless cars need roads designated only for them. “As with the ‘horseless carriages’ of the early 1900s, which at first were merely added to the jumble of pedestrians and carts swarming through the streets,” Kaplan wrote, “the real benefits of the new technology won’t be realized until we see substantial changes in our transportation infrastructure.”

Federal officials must decide among these approaches, not all of which are mutually exclusive. The technology is moving faster than the regulators.