The good news is that more firms are making serious investments in “safety culture”—a concept rooted in the idea that humans, like machines, can be organized optimally to create any kind of outcome. That could be profit or efficiency. Or it could be keeping people alive.

“‘Safety culture’ is when you don’t let things slide,” says Philip Koopman, who studies safety in autonomy at Carnegie Mellon University. “If your self-driving system does something unexpected, just one time, you drill down and you don’t stop until you figure out why, and how to stop it happening again.” This sounds simple, but tracking down every last little, sometimes inconsequential bug takes a heap of time and money. You can find this sort of safety culture at work in factories, the oil industry, and hospitals. But the best example—the one especially relevant to a human-toting technology—comes from the sky.

Since the late 1960s, the American airline industry has cut its fatality rate in half. Until an engine blew on a Southwest flight this spring, killing a woman, no one had died on an American commercial jet in eight years. The impressive record has a few explanations, ones that can be replicated. For one, internal auditors oversee many elements of aircraft construction and programming to ensure a particular level of safety. For two, the industry makes great use of checklists, a way to ensure that everyone is paying attention and staying on task. And for three, airlines and aircraft designers don't compete on safety. They share knowledge. In the US, a secure third-party contractor facilitates data sharing between airlines and aircraft designers on everyone's mistakes. If a plane crashes, the entire sector finds out why, and gets the information it needs before the same thing happens again.


To make good on their talk of this sort of safety culture, self-driving companies have turned to a classic Silicon Valley trick: poaching safety talent. Waymo hired a former chair of the National Transportation Safety Board. Starsky Robotics hired its first head of safety in the spring, a drone and aviation industry vet with experience working with the Federal Aviation Administration. Uber signed on another former NTSB chair, a man with a background in aviation, to advise the company on its “safety culture.” And Zoox has Rosekind, who headed up the National Highway Traffic Safety Administration and has three decades of experience in human fatigue and transportation. Zoox also has another aviation veteran: Gonzalo Rey, its vice president of system design, who most recently managed 1,200 workers as engineering lead and then CTO for aerospace company Moog.

Meanwhile, companies and observers are floating more ideas about maintaining safety—which are also bids to keep the industry in the public’s good graces. The companies building AVs could all agree to use the sort of vehicle engineering safety standards that have already been devised for electronics in the automotive industry, with some tweaks for self-driving. (The standard, called ISO 26262, establishes a framework for building and documenting software safety systems.) There have also been rumblings about creating some sort of platform that would allow self-driving developers to share data and lessons, like they do in aviation.

And there's talk about improving the information that some companies have shared, documents called “voluntary safety self-assessments.” The federal government politely requested that companies start putting out these letters last fall, with details on how their engineers approach safety. (As of this week, seven companies have published VSSAs.) But thus far, they’ve been criticized as glossy brochures, better for marketing than informing. (Seltz-Axmacher calls them “marketing copy.”)

As for some sort of test that a company might pass in the future? At Zoox, they like the idea of asking autonomous cars to do exactly what normal ones can’t: show that their sensor and computer setups can eliminate 94 percent of crashes—the share, according to NHTSA, that is due to human error. “Everybody's kind of waving their hands trying to make new things up, but nobody's actually showing how they might apply their systems to get the level of safety right,” says Rosekind, still animated. Self-driving is new, and will require new ways of thinking about transportation, as well as labor and public space. Safety, on the other hand, doesn't have to be built from scratch.
