Until a self-driving Uber killed 49-year-old pedestrian Elaine Herzberg in March, autonomous vehicle tech felt like a pure success story: a hot new space where engineers could shake the world with software, saving lives and banking piles of cash. But after the deadly crash, nagging doubts became questions asked out loud. How exactly do these self-driving things work? How safe are they? And who’s to guarantee that the companies building them are being truthful?

Of course, the technology is hard to explain, much less pull off. That’s why employees with the necessary robotics experience are raking in huge paychecks, and also why there are no firm federal rules governing self-driving car testing on public roads. This fall, the Department of Transportation restated its approach to AVs in updated federal guidelines, which amounts to: We won’t pick technology winners and losers, but we would like companies to submit lengthy brochures on their approaches to safety. Just five developers (Waymo, GM, Ford, Nvidia, and Nuro) have taken the feds up on the offer.

Into this vacuum has stepped another public-facing metric, one that’s easy to understand: how many miles the robots have driven. For the past few years, Waymo has regularly trumpeted significant odometer roll-overs, most recently hitting its 10 millionth mile on public roads. It’s logged another 7 billion miles in simulation, where virtual car systems are run over and over through situations captured on real streets, plus slightly varied iterations of those situations (a technique called fuzzing). Internal Uber documents uncovered by the New York Times suggest the ride-hailing company tracked its own self-driving efforts via miles traveled. It’s not just companies, either: Media outlets (like this one!) have used miles tested as a stand-in for AV dominance.
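To make that fuzzing idea concrete, here’s a minimal sketch in Python. Everything in it—the scenario parameters, the jitter range, the names—is a hypothetical illustration of the general technique, not Waymo’s actual tooling: take one situation logged on a real street and replay thousands of slightly perturbed copies of it in simulation.

```python
import random
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Scenario:
    """One driving situation captured on a real street, reduced to a few illustrative parameters."""
    pedestrian_speed_mps: float    # how fast a crossing pedestrian is moving
    pedestrian_heading_deg: float  # the direction they're walking
    ego_speed_mps: float           # the AV's own speed

def fuzz(base: Scenario, n_variants: int, jitter: float = 0.1) -> list[Scenario]:
    """Produce slightly varied copies of a logged scenario.

    Each numeric parameter is nudged by up to +/-10 percent, so the
    simulator can rerun one real-world moment thousands of ways.
    """
    def nudge(value: float) -> float:
        return value * (1 + random.uniform(-jitter, jitter))

    return [
        replace(
            base,
            pedestrian_speed_mps=nudge(base.pedestrian_speed_mps),
            pedestrian_heading_deg=nudge(base.pedestrian_heading_deg),
            ego_speed_mps=nudge(base.ego_speed_mps),
        )
        for _ in range(n_variants)
    ]

# One logged near-miss seeds a thousand simulated variations of itself.
logged = Scenario(pedestrian_speed_mps=1.4, pedestrian_heading_deg=90.0, ego_speed_mps=11.0)
for variant in fuzz(logged, n_variants=1000):
    ...  # hand each variant to the virtual car system and score the outcome
```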

If practice makes perfect, the more practice your robot has, the closer it must be to perfect, right? Nope.

“Miles traveled standing alone is not a particularly insightful measure if you don't understand what the context of those miles were,” says Noah Zych, the head of system safety at the Uber Advanced Technologies Group. “You need to know, ‘What situations was the vehicle encountering? What were the situations that the vehicle was expected to be able to handle? What was the objective of the testing in those areas? Was it to collect data? Was it to prove that the system was able to handle those scenarios? Or was it to just run a number up?’”

Think about a driver's license exam: You don't just drive around for a few miles and get a certificate if you don’t crash. The examiner puts you through your paces: left turns across traffic, parallel parking, perfectly executed stop sign halts. And to live up to their promises, AVs have to be much, much better than the humans who pass those tests—and kill more than a million people every year.

Waymo, which has driven more miles than anyone and plans to launch a commercial autonomous ride-hailing service this year, says it agrees. “It’s not just about racking up number of miles, but the quality and challenges presented within those miles that make them valuable,” says spokesperson Liz Markman. She says Waymo also keeps a close eye on how many miles it’s driving in simulation.

Another safety benchmark used in media coverage and policy discussions of AVs is “disengagements”—that is, the moments when a car drops out of autonomous mode. In California, companies must note and eventually report every instance of disengagement. (They are also required to file an accident report for every crash, be it a fender bender, a rear-ending, or a pedestrian slapping the car.) Developers say disengagements are an even crappier way to measure safety than checking the odometer.
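The arithmetic behind that complaint is easy to see. California’s reports reduce to one headline rate, miles per disengagement, and the rate carries no information about where or why those miles were driven. A quick sketch with made-up numbers (no company’s actual figures):

```python
def miles_per_disengagement(miles: float, disengagements: int) -> float:
    """The headline rate from California's DMV reports: higher looks safer."""
    return miles / max(disengagements, 1)

# Two hypothetical fleets with identical headline numbers...
freeway_loops = miles_per_disengagement(miles=100_000, disengagements=10)
downtown_rush = miles_per_disengagement(miles=100_000, disengagements=10)

# ...both report 10,000 miles per disengagement, though one drove empty
# freeways and the other dense city streets. The metric can't tell them apart.
print(freeway_loops, downtown_rush)
```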