As we all know, making cars is hard, to say nothing of making cars that drive themselves, which is exponentially harder. I’d like to think we at Jalopnik have done an OK job illustrating for you, our dear readers, just how hard making autonomous cars has proven, but just in case we haven’t, here’s another go at it.

The Washington Post made a swell interactive about how autonomous cars work and what can cause them to, well, not work. Using AV disengagement reports from the California DMV as well as “voluntary reports from AutoX, GM Cruise, Nvidia, Uber, Waymo, Nuro and other self-driving vehicle companies that describe their autonomous technology and safety standards,” the Post put together a little interactive game about what trips up AVs and why the technology still requires so much human intervention.

It’s a lot of fun. It shows the limitations AVs currently face:

And puts the onus on you, a human with eyes and a brain, to potentially avert disaster.

Among the examples: a school bus blocking the sensors, pedestrians mistaken for cyclists (and vice versa), and a car needing to pull over ahead of an incoming thunderstorm.

Overall, the game is a neat little reminder that AVs remain a very difficult problem to solve, and their inability to make complex inferences from their surroundings is turning out to be a much bigger handicap for effective driving than many technologists previously thought.

Three to five years ago, many AV boosters thought we’d have self-driving cars available for widespread use by now. Instead, we have about 1,500 on the road, all still undergoing testing, and most experts think we’re decades away from commercially available full autonomy. In the immortal words of Elon Musk, humans are underrated.


Give it a try here.