Picture a globe, which I'm now going to spin clockwise with my hand. It blurs for a few seconds (North America, Asia, North America, Asia), and then I stop it on the morning we're all heading back to our jobs. For me, that morning will start with hopping onto the 405 freeway, all smiles as I merge into that wall of creeping cars again. I miss you, brake lights.

On day two, though, the smile will have collapsed into a pinched line, and I'll be punching dashboard buttons to turn on Autopilot or ProPilot Assist or Super Cruise, if I have them, to soften the spike in my traffic stress. So will a lot of other people sprinkled all over the road around me. Some of them, bad actors.

YouTube is rife with misbehavior in Teslas: the guy asleep at the wheel, the "driver" lounging in the back seat, the guy and gal in, um, the front seat (to which Musk tweeted, "Turns out there's more ways to use Autopilot than we imagined").

Hardy-har-har. Except not.


Before the Great Calamity started, the Senate Commerce Committee grilled NHTSA on Autopilot abuse, and it got so heated that Senator Ed Markey of Massachusetts demanded that Autopilot be deactivated until Tesla figured out how to stop people from tricking its steering with water bottles and oranges. (Yes, eye-tracking would help, but aimed eyes don't mean a focused mind.) And in its investigation of Apple engineer Walter Huang's tragic Model X crash, the National Transportation Safety Board scolded Tesla and NHTSA, and even CalTrans for not promptly repairing a previously damaged crash barrier.

On the other hand, one of Lex Fridman's MIT YouTube interviews drew an interesting response from guest Sebastian Thrun (an expert's expert: winner of the 2005 DARPA Grand Challenge, founder of GoogleX and of Google's self-driving program, now Waymo; he currently leads flying-car company Kitty Hawk). Thrun said, "I literally use Autopilot every day, and it's kept me safe … specifically for highway driving when I'm slightly tired."

These are two colliding galaxies: on one side, Autopilot's abusers on YouTube, angry senators, and a riled NTSB; on the other, technology experts and devotees who sing its praises. But the problem isn't the system. It's the behavior of the people behind the wheel. Us.

In 2018, 36,560 Americans died on our roads, and autonomous technology could eventually save more lives than seat belts and air bags combined. Most of the players developing the new tech rely on massive simulation and some private track testing, sprinkled with real-world miles driven by trained safety drivers to capture the freakier edge cases.

In run-and-gun Elon Musk style, Autopilot is instead bootstrapping itself from Level 2 toward near-autonomy: it recognizes and uploads the things it gets wrong on real highways (via driver interventions or reports), uses machine learning to improve its algorithms, then sends over-the-air updates that change the whole fleet's behavior. As Fridman points out, this is by far the greatest meeting of humans and artificial intelligence on Earth.
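To make that loop concrete, here's a minimal sketch in Python. Every name in it (Car, Model, fleet_learning_cycle, and their methods) is a hypothetical stand-in for illustration, not Tesla's actual software; it just traces the cycle the paragraph describes.

```python
# Conceptual sketch of the fleet-learning loop; all names are
# hypothetical stand-ins, not Tesla's actual code.
from dataclasses import dataclass, field

@dataclass
class Car:
    pending: list = field(default_factory=list)  # interventions not yet uploaded
    model: "Model" = None

    def interventions_since_last_upload(self):
        cases, self.pending = self.pending, []
        return cases

    def apply_ota_update(self, model):
        self.model = model  # the whole fleet picks up the new behavior

@dataclass
class Model:
    version: int = 0

    def retrain(self, edge_cases):
        # Stand-in for the real machine-learning step that folds the
        # fleet's mistakes back into the driving algorithms.
        return Model(version=self.version + 1)

def fleet_learning_cycle(fleet, model):
    # 1. Driver interventions flag the moments Autopilot got wrong.
    edge_cases = [c for car in fleet for c in car.interventions_since_last_upload()]
    # 2. Machine learning turns those edge cases into a better model.
    model = model.retrain(edge_cases)
    # 3. An over-the-air update sends the new behavior to every car.
    for car in fleet:
        car.apply_ota_update(model)
    return model
```

The point of the design is the flywheel: the more cars on the road, the more edge cases get captured, and the faster every car improves.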


A couple of years ago, I was part of a small squad of journalists serving as safety drivers behind the wheel of an autonomous Audi A7 that traveled from Silicon Valley to Las Vegas. Before we got into the car, we had to complete a long driver-training day at a Volkswagen proving ground (mainly to prove we could regain control if things went crazy), after which the California DMV interviewed us by phone to make sure we were stable individuals before letting us loose on public roads. We were issued California autonomous driver's licenses and took all of this very seriously. Folks in Teslas are now experiencing the same autonomous driving I did, but without any of that preparation or soberness.

My proposal: Before Autopilot can be engaged, a Model 3's occupant-facing camera should identify who's behind the wheel. Then the car's screen should deliver realistic expectations of what the system can do, warnings about situations it might mess up, and a reminder of the good the driver can do today.
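A minimal sketch of how that gate might look, assuming a hypothetical face-recognition call and screen API; none of these names come from Tesla's software, and the messages are just examples of the tone I have in mind.

```python
# Hypothetical pre-engagement gate for Autopilot; recognize_face(),
# show(), and KNOWN_DRIVERS are illustrative, not Tesla's actual API.

KNOWN_DRIVERS = {"kim"}  # drivers who have acknowledged the briefing

def allow_autopilot(cabin_camera, screen) -> bool:
    driver = cabin_camera.recognize_face()  # occupant-facing camera ID
    if driver not in KNOWN_DRIVERS:
        screen.show("Autopilot unavailable: driver not recognized.")
        return False
    # Realistic expectations, failure warnings, and a nudge to stay engaged.
    screen.show("Autopilot assists with steering and speed; it does not drive itself.")
    screen.show("Watch for faded lane lines, glare, and stationary vehicles.")
    screen.show("Stay alert: you can save a life today.")
    return True
```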