Before Joshua Brown was killed in his Tesla Model S while allegedly watching a movie on Autopilot, I had a conversation with my friend, Comms.

“The first person to kill someone in a Tesla on Autopilot,” Comms said, “is going to be responsible for 340,000 deaths.”

Comms is an old friend working in communications for a major automotive manufacturer. He’d just spent an hour failing to convince me Elon Musk was the modern Preston Tucker, but I couldn’t argue with his newest line of reasoning.

“Nonsense,” I said. “It’s great. I know its limitations.”

But he was right. I did almost kill 340,000 people the last time I drove a Tesla on Autopilot. It was amazing how close I came. There they were, lined up on both shoulders of the Interstate like luminous bowling pins waiting to be mowed down. I remembered how well Autopilot worked, and wanting to close my eyes, or watch a movie, or open my laptop and answer emails.

“It doesn’t matter,” said Comms. “If you don’t kill someone, someone else will.”

“Maybe.”

“Definitely. Every time you try to set a Cannonball record on Autopilot, you run the risk of an accident that will set back the whole industry ten years. It’s dangerous—”

“So if Autonomous Driving is supposed to cut fatalities by 90%, and 38,000 Americans were killed last year, if I become that guy I’m responsible for—”

“Killing 34,000 people. Every year. For ten years. 340,000 deaths.”

“Legally, the driver is still responsible.”

Comms smiled. “Do you really think that matters?”

It was a pretty convincing position. So convincing that I promised myself I wouldn’t become that guy, and that the next time I set Autopilot to 90 mph I would follow the car’s warnings to the letter.

"The system is in beta. Be prepared to take over at any time. Pay attention."

That's right: pay attention. Then tragedy struck, because a Tesla owner named Joshua Brown didn't.

Whose Fault Was It?

This one's easy. If you believe in a nanny state whose logical conclusion is Wall-E, the fault lies with Tesla. If you believe in personal responsibility, it was Joshua Brown's. We may never know for sure, but you don't need to be Nostradamus to know pretty much how this went down. You just need to have spent more than the length of a press junket using Autopilot, both to fall in love with it and to see precisely how it was likely to go wrong. And did.

I hate to say it, but it was probably Brown's fault. How can I be so callous? Because I've driven thousands of miles in a Tesla on Autopilot, I was on the team that set the EV and Semi-Autonomous Cannonball Run records in a Model S, I know exactly what it's like to have faith in a technology so brilliantly executed, and I know how to read a police report and use Google Maps.

Let's Get Started

"Nobody knows anything," said William Goldman of the film business, but he could have been talking about the media's coverage of the world's first Autonomous Driving fatality. I've read no more than a handful of intelligent articles on the tragedy. The rest have been the standard bukkake of clickbait reposts. Googling "Tesla accident death" yields 200,000+ stories with Driverless or Self-Driving in the headline.

Guess what? A Tesla with Autopilot isn't a Self-Driving Car. It operates at what's called Level 2 Autonomy, and Brown—an ex-Navy SEAL, tech executive and self-proclaimed Tesla evangelist—must have known this better than anyone. According to NHTSA, Tesla, and anyone who has ever enabled Autopilot via the Tesla UI and used it for more than sixty seconds, it may disengage at any time, and the driver must be ready to take over.

Let's Go To The Map

This was not a complex accident. Brown was headed eastbound on US-27A, traveling somewhere between 65 and 90 mph—Autopilot's upper limit. The speed limit is 65 mph. Weather conditions were perfect. There was no indication the Tesla's brakes were applied before initial impact with a tractor-trailer truck. The police report describes the truck executing a left turn to head south on 140th Court when it was struck mid-trailer by Brown's eastbound Tesla. Here's what it looks like in Google Maps, with just over 1200 feet separating the point of impact from the left edge of the map. There is a small rise approximately 600 feet west of the point of impact, just east of 138th Terrace:

Here’s what it would have looked like from Brown’s POV, 1200 feet from the point of impact. The crest is visible halfway to the point of impact:

Here’s what it would have looked like from Brown’s POV, 700 feet from the point of impact, from the top of the crest, just west of 138th Terrace:

Was It Fate?

If there is such a thing as an unavoidable accident, this would not appear to be one of them. A white tractor-trailer preparing to make a turn would be visible even over the crest at a distance of 1200 feet, and almost doubly so once Brown crested the rise at a distance of 700 feet. Is it possible the truck turned at the last possible second? Of course, but the point of impact suggests the truck had almost completed its turn. No one has suggested the truck stopped mid-turn, which means it was still in motion. A tractor-trailer is a large object, nearly impossible for the human eye to miss even at a distance of a quarter mile.

Was it the trucker's fault? Unlikely. Let's give Brown the benefit of the doubt and assume he was going 60 mph. What is the stopping distance of a Tesla Model S at 60 mph? 108 feet. If Brown had been paying attention—whether or not Autopilot had been enabled—he would have had more than enough time to stop the car, had he chosen to. If one of the witnesses is correct and Brown was traveling at 90 mph, he still would have had nine seconds and 1200 feet during which he could have stopped the car, or at least slowed it enough to make the impact easily survivable.

But wait, is it possible Brown had a reasonable expectation that his Tesla would "see" the truck and Automatic Emergency Braking would engage? Reasonable, sure. But Brown—an avid Tesla fanboy—surely knew Autopilot's limitations, especially that Autopilot likes to brake late, and aggressively. If my Tesla were approaching a white truck crossing the highway perpendicular to my path—and I was doing 90, or even 60—I wouldn't wait for the Tesla to react. My foot would naturally move to the brake pedal. But that could only happen if I was paying attention.

The lack of evidence of any braking or steering inputs suggests Brown never saw the truck. He couldn't have missed it had he been looking up, but Brown was allegedly watching a Harry Potter movie at the time. Fate.

Was It Autopilot?
Tesla's critics would have you believe Autopilot is critically flawed, from hardware to software to user interface. That might be true if Tesla claimed to be selling a Driverless/Self-Driving Car, but it doesn't. Brown knew this. The same critics would also have you believe Autopilot failed to deliver on the promise of its brand name, suggesting Autopilot doesn't actually function as an "autopilot." Guess what? Planes with autopilots still have pilots. Boats with autohelms still have captains. Humans remain "in the loop" because things can go wrong even in a final version. In today's world, there is no final version. Brown knew this. An ex-Navy SEAL would be more likely than most to understand the need for a human in the loop, especially in a Beta release. To suggest Brown was a victim of aggressive marketing is to insult a man better equipped to understand such technology than 95% of Tesla owners and 99% of the journalists writing about the crash.
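For readers who want to check the closing-time figures cited earlier, the arithmetic is simple kinematics. Here's a minimal sketch in Python, assuming constant speed over the whole approach; the 1200-foot sightline is the distance discussed above, and the function name is mine, not anything from an official source:

```python
# Back-of-the-envelope check of the closing times discussed above.
# Assumes constant speed; ignores reaction time and braking.

MPH_TO_FPS = 5280 / 3600  # 1 mph = ~1.467 feet per second

def seconds_to_cover(distance_ft: float, speed_mph: float) -> float:
    """Time in seconds to travel distance_ft at a constant speed_mph."""
    return distance_ft / (speed_mph * MPH_TO_FPS)

# Time available from the 1200 ft sightline to the point of impact:
print(round(seconds_to_cover(1200, 60), 1))  # 13.6 seconds at 60 mph
print(round(seconds_to_cover(1200, 90), 1))  # 9.1 seconds at 90 mph
```

Even at the 90 mph worst case, that's roughly nine seconds of visible truck, against a quoted stopping distance of 108 feet from 60 mph.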