Photo: Connecticut State Police

Just in case you needed a little reminder as to whether you live in a fictional 2019 where we have fully autonomous, self-driving vehicles or the actual 2019 where we have, at best, partially self-driving vehicles that require constant driver attention, this should be a good one: over the weekend, a Tesla Model 3 with Autopilot engaged crashed into the back of both a police car and another vehicle. That's because the 2019 we all live in is not one where we have truly self-driving vehicles on the road. Sorry.


The wreck happened early on Saturday, December 7, near Norwalk, Connecticut. The Tesla Model 3, with a license plate that helpfully reads “MODEL3,” just so you really know what car it’s bolted to, had its Autopilot system engaged at the time of the crash.


Despite what many, many people seem to think and some frankly confusing terminology on Tesla’s website and marketing materials, Autopilot is not a fully self-driving system.

As Tesla likes to remind us every time this happens, this is how they describe the system’s use themselves:

“Before enabling Autopilot, the driver first needs to agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle”. Subsequently, every time the driver engages Autopilot, they are shown a visual reminder to “keep your hands on the wheel”... Autopilot is intended for use with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time.”

Requiring the driver to be ready to take control at a moment's notice is a defining characteristic of an SAE Level 2 driving automation system, which is not autonomous at all, but really more of an advanced driver assist.

That means doing things like "checking on [your] dog" in the back seat while the car is driving at highway speeds is an absolutely idiotic idea. It's also exactly what this driver did, which is why they weren't able to take control of the car when it became clear it was about to run into the stopped police car and the disabled car in front of it.


Keep in mind, this was not a situation that would have confused a human driver: the cars were stopped, with their hazard lights on, along with flares set to warn drivers that the cars were there, immobile. These were hardly hidden cars, and not unusual circumstances in the slightest.

The driver was charged with Reckless Driving and Reckless Endangerment because if you’re driving a car, you need to be paying attention, dummy, even if your love for Elon Musk and Tesla is so powerful and real that you can just feel it, deep inside you, where music is born.


If you still think that Tesla's Autopilot system is close enough to being fully self-driving, let's try an analogy: if you had a chauffeur who was an excellent driver 80 percent of the time but, just so you know, was also narcoleptic and could fall dead asleep without warning at any moment, would you be okay with being driven around by them? I'm not so sure I would.

That’s what’s going on here. Autopilot has no graceful fail-over; if something fails to work like it should, for any number of reasons, it needs a human at the wheel paying attention to take immediate control. The system may not even be aware there’s a problem until way too late—that’s why it needs your practiced, moist, human driver’s eyes watching as well.


This particular wreck seems to share some similarities with a known Tesla Autopilot issue involving stationary cars.

Remember, it’s a driving assist system. It’s not self-driving. So pull over to check on your dogs.


If you stop, it’ll be much easier to ask who’s a good boy/girl, too, and you need that information.