Several weeks ago, my car was driving itself home on a Friday night when I abruptly had to take over, slam the brakes, and veer into another lane, clipping a nearly stopped car in the process.

Previous Experience with AutoPilot

I have been in situations where the car accelerated unexpectedly, others where it didn't recognize someone merging into my lane, and still others where it didn't recognize the lanes well at all. Tesla AutoPilot is awesome and getting better with every update, but I know it can't be trusted completely, and I am very aware when I am using it. Part of the problem, though, is how much there is to be aware of.

You have all your standard inputs to be aware of (the windows, mirrors, and speedometer), combined with the dashboard display, which adds a host of new information: the lane edges the car senses (with different colors and markings, each indicating different information), the cars it is tracking in every lane (also with their own colors and meanings), and proximity readings at every angle.

Example display while autopilot is engaged.

Prelude

I was traveling in the HOV lane, which was moving a decent amount faster than the primary lanes. It was about 6pm Pacific time and the light was approaching dusk levels.

At some point I noticed a motorcyclist lane splitting behind me on my right. It was easy to recognize that scenario as an edge case that may not have been well accounted for, so I made sure to keep my hands near or touching the wheel at all times, ready to take over in case the car made any sudden movements as the motorcyclist passed me.

It didn’t.

It worked fine, and I stared at the dashboard display to see how the car recognized the motorcyclist. The motorcycle was represented on the display as being at the far right of my lane. It was white, indicating the software was tracking it as my lead vehicle, and there were no other cars on the display (that I recall). I then noticed my car accelerating as it started to keep pace with the motorcycle.

The accident

I looked up and saw the car in front of me nearly stopped while I was still accelerating. I had clearly looked at the display for a second too long, and the autopilot software didn't account for the stopped vehicle in front of me at all.

I slammed on the brakes, recognized the imminent impact, and swerved into the right lane, only clipping the car in front. Thankfully the other lane was clear. There was minimal damage and there were no injuries, which is just incredible for highway speeds and the amount of traffic on the road.

I can't recall exactly what speed I was going because the car was regulating it for me and had been accelerating, but I would place the original speed at around 50mph. The car handled very well for the rate of deceleration: the ABS was nearly imperceptible and the car stopped almost completely in time.

Hindsight and Analysis

I don't blame the car. This is early, opt-in software that is trying to take control of a massive real-world responsibility. I spent too long (seconds, or fractions thereof, mind you) looking in my rear-view mirrors and at the dashboard display, worried about the motorcyclist, and I recognize that. Thankfully nothing horrible happened.

I admire Tesla's ambition and expect the technology to continue to improve. Everything happened very quickly, and it was an unlucky coincidence that didn't go nearly as badly as it could have. I recognized a situation where I couldn't trust the car's understanding of the road, and, in an unfortunate coincidence, that realization occurred at the exact moment I needed to have already been aware of it. Had I known about the lane-splitting-motorcyclist problem beforehand, I would have been less focused on what the car thought of the motorcyclist and more focused on everything the car wasn't noticing.

Once I recognized the car was stopped in front of me, I explicitly remember panicking with the following thoughts going through my head: "Does my car see this? Is it going to do anything? NO. NO IT ISN'T. EMERGENCY." In retrospect, the action I needed to take was obvious: I should have regained control immediately. That half second or more probably would have made a lot of difference; the problem is that my brain wasn't primed to have that conversation with itself. Now it is. I'm not looking forward to the comments calling me stupid for not doing this automatically, but I felt it's an important topic to be open about. I'd wager we've all had a time in our lives when we didn't know the extent of some technology, trusted it too far, and had to recalibrate after we understood the limits. Now we might just have to be a little bit luckier to get to that recalibration stage.

It is scary, though, to see how one motorcyclist can blind the car so thoroughly…

Oh well. It’s a fun time to be alive as long as the robots don’t kill us first.