Tesla is disputing the owner's account of the incident, citing detailed diagnostic logs that show the car's accelerator pedal suddenly being pressed to the floor in the moments before the collision.
"Consistent with the driver's actions, the vehicle applied torque and accelerated as instructed," Tesla said in a press statement.
At no time did the driver have Tesla's Autopilot or cruise control engaged, according to Tesla, which means the car was under manual control — in the company's telling, only the human behind the wheel could have caused the crash. The car uses multiple sensors to double-check a driver's accelerator commands.

The Model X owner appears to be standing by his story, but here's the broader takeaway: Cars have reached a level of sophistication at which they can tattle on their own owners, simply by handing over the secrets embedded in the data they already collect about their drivers.
Your driving data is extremely powerful: It can tell your mechanic exactly what parts need work. It offers hints about your commute and your lifestyle. And it can help keep you safe, when combined with features such as automatic lane-keeping and crash avoidance systems.
But the potential dark side is that the data can be abused. A rogue insurance company might mine it to justify raising your premiums. It may give automakers an incentive to claim that you, the owner, were at fault for a crash even if you think you weren't. To be clear, that isn't necessarily what's going on with Tesla's Model X owner. But the case offers a window into the kinds of issues that drivers will increasingly face as their vehicles become smarter.