The family of a man killed when his Tesla hit a truck while under the control of its autopilot function may have legal grounds to sue the company, legal experts say.
Joshua Brown died on 7 May after his Model S hit the side of a truck while in so-called “autopilot” mode, in the first known death involving a self-driving car.

Tesla’s system is not in fact fully autonomous – it is described in warning notices as “traffic-aware cruise control”, and it reminds drivers to keep their hands on the steering wheel at all times. Despite these warnings, Tesla has said that there was no reaction by either the software or the driver before the crash, and the driver of the truck has claimed he heard a Harry Potter movie playing in the car after the crash.

The Florida sheriff’s department confirmed that a portable DVD player was found in the car, but it is not known for sure whether Brown was watching a film at the time of the crash, which is still under investigation by the US National Highway Traffic Safety Administration.

In a blogpost about the accident, Tesla was careful to outline the safety warnings that are given before its autopilot feature can be used. “Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled,” it said.

Yet legal experts have told the Guardian that Brown’s family could have a legal case based on the fact that, whatever warnings may have been offered, the driver may have been led to believe the system was more capable than it was.

According to Anthony Johnson, an attorney and the CEO of the American Injury Attorney Group, Brown’s family “absolutely” have a product liability case against Tesla, based on the question of whether Brown was adequately warned about the potential defects in the autopilot system.

“I anticipate that there will be strong arguments that the warnings given by Tesla are far from sufficient to exculpate them from liability,” he said.

“The term ‘autopilot’ has been used for decades and is understood by the masses as a situation whereby the machine (typically airplanes until recently) pilots the vehicle for the operator,” Johnson said. “You can’t sell something at the grocery store that looks like a tomato and is labeled ‘tomato’ and place in the fine print that it’s actually a grape.”

Not all legal experts agree with Johnson. “If there was a sensor, if they had the safety features in place and [Brown] ignored them, it would make a huge difference in being able to pursue a case, because at that point they’re shifting the burden from themselves to the user,” said Farid Yaghoubtil, a personal injury attorney in Los Angeles.

Asked if Joshua Brown’s family were planning to sue Tesla, his father Warren declined to comment.

Tesla’s blogpost was at pains to point out that the car reminds the driver to “always keep your hands on the wheel”, and says that the system “makes frequent checks to ensure that the driver’s hands remain on the wheel”, issuing visual alerts and slowing the vehicle until hands are detected.
But those warnings were at odds with the impression given by Tesla’s founder, Elon Musk, who has said that Tesla’s autopilot is “probably better than humans at this point in highway driving” – part of an attitude which experts have said may have masked the risks of relying too heavily on the system.

There is also a huge difference between legal jurisdictions, according to Bryant Walker Smith, an assistant professor of law and engineering at the University of South Carolina. “In one state somebody who is themselves careless but injured might receive nothing, where in another someone might recover nearly all their damages,” he said.

Smith said that Tesla would be sure to point to the driver’s actions in any legal case. “They could say he accepted the risk, assumed the risk and therefore is wholly or at least partly culpable,” he said. “They would ... point to the warnings they gave on prior use, point to the fact that this was a beta that required supervision, argue that the acts of the driver [were] the predominant cause of the crash.”

At that point, he said, the case would come down to “reasonableness – was Tesla reasonable, was its design reasonable, was the driver reasonable? And the jury ... would balance these and come back with a verdict.”