The driver of a Tesla Model S died in early May when both he and his car's semi-autonomous Autopilot system failed to see a tractor-trailer crossing in front of the car on a highway in Florida.

While the crash seemed like a perfect storm of real-world chance and automotive high tech, it turns out that more than an unlikely blend of coincidences can cause Tesla's Autopilot to miss an obstacle.

Researchers at the University of South Carolina and China's Zhejiang University, working with the Chinese security firm Qihoo 360, say they've both easily blinded Autopilot and caused it to see phantom obstacles.

When they take the stage later this week at the Defcon hacker conference, the researchers will detail how they used relatively accessible (but expensive) devices to deceive Tesla’s Autopilot sensors, according to Wired.

Intriguingly, the white-hat researchers didn't actually have to hack the car. They simply jammed the stationary test car's front-mounted radar, ultrasonic sensors and cameras by exposing them to various machines that emitted light, radio and sound.

As you can see in the video above, the moment the radio-jamming device is switched on, Autopilot suddenly and without warning loses track of the car ahead.

What's more, the researchers produced a similar blinding effect by draping a car in sound-dampening acoustic foam, hiding it from the car's ultrasonic sensors at a tiny fraction of the cost of devices carrying five- and six-figure price tags.

Granted, the machines the researchers employed are expensive, and wrapping something in acoustic foam would be obvious, so would-be hackers are unlikely to use these methods to impair a single car on the road. Still, the tests prove that Tesla's Autopilot system can be fooled in more ways than initially anticipated.
