This is the latest addition to a growing body of research showing how machine-learning systems can be attacked and fooled in life-threatening situations.

In an 18-month-long research process, Trivedi and Povolny replicated and expanded upon a host of adversarial machine-learning attacks including a study from UC Berkeley professor Dawn Song that used stickers to trick a self-driving car into believing a stop sign was a 45-mile-per-hour speed limit sign. Last year, hackers tricked a Tesla into veering into the wrong lane in traffic by placing stickers on the road in an adversarial attack meant to manipulate the car’s machine-learning algorithms.
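Sticker attacks like these are instances of adversarial perturbations: small, deliberately chosen changes to an input that flip a model's prediction. As a rough illustration of the idea only, here is a minimal sketch against a hypothetical toy linear classifier, using a gradient-sign perturbation in the spirit of the fast gradient sign method. The weights, inputs, and class labels are all invented for the example; this is not the researchers' actual method or the models used in any of the cars.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "sign classifier": logistic regression with fixed weights.
# Class 1 might stand in for "speed limit 85", class 0 for "speed limit 35".
w = rng.normal(size=64)   # invented weight vector
b = 0.0

def predict(x):
    """Return the model's probability that x belongs to class 1."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# A clean input the model confidently assigns to class 0.
x = -0.5 * np.sign(w) + 0.01 * rng.normal(size=64)

# Gradient-sign perturbation: nudge every input component by epsilon in the
# direction that raises the class-1 score -- loosely analogous to a small,
# carefully placed sticker that pushes the classifier toward the wrong label.
epsilon = 0.6
gradient = w * predict(x) * (1.0 - predict(x))  # d(prob)/dx for this model
x_adv = x + epsilon * np.sign(gradient)

print(predict(x))      # near 0: classified as class 0
print(predict(x_adv))  # near 1: the small perturbation flips the label
```

The perturbation is bounded per component (at most `epsilon`), which is what makes real-world versions of such attacks so insidious: the change can be small enough that a human glancing at the sign sees nothing wrong.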

“Why we’re studying this in advance is because you have intelligent systems that at some point in the future are going to be doing tasks that are now handled by humans,” Povolny said. “If we are not very prescient about what the attacks are and very careful about how the systems are designed, you then have a rolling fleet of interconnected computers which are one of the most impactful and enticing attack surfaces out there.”

As autonomous systems proliferate, the issue extends to machine-learning algorithms far beyond vehicles: a March 2019 study showed that medical machine-learning systems can be fooled into giving bad diagnoses.

The McAfee research was disclosed to both Tesla and Mobileye last year. Tesla did not respond to a request for comment from MIT Technology Review, but it had earlier acknowledged McAfee’s findings and said the issues would not be fixed in that generation of hardware. A Mobileye spokesperson downplayed the research by suggesting that the modified sign would fool even a human into reading 85 instead of 35. The company said it doesn’t consider tricking the camera to be an attack, and that despite the camera’s role in Tesla’s cruise control, it was never designed for autonomous driving.

“Autonomous vehicle technology will not rely on sensing alone, but will also be supported by various other technologies and data, such as crowdsourced mapping, to ensure the reliability of the information received from the camera sensors and offer more robust redundancies and safety,” the Mobileye spokesperson said in a statement.

Tesla has since moved to proprietary cameras on newer models, and Mobileye has released several new versions of its cameras that in preliminary testing were not susceptible to this exact attack.

Still, a sizable number of Tesla cars are operating with the vulnerable hardware, Povolny said, and he pointed out that Teslas with the first version of the hardware cannot be upgraded to newer hardware.

“What we’re trying to do is we're really trying to raise awareness for both consumers and vendors of the types of flaws that are possible,” Povolny said. “We are not trying to spread fear and say that if you drive this car, it will accelerate through a barrier, or to sensationalize it.”