The problem affects the image recognition systems used by most self-driving cars, as explained in a research paper titled “Robust Physical-World Attacks on Machine Learning Models.”

“Given these real world challenges, an attacker should be able to account for the above changes in physical conditions while computing perturbations, in order to successfully physically attack existing road sign classifiers. In our evaluation methodology, we focus on three major components that impact how a road sign is classified by, say, a self-driving car,” reads the paper.

The experts demonstrated several tricks to interfere with the mechanisms modern self-driving cars use to read and classify road signs, using just a color printer and a camera.

In the Camouflage Graffiti Attack, the experts simply added stickers with the words “Love” and “Hate” onto a STOP sign. The autonomous car’s image-detection algorithms were unable to correctly read the sign and interpreted it as a Speed Limit 45 sign in 100 percent of test cases.

A similar camouflage attack was tested on a RIGHT TURN sign, which the cars wrongly classified as a STOP sign in 66 percent of cases.
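The “perturbations” the paper’s authors refer to are typically computed by following the gradient of the classifier’s loss with respect to the input image, then nudging the pixels in the direction that increases the score of a wrong class. The sketch below illustrates that core idea on a hypothetical toy linear classifier; it is not the paper’s actual attack, which additionally optimizes the perturbation to survive changes in distance, angle, and lighting:

```python
import numpy as np

# Toy "road sign classifier": a linear model over flattened image pixels.
# The weights and the 16-pixel "image" are illustrative stand-ins only.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 16))   # 2 classes: row 0 = STOP, row 1 = SPEED LIMIT 45
x = rng.normal(size=16)        # a "STOP sign" image

def predict(x):
    return int(np.argmax(W @ x))

# Ensure the clean input starts out correctly classified as class 0 (STOP);
# swap the weight rows if the random draw went the other way.
if predict(x) != 0:
    W = W[::-1].copy()

# FGSM-style step: for a linear model, the gradient of
# (score_wrong - score_right) with respect to the input is W[1] - W[0].
grad = W[1] - W[0]
margin = (W[0] - W[1]) @ x                   # how strongly the model prefers STOP
eps = 1.1 * margin / np.abs(grad).sum()      # just enough to cross the boundary
x_adv = x + eps * np.sign(grad)              # perturbed "sticker-covered" sign

print("clean prediction:", predict(x))       # class 0 (STOP)
print("adversarial prediction:", predict(x_adv))  # class 1 (SPEED LIMIT 45)
```

Because the step size is chosen from the model’s own decision margin, the small per-pixel change is guaranteed to flip this toy classifier; real attacks like the one in the paper must instead average the gradient over many physical viewing conditions.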