Hackers can fool self-driving cars into thinking obstructions such as cars, pedestrians and walls are in front of them, forcing the vehicles to take action to evade the non-existent objects.

This would involve the car slowing down or stopping to avoid a collision. It also means, of course, that automated cars may be fundamentally insecure: vulnerable to cyber criminals armed with nothing more complex than a $60 low-power laser and pulse generator, according to IEEE Spectrum.

Such a device can compromise the reliability of the laser ranging system (lidar) used by self-driving cars to map the world around them. The vulnerability was discovered by researchers at software security company Security Innovation, and will be presented at the Black Hat Europe security conference later this year.

Jonathan Petit, principal scientist at Security Innovation, explained: "I can take echoes of a fake car and put them at any location I want. And I can do the same with a pedestrian or a wall.

"It's kind of a laser pointer, really," he added of the technology needed to hack the cars. "And you don't need the pulse generator when you do the attack. You can easily do it with a Raspberry Pi or an Arduino. It's really off the shelf."
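The attack works because lidar judges distance purely by timing: it fires a laser pulse and measures how long the echo takes to return. An attacker who replays a recorded pulse after a chosen delay can therefore make a phantom object appear at any distance. The sketch below illustrates only that time-of-flight arithmetic; the function names are hypothetical and not taken from the researchers' actual tooling.

```python
# Lidar ranging: distance = (speed of light * round-trip time) / 2.
# A spoofed echo fired after a chosen delay yields a chosen fake distance.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def echo_delay_for_distance(fake_distance_m: float) -> float:
    """Round-trip delay (seconds) that makes a spoofed echo appear
    to come from an object fake_distance_m away from the sensor."""
    return 2.0 * fake_distance_m / C

def apparent_distance(delay_s: float) -> float:
    """Distance the lidar would report for an echo arriving after delay_s."""
    return C * delay_s / 2.0

# To place a phantom car 20 m ahead, the spoofed pulse must arrive
# roughly 133 nanoseconds after the lidar's outgoing pulse.
delay = echo_delay_for_distance(20.0)
print(f"{delay * 1e9:.1f} ns")               # ≈ 133.4 ns
print(f"{apparent_distance(delay):.1f} m")   # 20.0 m
```

The nanosecond-scale timing explains why such cheap hardware suffices: a microcontroller or single-board computer can trigger a laser diode with enough precision to place the fake echo where the attacker wants it.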