Joshua Brown was reportedly watching a Harry Potter film when the "autopilot" function of his Tesla Model S drove the car into a truck, killing him instantly. This incident, believed to be the first fatality involving self-driving car technology, doesn't prove autonomous vehicles are inherently unsafe. But it does serve as a reminder that the technology isn't yet at the stage where we can take our eyes off the road.

Prior to Tesla's crash a few weeks ago, I interviewed 12 experts from across the driverless car world as part of my PhD research project looking at issues of trust in automated vehicles. The experts included academics and industry professionals, all with vast experience in automated vehicle technology and design. They said, as research suggests, that until the autopilot can fully take over, we need to treat self-driving cars with a certain amount of distrust; otherwise we could be putting ourselves in serious danger. Some felt the issue was so serious that drivers shouldn't be allowed to use this technology without specific training.

Joshua Brown's death is believed to be the first behind the wheel of a car in self-driving mode. Credit: AP

Moving up the gears

A future where humans have no interaction with a car's driving system (known as level 5 automation) may be on the horizon, but it is still some way off. To get there, we need to pass through several transitional stages of technology (levels 2, 3 and 4) that can take increasing but not complete control of a vehicle. Tesla's autopilot, for example, is typically seen as level 2 to 3 automation: it can take control of at least two functions, such as steering and acceleration, but still needs the driver to monitor the car and its environment.