Cars that can drive themselves risk lulling people in the driver's seat into a false sense of security, and even to sleep, scientists warn.

But they believe providing distractions that are currently illegal, such as reading while at the wheel, may keep 'drivers' of autonomous vehicles alert enough to take control if needed.

Automakers are racing to bring the first self-driving cars to market, but they are divided about how much 'driving' humans should be doing in the future.


In one experiment, researchers put Stanford University students in a simulated self-driving car to study how they reacted when their robo-chauffeur needed help.

Among the 48 students, 13 who were instructed to monitor the car and road from the driver's seat began to nod off.

Yet only three did so when told to focus on a screen full of words or moving images.

Alertness was particularly helpful when students needed to grab the wheel because a car or pedestrian got in the way in the simulation.

The experiment is one of a growing number that assess how cars can safely hand control back to a person when their self-driving software and sensors are overwhelmed.

With some models already able to stay in their lane or keep a safe distance from other traffic, and automakers pushing for more automation, the car-to-driver handoff is an open question.



So the elimination of distracted driving is a major selling point for the technology.

While there's no consensus on the right car-to-driver 'handoff' approach, the Stanford experiment showed reading or watching a film helped keep participants awake.

Self-driving car experts at Google, which is pursuing the technology more aggressively than any automaker, believe involving humans would make its cars less safe.

Google's solution is a prototype with no steering wheel or pedals. Instead, human control would be limited to go and stop buttons.

Meanwhile, traditional automakers are gradually introducing the technology.

Mercedes and Toyota already sell cars that can hit the brakes and stay in their lane, for example, and fully self-driving cars could become a reality within a decade.

It could be hard for pedestrians to read the intentions of autonomous cars, but in a bid to solve the problem before it occurs, Google has patented the idea for a car that 'talks' to pedestrians. A drawing from the patent shows a vehicle flashing a sign that says 'safe to cross'

One potential hazard of this gradualist approach became clear this autumn when Tesla Motors had to explain that its 'auto pilot' feature did not mean drivers could stop paying attention.

Several videos posted online showed people recording the novelty, before seizing the wheel when the car made a startling move.

Next year Cadillac will introduce a 'super cruise' system in its CTS model. This will monitor whether a driver's eyes have wandered from the road and if they don't respond, the vehicle will slow down on its own.

'We are in no way selling this as a technology where the driver can check out,' General Motors spokesman Dan Flores said.

'You can relax, glance away, but you still have to be aware because you know the technology's not fool-proof.'

SELF-DRIVING CARS ARE FIVE TIMES MORE LIKELY TO CRASH

Self-driving cars are more accident-prone than ordinary cars, a new study has claimed. Researchers found self-driving vehicles had 9.1 crashes per million miles travelled, compared with just 1.9 for those with a human operator. However, the report found that none of the accidents were the fault of the self-driving car.

Self-driving cars were rear-ended 50 per cent more often than traditional vehicles, according to the University of Michigan's Transportation Research Institute, which concluded 'self-driving vehicles were not at fault in any crashes they were involved in.'

Researchers analysed the cumulative on-road safety record of self-driving vehicles for three of the ten companies currently approved for such vehicle testing in California - Google, Delphi and Audi. They then compared the safety record of these vehicles with the safety record of all conventional vehicles in the US for 2013.

Overall, they said 'the distance accumulated by self-driving vehicles is still relatively low, about 1.2 million miles, compared with about 3 trillion annual miles in the US by conventional vehicles.' They also pointed out that tests have been confined to 'safe' areas, so 'self-driving vehicles were thus far driven only in limited (and generally less demanding) conditions (e.g., avoiding snowy areas).'

The study's authors, Brandon Schoettle and Michael Sivak, concluded: 'We currently cannot rule out, with a reasonable level of confidence, the possibility that the actual rate for self-driving vehicles is lower than for conventional vehicles.

'The current best estimate is that self-driving vehicles have a higher crash rate per million miles travelled than conventional vehicles, and similar patterns were evident for injuries per million miles travelled and for injuries per crash.'

Though research is ongoing, it appears people need at least five seconds to take over – unless they have fallen completely asleep.

Automakers face a real challenge: getting drivers to accept and trust autonomous technology without being lulled into a false sense of security that makes them slow to react when the car needs them.

In a bid to keep people alert at the wheel, cars will have to appeal to several senses and visual warnings alone may not be enough to attract the attention of a distracted driver relying on an autonomous function.


For example, flashing lights and spoken instructions or strong vibrations could be used to get people to take control of a vehicle swiftly.

Greg Fitch, a research scientist at the Virginia Tech Transportation Institute, said: 'If it [a warning] is done courteously and subtly and not annoying, it could be missed by someone that is distracted.'

He believes manufacturers will have to be careful not to overload drivers with warnings, and to avoid 'mode confusion' by making it clear when the driver, and when the car, is in control.

While cars equipped with sensors may respond more quickly than humans, people are better at making decisions in uncertain circumstances.

One lesson from the Stanford study may be that master and machine are better viewed as collaborators.

'There's really a relationship between drivers and cars,' said David Sirkin, who helped run the experiment at Stanford's Centre for Design Research, 'and that relationship is becoming more a peer relationship.'