Autonomous vehicles - cars that can effectively drive themselves via an “autopilot” mode - are being trialled across the Hexagon, after the government authorised such tests in August 2016.

Early last year, French manufacturer PSA was among the first to begin trials on public Paris motorways as part of the “Autonomous Vehicle for All (AVA)” project.

But while PSA invited drivers to use the cars as human “backups” - normal drivers who can operate the car if needed - other companies including parts manufacturer Delphi and transport company Transdev are aiming to trial vehicles without any human drivers at all.

This year, Transdev is set to begin operating self-driving Renault Zoe taxis in Rouen (Normandy), and a self-driving shuttle van in the university district of Paris-Saclay. The vehicles can still be controlled by a human in a central dispatch centre if needed, but are otherwise completely self-driving.

Yet, La Sécurité Routière has said that should the vehicles become legal and more common, the cars themselves should be forced to pass a proper driving test before being allowed to take to the roads.

An agency spokesperson said: “We expect human drivers to learn and submit to the rules of the road. Why not require the same of cars?”

In the proposed test, the cars would be set to “autopilot” mode and be required to perform a normal driving examination, in the same way as if a human were driving. Manoeuvres, different driving speeds, parking, and navigation would all be under scrutiny.

In vehicles that allow it, a human being would also be behind the wheel, but would be under strict instructions not to interfere with the test. In vehicles without human backup, the examiner would be the only human in the car.

Should the human (either the driver or the examiner) need to intervene - in order to correct a manoeuvre or avoid an accident, for example - the autonomous car would automatically fail the test.

According to the proposals, should the car pass, it would receive its own driving licence (in the same way as a human) and be required to display it at all times.*

Supporters of self-driving cars say that they actually prevent accidents, as they are safer than a driver who is distracted, tired, angry or drunk. Figures from professional services firm KPMG have suggested that self-driving cars could prevent up to 150 deaths a year by 2020.

France suffered 3,693 road deaths in 2017.

And yet, self-driving cars have come in for criticism recently.

Just this month in California, a Tesla set to autopilot mode crashed and killed its driver when it failed to spot an oncoming lorry. Records show that despite the car sending several visual and audible warnings ahead of the collision, the driver took no action to avoid the crash.

Similarly, in Arizona, Governor Doug Ducey has suspended all testing of self-driving cars by the taxi app service Uber, after one of its vehicles hit and killed a pedestrian who was wheeling her bicycle across a dark road.

The car, which was in autonomous mode, had a backup driver inside - trained self-driving car operator Rafaela Vasquez - but questions have emerged about whether she was paying enough attention before the crash, and whether the car’s sensors were working correctly.

Yet, Tesla has defended its safety record, saying that its Autopilot mode is, on average, far safer than a human driver and, if adopted worldwide, could save up to 900,000 lives a year.

*While most of this story is accurate, the part about La Sécurité Routière requiring cars to pass their own driving test is, in fact, an April Fool. Happy Easter!