Self-driving cars, like many other pieces of tech, first appeared in science-fiction movies. They soon gained real-world attention, and many tech giants started working on them. Elon Musk, the face of Tesla, claimed Tesla would have fully autonomous cars on the road by the end of 2020. But how far are we really from a driverless future?

Prepare to be disappointed. There is no denying that we have come a long way in the field of self-driving cars, with companies like Waymo launching driverless taxis and Tesla shipping autonomous features such as automatic lane changing. But completely automated cars by next year is just too good to be true.

The problem with full automation is not technical; the technology here is brilliant and will only get better with time. The problem is philosophical and rooted in societal values. Take the ‘trolley problem’, for example: a famous thought experiment in which taking no action leads to the death of certain people, while taking action leads to the death of certain others. Given the choice, most people would pick the option that kills fewer people.

Now imagine you are in a driverless taxi, and a few kids absorbed in a game on their phones are standing in front of the vehicle. The car has to make a decision: swerve left into oncoming traffic, which will probably kill you, or go straight ahead and hit the children. If you were driving, your basic human instincts might kick in and you might sacrifice yourself to save the kids. The car will likely do the opposite, running over the children to protect you, because no car company will ever sell a car that cannot ensure its owner's safety.

A model like that will not be accepted by most countries. The problem? Philosophical. What AI lacks is randomness and basic human instinct. Until we can provide the AI with those instincts, a fully automated car will not be pulling up at your door.