There are several different approaches to developing self-driving cars. Some companies, like Tesla with its Autopilot, are gradually releasing increasingly advanced autonomous and semi-autonomous features on the path to a fully autonomous system, while others, like Google, aim to release a system only once the technology is ready for a fully self-driving (level 4) car.

Following a few accidents in recent weeks, Tesla has received criticism over its approach of releasing semi-autonomous features. One of the most severe criticisms came from renowned scientist Andrew Ng, who said it was plainly “irresponsible” for Tesla to ship the Autopilot.

Ng is now Baidu’s chief scientist, working on research in deep learning and scalable approaches to big data and AI. He works out of Baidu’s R&D center in Silicon Valley, where the company also develops its own self-driving car technology.

He is best known as a co-founder of Google’s deep learning project and of the online learning startup Coursera. He is also a professor in Stanford’s computer science and electrical engineering departments.

Last week, we reported on an accident in Switzerland in which a Tesla Model S owner crashed into a van while using his car’s active cruise control feature, which is part of Tesla’s Autopilot system. Fortunately, no one was injured in the collision. While it’s clear that the Autopilot wasn’t used as recommended by Tesla in this particular situation, it’s not as obvious why the Automatic Emergency Braking was not activated.

Ng commented on the accident via his Twitter account:

It's irresponsible to ship driving system that works 1,000 times and lulls false sense of safety, then… BAM! https://t.co/cbmc8onoKu — Andrew Ng (@AndrewYNg) May 27, 2016

By “false sense of safety”, Ng is referring to the driver’s explanation of why he didn’t apply the brakes himself even though he saw the stalled vehicle in front of him:

“Yes, I could have reacted sooner, but when the car slows down correctly 1’000 times, you trust it to do it the next time to. My bad..” – the driver said when publishing a short video of the accident.

As we discussed when we first reported on the accident, Tesla tells drivers to always monitor the vehicle when on Autopilot and to be ready to take over the controls at all times. In its owner’s manual, the automaker even describes a situation very similar to what the driver in Switzerland experienced and asks drivers to “prepare to take immediate corrective action”.

It’s still difficult to agree with Ng’s claim that Tesla is creating a “false sense of safety” when the company asks drivers to “prepare to take immediate corrective action” and even recommends they keep their hands on the wheel – those warnings don’t do much to create a sense of safety. Nor does it seem fair for Ng to ask Tesla not to ship the Autopilot, which is extremely useful to tens of thousands of Tesla owners, just because a few of them are abusing the system.

Featured Image: A member of the media test drives a Tesla Motors Inc. Model S car equipped with Autopilot in Palo Alto, California, U.S., on Wednesday, Oct. 14, 2015. Photographer: David Paul Morris/Bloomberg via Getty Images

