BMW’s past promises include a pledge to help keep drivers driving in the brave new world of autonomous vehicles. However, it hasn’t entirely sworn off self-driving technology. The company finds itself in a tricky spot, as it’s seen as both a luxury automaker and a performance brand. But it can’t claim to be “The Ultimate Driving Machine” if it doesn’t allow customers to drive.

Automakers and tech firms pushed relentlessly for autonomous driving, claiming a self-driving nirvana was just around the corner. But the technology proved less than perfect in practice, and modern autonomous vehicles require constant human involvement to operate safely, just like any normal car. Despite making strides, the industry seems torn on how to satisfy regulators, investors, and the driving public all at once.

The government is even more in the dark. While lawmakers initially echoed industry rhetoric (that autonomy will save lives and usher in a new era of mobility), recent events have sparked skepticism. Few new regulations are appearing in the United States, but there also isn’t any clear legislation to decide who’s held liable when the cars malfunction. A lot of what-if questions remain unanswered.

BMW thinks this will be the main reason why autonomous cars fail.

It’s surprising to hear an automaker say this. The industry seems hell-bent on ramming this technology down our collective throats, consequences be damned. But Ian Robertson, BMW’s special representative in the United Kingdom, says that government regulations will probably stop autonomous features before they can become normalized.

“I think governments will actually say ‘okay, autonomous can go this far,'” he told AutoExpress. “It won’t be too long before government says, or regulators say, that in all circumstances it will not be allowed.”

Robertson said programming a car to make decisions between one life and another is extremely difficult and involves too many moral implications. “Even though the car is more than capable of taking an algorithm to make the choice, I don’t think we’re ever going to be faced where a car will make the choice between that death and another death.”

Meanwhile, Mercedes-Benz says it will always have its autonomous vehicles prioritize the life of the driver in the event of a crash. It’s an interesting problem; one the Massachusetts Institute of Technology has been working on by allowing people to take an ethics-based quiz that forces decisions in a no-win scenario. The test, called Moral Machine, collects data on how people feel autonomous development should progress. It also reveals the problems associated with giving a self-driving car a difficult decision when it comes to who lives and who dies.

BMW isn’t leading the charge on autonomous development, though it does operate several fleets of self-driving vehicles and is actively developing the technology. But Robertson knows it can’t make its way to market until it’s objectively error-free.

“…the technology is not mature right now,” he said. “The measure of success is how many times the engineer has to get involved. And we’re currently sitting at around three times [every 1,000 km].” While Robertson admitted that sounded promising, he said it was still unacceptable. “It has to be perfect,” he concluded.

Reaching perfection takes time and, even though semi-autonomous systems (like Tesla’s Autopilot) proved impressive, fatal crashes involving that system heightened scrutiny and fueled skepticism. Self-driving cars have to operate virtually error-free to gain public acceptance. Pulling that off requires more work and maybe even a complete redesign of our transportation infrastructure, as well as the rules that govern it.

[Image: BMW]