The technology is new, but the moral conundrum isn't: A self-driving car identifies a group of children running into the road. There is no time to stop. To swerve around them would drive the car into a speeding truck on one side or over a cliff on the other, bringing certain death to anybody inside.

To anyone pushing for an autonomous-car future, this question has become the elephant in the room, argued over incessantly by lawyers, regulators, and ethicists; it has even been the subject of a survey study published in Science. Happy to keep their names out of the life-or-death drama, most carmakers have let Google take the lead while making only passing reference to ongoing research, investigations, or discussions.

But not Mercedes-Benz. Not anymore.

The world's oldest carmaker no longer sees the problem, a modern variation on the ethical dilemma first posed in 1967 and known as the Trolley Problem, as unanswerable. Rather than tying itself into moral and ethical knots in a crisis, Mercedes-Benz simply intends to program its self-driving cars to save the people inside the car. Every time.

All of Mercedes-Benz's future Level 4 and Level 5 autonomous cars will prioritize saving the people they carry, according to Christoph von Hugo, the automaker's manager of driver assistance systems and active safety.

"If you know you can save at least one person, at least save that one. Save the one in the car," Hugo said in an interview at the Paris auto show. "If all you know for sure is that one death can be prevented, then that's your first priority."

So far, the world's transportation regulators haven't taken a position on the autonomous-car version of the Trolley Problem, but nobody doubts that lawyers will swarm in after the first handful of fatalities involving autonomous cars, regardless of whose life the car sets as its top priority.

Lawyers for pedestrians or other external victims can be expected to argue that their loved ones were killed by robots, while lawyers for dead occupants might argue that their clients, having done nothing wrong, were killed by the very machine they'd bought to protect them. The automaker is going to get sued one way or the other, much as human drivers are sued regularly today for the decisions they make. To head off some of this exposure, Mercedes-Benz's German rival Audi says it will assume full legal responsibility for any crashes or fatalities involving its first Level 3 self-driving car, next year's A8 sedan. Swedish carmaker Volvo has already said it will take the same legal position when it begins selling self-driving cars in 2020.

Safer All Around, Regardless

According to a U.S. Department of Transportation study, 94 percent of U.S. car crashes are caused by human error. Self-driving cars promise to slash motor-vehicle accident numbers; the National Highway Traffic Safety Administration reported there were 35,092 road fatalities in the United States alone in 2015. Humans have physiological barriers to unwavering concentration, while radar, lidar, sonar, stereo cameras, and a dazzling array of sensor technologies in autonomous cars don't.

As horrific as it is, at least the current road-death toll is relatively democratic, because even the wealthiest drivers lose concentration or make driver-input mistakes. However, it's true that the most expensive and newest cars typically have the best safety equipment. In their early days, autonomous cars threaten to stratify road deaths, protecting the wealthy who can afford the new cars while everybody else continues with flawed organic-piloting systems.

A study published at midyear in Science didn't clear the air, either. The majority of the 1928 people surveyed thought it would be ethically better for autonomous cars to sacrifice their occupants rather than crash into pedestrians. Yet the majority also said they wouldn't buy an autonomous car if it prioritized pedestrian safety over their own. Which would seem to cut through the issue for anyone whose goal is to sell cars.

But Hugo's position, based on years of internal study, cuts through that circular logic with a simpler premise.

"You could sacrifice the car. You could, but then the people you've saved initially, you don't know what happens to them after that in situations that are often very complex, so you save the ones you know you can save," he argued. In other words, if the car swerves to avoid kids running into the road and instead crashes into something else, it risks the lives of those in the car and cannot predict with certainty what other side effects may follow. Perhaps the car bounces off a pole and hits the kids anyway, or the pole falls over on them, or there's a secondary collision with a loaded school bus coming the other way.
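Hugo's rule of thumb can be caricatured as a tiny decision sketch: prefer the maneuver whose outcome is certain over one that gambles on unpredictable downstream effects. Every name and number below is a hypothetical illustration for the sake of the argument, not actual Mercedes-Benz logic.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    certain_saves: int      # lives saved with certainty (e.g., the occupants)
    expected_saves: float   # speculative saves whose outcome cannot be known

def choose_maneuver(options):
    # Certainty dominates: any known save outranks a speculative one.
    # Expected saves only break ties between equally certain options.
    return max(options, key=lambda m: (m.certain_saves, m.expected_saves))

options = [
    Maneuver("brake in lane", certain_saves=1, expected_saves=0.0),
    Maneuver("swerve off road", certain_saves=0, expected_saves=1.5),
]
print(choose_maneuver(options).name)  # prints "brake in lane"
```

The point of the sketch is the ordering: the uncertain maneuver's higher expected value never outweighs the certain save, which is exactly the asymmetry Hugo describes.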

The principle that drivers must stand ready to take control in today's versions of autonomous driving—such as Tesla's Autopilot or Mercedes-Benz's own Drive Pilot—leaves such decisions, at least in theory, to the operator. But Drive Pilot is regarded as a Level 2 system. Hugo was talking about more advanced Level 4 and 5 systems that would have to make such choices without human intervention. And parent company Daimler is clearly engineering its systems to prioritize keeping the owners and occupants of its cars alive.

It has almost finished a $225 million upgrade for its 1235-acre test facility in Immendingen, near the Swiss border, where it has added 300 jobs just in autonomous-vehicle testing.

After sorting through thousands of terabytes of real-world test data and multiples of that in simulations, Hugo said he is convinced that answers to the Trolley Problem will become less important as the advantages of ever-alert autonomous cars become apparent.

"We believe this ethical question won't be as relevant as people believe today. It will occur much less often," Hugo said. "There are situations that today's driver can't handle, that—from the physical standpoint—we can't prevent today and automated vehicles can't prevent, either. [The self-driving car] will just be far better than the average [human] driver.

"This moral question of whom to save: 99 percent of our engineering work is to prevent these situations from happening at all. We are working so our cars don't drive into situations where that could happen and [will] drive away from potential situations where those decisions have to be made."

Even so, Hugo insists the technological ability of the first-generation Level 4 autonomous vehicle won't be as important as the way people interact with it.

"With the virtual phase, it's not just about the technology and no malfunctions, but the people and how they do things and react."

Hugo said Daimler has more than 1000 people test-driving its simulators, exploring various aspects of highly autonomous driving. For instance, the company is making sure the testers understand how to switch the cars into and out of self-driving mode and determining whether there's any confusion about what the car will do for itself and when the driver has responsibility. Asked how many miles the test subjects have logged, he said that is irrelevant.

"It's not about miles," he said. "It's about situations, and there are an infinite number of them."
