After announcing that all future Teslas would ship with the hardware necessary for full Level 5 autonomous driving, CEO Elon Musk took questions. One of the early ones was straightforward: if an autonomous car crashes, will Tesla be liable? Specifically, would it “offer indemnity” to customers, as Volvo plans to?

Musk’s answer was essentially no, unless the software itself was directly at fault. And then he decided it was a good time to point out something else: anything that keeps us from adopting driverless cars is, in effect, killing people. And so, negative stories about Tesla Autopilot crashes make him pretty angry. Here’s his full quote:

No I think it would be up to the individual’s insurance. … If it’s something endemic to our design, certainly we would take responsibility for that. Once you view autonomous cars sort of like an elevator in a building, does Otis take responsibility for all elevators around the world? No, they don’t. What really matters here at the end of the day is “what is the absolute safety.” One of the things I should mention that frankly has been quite disturbing to me is the degree of media coverage of Autopilot crashes, which are basically almost none relative to the paucity of media coverage of the 1.2 million people that die every year in manual crashes. [It is] something that I think does not reflect well upon the media. It really doesn’t. Because, and really you need to think carefully about this, because if, in writing some article that’s negative, you effectively dissuade people from using an autonomous vehicle, you’re killing people. Next question.

Now, I would argue that reporting on such crashes (and deaths) is important precisely because they represent a new cause of death from a new technology — even if that technology is safer than a human driver (or has the potential to be).

Also, autonomous cars probably ought to be proven safer before Musk’s argument is given real weight. And hey, Musk has a plan for that, too: Shadow Mode, in which Tesla will watch how you drive and judge whether the car itself would have driven more safely. I won’t get into the ethics of all that too deeply here; I’ll only point out that Musk’s argument does have some merit.

And speaking of complex ethics: Musk’s answer about indemnity bears a little more scrutiny. Tesla will take responsibility for “something endemic to our design,” but where that responsibility ends and the responsibility of the person sitting in the car begins is, well, unclear to just about everybody right now. So it’s not a knock on Musk that his answer to the indemnity question is a little fuzzy. But if it doesn’t get cleared up eventually, it will be a knock on people’s trust in autonomous systems and the companies that provide them.