Tesla on Tuesday escalated its media battle with the family of Apple engineer Walter Huang, who died in Silicon Valley last month when his Model X crashed into a concrete lane divider at high speed with Autopilot, Tesla's driver-assistance system, engaged. In its clearest statement yet, the company argued that Huang, not Tesla, bore responsibility for his death on a Mountain View freeway.

Huang's family has hired an attorney to sue Tesla. In an on-camera interview with local television station ABC 7, Huang's wife, Sevonne, said that prior to his death, Huang had complained to her that the car had a tendency to drive toward the exact traffic barrier that ultimately killed him.

But in a statement to ABC 7 on Tuesday evening, Tesla turned this argument around.

"We are very sorry for the family's loss," Tesla wrote. "According to the family, Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location."

Tesla didn't stop there.

"The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so."

Especially since the 2016 death of Tesla owner Joshua Brown, Tesla has emphasized that Autopilot is a driver-assistance system rather than a full self-driving system. Drivers are warned to keep their hands on the wheel while Autopilot is engaged.

Unsurprisingly, Huang's family hasn't been happy with Tesla's combative response to his death. "It appears that Tesla has tried to blame the victim," the family's lawyer said to ABC 7 on Tuesday.

Tesla's response is an unorthodox one for a manufacturer facing a fatality involving one of its products. Most companies in Tesla's position would have just stopped at "we are very sorry for the family's loss"—perhaps citing the ongoing government investigation or likely litigation as reasons not to comment further. Engaging in a long-running argument with a grieving widow seems unlikely to improve Tesla's image. And federal investigators have already complained about Tesla talking publicly about the crash before the official investigation has concluded.

But Tesla argues that, despite occasional deaths, Autopilot actually saves lives overall. If people get the mistaken impression that Autopilot is unsafe, Tesla says, there's a danger that fewer people will use it—which could lead to more people dying on the roads overall.

"The reason that other families are not on TV is because their loved ones are still alive," Tesla wrote.

There is some evidence for Tesla's argument that Autopilot makes driving safer. A 2017 study by the National Highway Traffic Safety Administration (NHTSA) found that crash rates for Autopilot-enabled Tesla vehicles dropped by 40 percent after the technology was activated. But NHTSA didn't break down the severity of those crashes or tease out which specific functions of the system were responsible for the decrease (for example, automatic emergency braking versus lane keeping). That leaves open the possibility that Autopilot prevents many minor crashes but is less effective at preventing deadly collisions like the one that killed Huang.