A Tesla Model S with the Autopilot system activated was involved in a fatal crash, the first known fatality in a Tesla where Autopilot was active. The company revealed the crash in a blog post published today and says it informed the National Highway Traffic Safety Administration (NHTSA) of the incident, which is now investigating.

The accident occurred on a divided highway in central Florida when a tractor trailer drove across the highway perpendicular to the Model S. Neither the driver — who Tesla notes is ultimately responsible for the vehicle's actions, even with Autopilot on — nor the car noticed the big rig or the trailer "against a brightly lit sky," and the brakes were not applied. In a tweet, Tesla CEO Elon Musk said that the vehicle's radar didn't help in this case because it "tunes out what looks like an overhead road sign to avoid false braking events."

Because of the high ride-height of the trailer, as well as its positioning across the road, the Model S passed under the trailer and the first impact was between the windshield and the trailer. Tesla writes that if the car had impacted the front or rear of the trailer, even at high speed, the car’s safety systems "would likely have prevented serious injury as it has in numerous other similar incidents."

"Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert."

The accident occurred May 7th in Williston, Florida, with 40-year-old Ohio resident Joshua Brown driving. The truck driver was not injured.

Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. The NHTSA investigation, Tesla says, is a "preliminary evaluation" to determine if the Autopilot system was working properly, which can be a precursor to a safety action like a recall.

Our condolences for the tragic loss https://t.co/zI2100zEGL — Elon Musk (@elonmusk) June 30, 2016

In the blog post, Tesla reiterates that customers must agree the system is in a "public beta phase" before they can use it. The system was designed with the expectation that drivers keep their hands on the wheel and "maintain control and responsibility" for the vehicle. Safety-critical vehicle features rolled out in public betas are new territory for regulators, and rules haven't been set.

The first fatality in a Tesla in Autopilot mode

Some autonomous driving experts have criticized Tesla for introducing the Autopilot feature so early, with a Volvo engineer saying the system "gives you the impression that it's doing more than it is." In other words, the car handles most situations so smoothly that drivers are led to believe that the car can handle any situation it might encounter. That is not the case, and the driver must remain responsible for the actions of the vehicle, even with Autopilot active. Several automakers working on systems similar to Autopilot — GM with Super Cruise, for instance — have only tested the feature privately and have said they won't deploy until they're ready.

Volvo has said that it will take full legal liability for all its cars when they are operating in fully autonomous mode, and plans to launch a limited trial of its autonomous Drive Me technology next year.

NHTSA issued the following statement to The Verge: