Uber did not have a formal safety plan in place when one of its self-driving cars killed a woman in Tempe, Arizona, last year, according to a trove of new documents released by the National Transportation Safety Board on Tuesday. Its autonomous vehicles were not programmed to react to jaywalking pedestrians, and the company's cars had been involved in over three dozen crashes before the one that killed 49-year-old Elaine Herzberg in March 2018.

These new details cast a harsh light on Uber’s self-driving vehicle program, which has tentatively restarted testing after shutting down in the wake of the March 18th crash. And they set the stage for a potentially contentious hearing later this month, when the NTSB will determine the probable cause of the crash.

The more than 400 pages of documents released by the NTSB paint a picture of a company where safety lapses, poor staffing decisions, and technical miscalculations converged in Tempe on a deadly night that has since reverberated throughout the AV industry. Until Herzberg’s death, many companies pursuing self-driving cars were racing to get them on the road as quickly as possible. Now, most operators acknowledge the timeline will be much longer than originally predicted. Still, Uber is likely to avoid serious repercussions, as the local prosecutor on the case has said she is declining to press charges.

Some of what the board is reporting in these new documents was already known. According to a preliminary report on the crash released by the NTSB in May 2018, Uber’s vehicle determined it needed to brake 1.3 seconds before striking Herzberg, but the company had previously disabled the SUV’s factory-installed automatic emergency braking system to prevent erratic vehicle behavior.

Now we also know that the vehicle just wasn’t very good at responding to other road users, especially the most vulnerable ones. According to the NTSB, the software installed in Uber’s vehicles to detect and classify other objects “did not include a consideration for jaywalking pedestrians.” The system did detect Herzberg, who was walking her bike across North Mill Road outside the crosswalk a few minutes before 10PM. But it classified her as “other object,” not as a person.

“As the [automated driving system] changed the classification of the pedestrian several times — alternating between vehicle, bicycle, and an other — the system was unable to correctly predict the path of the detected object,” the board’s report states.
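The report doesn’t say how Uber’s tracker was implemented, but a minimal, hypothetical sketch shows why flip-flopping classifications can keep a path prediction from ever converging. Everything here is assumed for illustration: a tracker that throws away an object’s motion history whenever its class label changes, and a rule that a trajectory needs a few consistent observations before it can be extrapolated.

```python
# Hypothetical sketch (not Uber's actual code): a tracker that discards
# an object's motion history whenever its classification changes.

class TrackedObject:
    def __init__(self):
        self.label = None
        self.history = []  # past observed positions used to predict a path

    def observe(self, label, position):
        if label != self.label:
            # Reclassification: prior motion history is thrown away,
            # so path prediction must start over from scratch.
            self.history = []
            self.label = label
        self.history.append(position)

    def can_predict_path(self, min_points=3):
        # Assume a trajectory needs several consistent observations
        # before it can be extrapolated.
        return len(self.history) >= min_points

obj = TrackedObject()
# Labels alternate the way the NTSB report describes: vehicle,
# bicycle, "other," and back again.
for label, pos in [("vehicle", 0), ("bicycle", 1), ("other", 2),
                   ("bicycle", 3), ("vehicle", 4)]:
    obj.observe(label, pos)

print(obj.can_predict_path())  # False: history never grows long enough
```

With the labels alternating on every frame, the history is wiped each time, so the tracker never accumulates enough points to predict where the pedestrian is headed.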

Uber’s decision to disable the Volvo SUV’s built-in automatic emergency braking system has been flagged as a possible lapse, though safety experts note that disabling it probably made sense to avoid conflicts with the company’s own self-driving system. However, the NTSB investigation revealed that, to avoid false positives, Uber built only a one-second delay between crash detection and action into its system.

Uber’s vehicle detected Herzberg 5.6 seconds before impact, but it failed to brake because it kept misclassifying her. Each time the automated driving system came up with a new classification, it had to calculate a new trajectory for the object. A one-second “action suppression” window was supposed to hand control back to the operator for manual braking. But if the operator failed to act within that one-second window, as happened here, the system was designed to sound an auditory warning that a collision was imminent and begin gradual (but not maximum) braking.
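Put together, the numbers in the NTSB documents describe a simple timeline. A rough sketch, using only the figures reported above (the function and its structure are assumptions, not Uber’s control logic):

```python
# Hypothetical timeline sketch using values from the NTSB report.
# Times are measured in seconds before impact.

FIRST_DETECTION = 5.6     # object first detected 5.6 s before impact
BRAKE_DECISION = 1.3      # system decided braking was needed 1.3 s out
ACTION_SUPPRESSION = 1.0  # window in which the operator was expected to act

def braking_starts(brake_decision, suppression, operator_reacted):
    """Return seconds before impact at which any braking begins."""
    if operator_reacted:
        # Operator brakes manually as soon as the system flags the hazard.
        return brake_decision
    # Otherwise the system waits out the suppression window, then sounds
    # an alert and begins gradual (not maximum) braking.
    return round(brake_decision - suppression, 1)

# In Tempe, the operator did not react within the window:
print(braking_starts(BRAKE_DECISION, ACTION_SUPPRESSION, False))  # 0.3
```

Under those assumptions, gradual braking could begin only about 0.3 seconds before impact, far too late to matter, despite the pedestrian having been detected 5.6 seconds out.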

In the months after the crash, Uber dropped action suppression and now applies maximum emergency braking to prevent crashes. Under this new setup, Uber says, the vehicle would have braked four seconds early, which implies that it would have avoided killing Herzberg.

The March 18th incident in Tempe wasn’t the first time Uber’s self-driving cars had been involved in a crash. Between September 2016 and March 2018, Uber’s autonomous vehicles were involved in 37 “crashes and incidents” while in autonomous mode, the board reports. But Uber’s cars were the “striking vehicle” in only two of those crashes; the majority involved another vehicle striking the autonomous car (33 such incidents; 25 of them were rear-end crashes, and in eight crashes, Uber’s test vehicle was sideswiped by another vehicle).

There were two incidents in which Uber’s vehicle was more or less at fault. In the first, the vehicle struck a bent bicycle-lane bollard that partially occupied its lane of travel. In the second, the safety driver took control to avoid a rapidly approaching oncoming vehicle that had entered the ATG vehicle’s lane of travel, and in doing so struck a parked car. The NTSB also reports two incidents in which Uber’s vehicles were damaged by passing pedestrians while stopped.

There was also a lack of adequate safety planning by Uber in advance of the fatal crash, the board states. Uber’s Advanced Technologies Group (ATG) had a technical system safety team, “but did not have a standalone operational safety division or safety manager,” the board states. “Additionally, ATG did not have a formal safety plan, a standardized operations procedure (SOP) or guiding document for safety.”

Uber argues that it did have safety policies, procedures, and engineering practices that, in aggregate, could be considered a safety plan, but it acknowledges not having a formal plan in place at the time of the crash. To be sure, there is no federal rule requiring AV operators to have or submit safety plans to the government; there are only voluntary guidelines. Uber released its first safety report in November 2018.

The NTSB documents also contain notes from an interview with Uber safety driver Rafaela Vasquez. In the interview, she states that Uber’s decision to reduce the number of safety operators in each vehicle from two to one “corresponded with the change in Uber CEOs,” adding that it “seemed to be more a policy decision than an advancement in the technology.”

Dara Khosrowshahi took over as CEO of Uber after Travis Kalanick was ousted in late 2017. At the time, Khosrowshahi was reportedly considering ending the self-driving program but ultimately decided against it. Uber says the decision to reduce the number of safety operators from two to one in some vehicles preceded Khosrowshahi’s arrival as CEO. Since the crash, the company has gone back to having two operators in each vehicle during testing.

“We regret the March 2018 crash involving one of our self-driving vehicles that took Elaine Herzberg’s life,” a spokesperson said in response to the NTSB documents. “In the wake of this tragedy, the team at Uber ATG has adopted critical program improvements to further prioritize safety. We deeply value the thoroughness of the NTSB’s investigation into the crash and look forward to reviewing their recommendations once issued after the NTSB’s board meeting later this month.”