



Jeff Swensen / Getty Images

Human drivers were forced to take control of Uber's self-driving cars about once per mile driven in early March during testing in Arizona, according to an internal performance report obtained by BuzzFeed News. The report reveals for the first time how Uber’s self-driving car program is performing, using a key metric for evaluating progress toward fully autonomous vehicles.

Human drivers take manual control of autonomous vehicles during testing for a number of reasons: to address a technical issue, for example, or to avoid a traffic violation or collision. The self-driving car industry refers to such events as “disengagements,” though Uber uses the term “intervention” in the performance report reviewed by BuzzFeed News. During a series of autonomous tests the week of March 5, Uber saw disengagement rates greater than those publicly reported by some of its rivals in the self-driving car space.

When regulatory issues in December 2016 forced Uber to suspend a self-driving pilot program in San Francisco, the company sent some of its cars to Arizona. Since then, Uber has been testing its autonomous cars along two routes in the state. The first is a multi-lane street called Scottsdale Road, a straight, 24-mile stretch that runs through the city of the same name.

According to Uber's performance report on tests for the week of March 5, the company's self-driving cars were able to travel an average of 0.67 miles on Scottsdale Road without human intervention and an average of 2 miles without a “bad experience,” Uber’s classification for incidents in which a car brakes too hard, jerks forcefully, or behaves in a way that might startle passengers. Uber described the overall passenger experience for this particular week as "not great," but noted improvement compared to the prior week's tests, which included one "harmful" incident, an event that might have caused human injury.
Uber has also been testing its autonomous vehicles on a "loop" at Arizona State University. According to the performance report reviewed by BuzzFeed News, self-driving cars used on the ASU loop saw “strong improvement” during the week of March 5, traveling a total of 449 miles in autonomous mode without a “critical” intervention (a case where the system kicked control back to the driver, or the driver regained control to prevent a likely collision).

The vehicles were able to drive an average of 7 miles without a "bad experience" that might cause passenger discomfort (a 22% improvement over the week prior) and an average of 1.3 miles without any human intervention (a 15% improvement over the week prior). The cars made 128 trips with passengers, compared to 81 the prior week.

Uber told BuzzFeed News its disengagement count could also include instances when the system kicks control back to a driver, and when the car returns control to a human driver toward the end of a trip. The company declined to comment on the internal metrics obtained by BuzzFeed News or on how its disengagement rate compares to those of competitors. Uber also declined to say how many miles and hours the vehicles in Arizona drove in total during the week of March 5.

"To take out the safety drivers, you would want far better performance than these numbers suggest."

Bryant Walker Smith, a University of South Carolina law professor and a member of the US Department of Transportation's Advisory Committee on Automation in Transportation, said it’s difficult to draw conclusions about the progress of Uber’s self-driving car program based on just one week of disengagement metrics. He added that the figures suggest safety drivers appear to intervene regularly out of caution, even in cases where an accident may not be imminent.

“To take out the safety drivers, you would want far better performance than these numbers suggest, and you’d want that to be consistently better performance,” Walker Smith said. “If these are actual bad experiences for someone inside the vehicle, then that probably doesn’t compare very favorably to human driving. How often do people go 10 miles or 10 minutes and have a viscerally bad experience?”

Uber’s internal metrics are specific to its vehicles in Arizona. The state does not require companies testing there to release data on how their self-driving cars perform. California is the only state that requires companies that test self-driving cars on public roads to submit annual reports detailing how many times the cars “disengage” from autonomous mode. Because Uber only returned some self-driving vehicles to San Francisco’s roads this month, after its trials were shut down in the state in December for not obtaining the proper permits, it has not yet submitted a public report.

But reports submitted by other companies to the California DMV do offer a point of comparison. Alphabet’s Waymo said in a Jan. 5 report filed with the CA DMV that during the 636,000 miles its self-driving vehicles drove on public roads in California from December 2015 through November 2016, human drivers were forced to take control 124 times. That’s a rate of 0.2 disengagements per thousand miles, or roughly 0.0002 interventions per mile.
Uber’s figures of 0.67 and 1.3 miles between interventions, on Scottsdale Road and the ASU loop respectively, equate to per-mile disengagement rates of roughly 1.49 and 0.77. But Waymo’s report also notes that its figures don’t include all disengagements: “As part of testing, our cars switch in and out of autonomous mode many times a day. These disengagements number in the many thousands on an annual basis though the vast majority are considered routine and not related to safety.” (As a comparison to Uber’s testing the week of March 5 in Arizona, here are the CA DMV reports from other companies that tested on public roads in California and reported their statistics to the DMV for December 2015 through November 2016.)

Uber CEO Travis Kalanick has called self-driving cars an “existential threat” to his ride-hail business. (If a competitor were to develop autonomous vehicles and run an Uber-like service that did not require giving a cut to drivers, the rides would be cheaper.) In February 2015, Uber poached dozens of top roboticists from Carnegie Mellon University to jump-start a self-driving car program. Eighteen months later, Uber launched a pilot program in Pittsburgh that put passengers in the backseats of cars manned by a safety driver and a “copilot” riding shotgun. “Your self-driving Uber is arriving now,” the company wrote on its website. Headlines called it a “landmark” trial, and “the week self-driving cars became real.”

Uber’s self-driving program is quarterbacked by Anthony Levandowski, who helped build Google’s first self-driving car (that program is now called Waymo) before leaving to create his own startup, Otto. Uber acquired Otto in August 2016, about three months after Levandowski launched the company out of stealth mode, and Levandowski became the self-driving program’s fourth leader in less than two years. The program is now embroiled in a lawsuit from Alphabet over allegations that Levandowski stole a crucial part of Waymo’s self-driving technology before leaving.
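The rate comparison above is a simple reciprocal: dividing 1 by the average miles between interventions gives interventions per mile. A short sketch in Python, using the figures cited in the article (the helper function and variable names are illustrative, not from any Uber or DMV tooling):

```python
# Convert "average miles between interventions" into interventions per mile,
# using the numbers reported in the article.

def per_mile_rate(miles_between_interventions: float) -> float:
    """Interventions per mile, given average miles between interventions."""
    return 1.0 / miles_between_interventions

# Uber, week of March 5 (average miles between human interventions):
scottsdale_rate = per_mile_rate(0.67)  # roughly 1.49 interventions per mile
asu_loop_rate = per_mile_rate(1.3)     # roughly 0.77 interventions per mile

# Waymo, Dec 2015 through Nov 2016, per its CA DMV filing:
waymo_rate = 124 / 636_000             # roughly 0.0002 interventions per mile

print(f"Scottsdale Road: {scottsdale_rate:.2f} per mile")
print(f"ASU loop:        {asu_loop_rate:.2f} per mile")
print(f"Waymo (CA DMV):  {waymo_rate:.4f} per mile")
```

Note that Waymo's published rate counts only safety-related disengagements over a much larger mileage total, so the figures are not strictly apples to apples.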
Kalanick described his relationship with Levandowski as “brothers from another mother,” saying the pair share a desire to move autonomous technology from the research phase to the market. A few weeks after the Pittsburgh pilot launched, Levandowski set an ambitious new goal for Uber’s engineers, according to an internal planning document viewed by BuzzFeed News: prepare self-driving cars to run with no humans behind the wheel in San Francisco by January 2017.

In the end, in response to concerns raised by engineers who worried the goal was too aggressive, Uber did something far less ambitious. In December 2016, it launched a trial in San Francisco that mirrored its Pittsburgh pilot program: a human safety driver, accompanied by a “copilot,” would man each self-driving Volvo on the road. On its first day, one of the vehicles was caught running a red light. Uber attributed the traffic violation to human error, but the New York Times reported in February that “the self-driving car was, in fact, driving itself when it barreled through the red light.”

“When they let us know they were doing the test, we kind of had to play catch-up because nobody had ever asked us that question before.”