This belief that technical accomplishment in areas of far-reaching impact excuses almost any behavior is not unique to Levandowski's coworkers, the Google Self-Driving Car Project, or even Google. Buried in Silicon Valley's lionization of "the crazy ones," established as high-tech orthodoxy by Apple's memorable "think different" ad, is an implicit ends-justify-the-means morality that has simmered even as the tech sector has accumulated unprecedented financial and cultural clout. The logic isn't even unique to The Valley, as badly-behaved rainmakers have enjoyed protection and prestige everywhere from Wall Street to Washington DC to Hollywood, but the technology business has gone further than anywhere else in embracing and normalizing an explicit tradeoff between genius and rectitude. The case of Andy Rubin, the "father of Android" whom Google paid some $90 million while protecting him from sexual harassment allegations, shows just how deeply embedded this tradeoff is at Silicon Valley's biggest companies.

There may even be good reasons for this moral ambivalence, especially in areas like software that depend on breakthroughs made by sheer force of individual talent. But as technology companies have moved into businesses and functions whose actions carry deeper public consequences, this amorality is increasingly being confronted, as witnessed by the controversy over Facebook's role in foreign manipulation of the 2016 election. It also needs to be far more thoroughly confronted if the high-tech sector wishes to bring its uniquely freewheeling culture into mobility, particularly automobility, where a lax sense of morality can have life-and-death consequences.

This was made abundantly clear in 2018, when Uber's steroidal version of Silicon Valley's "move fast and break things" culture combined with a venture capital-fueled "race to autonomy" to contribute to the unnecessary and tragic death of Elaine Herzberg. "If it is your job to advance technology, safety cannot be your No. 1 concern," Levandowski said of AV development, echoing utilitarian sentiments expressed by Elon Musk, Lex Fridman, and other luminaries of the AV world who seem to see human deaths as the inevitable and worthwhile price of autonomous drive capabilities. Given that the people calling for social acceptance of road deaths in the name of AV development have so much to gain, financially and otherwise, from the race to autonomy, Levandowski's intransigence raises a tough question: how much are these moral mavericks willing to sacrifice in the pursuit of technology that may (or may not) change the world but will definitely enhance their own social and financial capital?

Just as Levandowski is reaping the bitter harvest of his belief that the ends of autonomous drive technology justify any means, the other exponents of this utilitarian approach are already finding that it is poorly adapted to the life-and-death stakes of public roads. Uber's rush to rack up testing miles contributed to the Herzberg fatality, which set back its entire testing and development program, rippled through the broader sector, and resulted in a lawsuit against Arizona's governor and the city of Tempe, both of which had embraced a laissez-faire attitude toward regulation. Software professional Blaine Osepchuk has raised a number of serious ethical issues he sees for programmers working on Tesla's Autopilot, which has seen intense staff turnover since Autopilot deaths started to be reported.

This feedback loop explains the sector's collective step back from the ambitious goals that characterized the "race to autonomy," but it would be a mistake to assume that Silicon Valley is done confronting its problematic relationship with its most gifted rockstars. At some companies the lesson seems to have sunk in: for example, when the idea of "ninjas" or "rockstars" came up in a conversation I had with three of Aurora's engineers as part of the Autonocast, all three bridled at it. Building a culture of modest and sober responsibility is not easy in what is still one of the hottest and most talent-starved sectors in tech, but any short-term sacrifices will prove to have been well worth it when it finally comes time to ask the public to trust the safety of robotic chauffeurs.

Levandowski's morality play will have other impacts on the autonomous drive technology world, including on unresolved ambiguities around talent poaching that have festered in The Valley for years, but its most important lesson is fundamentally about culture. Now that few doubt that autonomous science projects can share public roads, AV development from here on out is fundamentally a pursuit of safety and trust. The risk now is that Levandowski is seen as such an aberrant outlier that the rest of the sector fails to confront the utilitarian cultural inheritance that infects the entire high-tech sector, leaving more Levandowskis ticking away as time bombs that could damage or even destroy the utopian promise of self-driving technology.