When disaster strikes, past experience has conditioned the public to assume that hardware upgrades or software patches will solve the underlying problem. This indomitable faith in technology is hard to challenge—what else solves complicated problems? But sometimes our attempts to banish accidents make things worse.

In his 2013 book, To Save Everything, Click Here, the author Evgeny Morozov argues that “technological solutionism,” the reflex to leave every answer to Silicon Valley, causes us to neglect other ways of addressing problems. In The Glass Cage, published the following year, Nicholas Carr points warily to “deskilling”: the erosion of human operators’ abilities as automation makes those skills unnecessary. On average, automation is safer than error-prone humans, so a typical response to deskilling is “So what?”

The specter of airline pilots losing their manual flying skills, or being stripped of the ability to use them, brings to mind the tragedy of the two Boeing 737 Max crashes. Investigators reviewing the crashes, which killed 189 people in Indonesia and 157 in Ethiopia, have zeroed in on a software problem in the Maneuvering Characteristics Augmentation System, or MCAS. The Max needs MCAS, unlike older 737s such as the 737-800, because its redesign fits larger engines under the wings. Mounted farther forward, the engines tend to push the nose up at high angles of attack, as in a steep climb after takeoff, raising the risk of a stall. MCAS counters by automatically pushing the nose down, and in the process it takes control away from the pilots. A nose-down command helps avert a stall, but too much nose-down has fatal consequences; in both crashes, erroneous readings from a single angle-of-attack sensor appear to have triggered MCAS again and again.
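
To see why one bad sensor could do so much damage, consider a toy version of this kind of flight-control logic. This is a loose sketch in Python, not Boeing’s actual software; every name, threshold, and unit below is invented for illustration.

```python
# A hypothetical caricature of MCAS-style control logic.
# All names, thresholds, and units are invented for illustration.

AOA_STALL_THRESHOLD_DEG = 14.0   # assumed angle-of-attack limit, in degrees
NOSE_DOWN_TRIM_INCREMENT = 0.6   # assumed trim applied per activation

def trim_command(aoa_sensor_deg: float, pilot_trim: float) -> float:
    """Return the stabilizer-trim command for one control cycle."""
    if aoa_sensor_deg > AOA_STALL_THRESHOLD_DEG:
        # The sensor says the nose is dangerously high, so command
        # nose-down trim, overriding the pilots' own input.
        return -NOSE_DOWN_TRIM_INCREMENT
    return pilot_trim  # otherwise, pass the pilots' input through

# A stuck sensor reading 22 degrees triggers nose-down trim on every
# cycle, regardless of what the pilots command.
for _ in range(3):
    print(trim_command(aoa_sensor_deg=22.0, pilot_trim=0.4))
```

Even in this caricature, the flaw is plain: the logic trusts a single reading, so a stuck or faulty sensor produces another nose-down command on every pass through the loop.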

The 737 Max crashes had other, interconnected causes. In Boeing’s case, they reach all the way up to corporate zeal in competing with its rival Airbus, the financial incentive to save money on expensive fuel, and so on. At any rate, a fix for MCAS is now under way. Everyone learned from the mistake, even if the human cost can never be undone.

What makes the Boeing disaster so frustrating is the relative obviousness of the problem in retrospect. Psychologists and economists have a term for this: “hindsight bias,” the tendency to see the causes of past events as obvious and predictable, even when no one saw them coming at the time. Without the benefit of hindsight, the complex causal sequences leading to catastrophe are sometimes impossible to foresee. But in light of recent tragedy, theorists such as Perrow would have us try harder anyway. Trade-offs in engineering decisions demand eternal vigilance against the unforeseen. If some accidents are a tangle of unpredictability, we had better spend more time thinking through our designs and decisions, and factoring in the risks that arise from complexity itself.