On March 10, an Ethiopian Airlines plane crashed shortly after takeoff, killing all 157 people aboard. The model of Boeing jet that crashed was also involved in another fatal crash less than six months ago, and several countries have grounded their own Boeing 737 Max 8 and 737 Max 9 jets as a safety precaution. Now President Donald Trump has weighed in, announcing an emergency order from the Federal Aviation Administration grounding Boeing 737 Max jets. He also tweeted that “airplanes are becoming too complex to fly” and implied that technological advancements are not as safe as experienced pilots.

Airplanes are becoming far too complex to fly. Pilots are no longer needed, but rather computer scientists from MIT. I see it all the time in many products. Always seeking to go one unnecessary step further, when often old and simpler is far better. Split second decisions are.... — Donald J. Trump (@realDonaldTrump) March 12, 2019

But the truth is much more nuanced than this.

In her classic 1942 memoir “West with the Night,” Beryl Markham, an Englishwoman turned African bush pilot and transatlantic flyer back in the adventurous era of aviation, wrote about what she saw as flying’s eventual future:

By then men will have forgotten how to fly; they will be passengers on machines whose conductors are carefully promoted to a familiarity with labeled buttons, and in whose minds the knowledge of the sky and the wind and the way of the weather will be as extraneous as passing fiction.

The future of aviation Markham foretold is now here.

The machines we call airplanes today are a far cry from what aviators like Markham operated. To hand-fly a jet airliner across the country at 37,000 feet would be unthinkable. The airplane and its digital flight control system are one and the same. We talk about “autopilot,” but that homely old word, with its suggestion of whirring gyroscopes and twitching servo-motors, falls far short of describing what goes on inside the brains of a Boeing or an Airbus. Increasingly, to fly means to issue instructions to a computer. The computer in turn operates the airplane’s controls. In rare but fearsome cases, it may turn on the crew like the mutinous HAL of 2001 fame.

Flight control computers are supposed to enhance safety by protecting the airplane from erroneous pilot inputs; at the same time, the human crew is supposed to protect the airplane from misbehavior by its electronics and sensors. When the autopilot fails, the reasoning goes, the human crew will take over and hand-fly the airplane.

Faults in electronic systems are not uncommon, and crews almost always handle them skillfully; the public never hears about these events. It is only when a disaster occurs that the public becomes aware of the intricate interdependence of human crews, fragile sensors and opaque computer code.

The failure of the automatic system may be so sudden, or its misbehavior so severe or so ambiguous, that a crew is paralyzed by bafflement. The combination of startle and fear can render pilots — yes, even “great flying professionals,” in Trump’s phrase — incapable of solving puzzles that they would easily figure out in the comfort of an easy chair at home.

That is what happened when Air France 447 plunged into the Atlantic in 2009; when an AirAsia flight from Indonesia to Singapore went into the Java Sea in 2014 under almost identical circumstances; and when the Lion Air flight last year first called attention to the Boeing 737 Max. It may have been what happened to the Ethiopian Boeing; we don’t yet know enough to say.

After the Lion Air crash, Boeing was roundly criticized for having failed to clearly inform pilots about a new software-based stall protection system. In aviation circles, there was also surprise at the obtuseness of a design that could evidently send an airliner diving into the ground if a single sensor malfunctioned.

Given the immense publicity surrounding the Lion Air crash, it’s hard to imagine that there could still be 737 Max pilots anywhere on earth who were not aware of the so-called MCAS system and of the simple action — basically, flipping two switches to shut off the electric trim system — required to defeat it.

Still, the fact that the Ethiopian airplane gained very little altitude but a great deal of speed suggests that whatever happened to it may have been similar to what happened to Lion Air. Why, then, did the crew not do what by now every pilot knows must be done?

It is perhaps impossible to understand if you have never experienced how it feels to be the tiny master of a huge machine when that machine suddenly, without warning, begins a suicidal dive. In these situations, your only recourse is to wrestle with it with all your strength while a blur of earth screams past at 400 miles an hour, just a few wing-lengths away, and your brain seems to have turned into a thick molasses through which thoughts struggle to move.

When humans and automatic systems work together in harmony, as they usually do, the relationship between man and machine feels seamless. But when a breach appears, we discover that the interface between them is full of pitfalls.

As a first step, designers of automatic systems must recognize that they cannot expect mere humans to react to trouble with the speed, precision and clarity of digital computers. We just aren’t made that way. But it’s a romantic fantasy to think that “great pilots” are the solution. Great pilots make great mistakes too.