As part of its radical effort to flood the planet’s roads with self-driving cars, Google this year unveiled a prototype without a steering wheel, accelerator, or brake pedal. Technology advocates were awed. Not so the California Department of Motor Vehicles, which insisted it would not allow the cars to be tested on the state’s roads unless the humans inside could take control of them.

You could call it an example of clueless bureaucracy standing athwart the path of progress. But you’d be picking a fight with Nicholas Carr, author of “The Glass Cage: Automation and Us,” a sobering new analysis of the hazards of intelligent technology.

Carr, former executive editor of the Harvard Business Review, made a name for himself through a 2008 Atlantic magazine article warning that Google’s Internet search service was making us all morons. He surely had a point. Access to instant information on any topic tempts us to believe we know more than we really do. At the same time, it discourages any effort at deeper study. When we need to know more, we’ll Google it. Till then, where’s the remote?

By examining the fallout of a series of technological advances, Carr makes the case that millions of us will have lots of time for television, as machines are becoming smart enough to do the high-skill jobs we once believed “computer-proof.” And he fears that all of us will see our skills eroded, our intelligence debased, and our work devalued, if we sacrifice human responsibility to black boxes full of microchips.

Of course, technology’s been making less-skilled jobs obsolete ever since the Industrial Revolution. But these days, machines are coming for the well-paying jobs we perform in air-conditioned offices, in clean, well-lighted factories, or in the cockpits of commercial aircraft.

Yes, there are benefits, but plenty of costs as well. Millions of Americans are out of work, and those who have jobs haven’t seen a raise in years. Carr, along with quite a few economists, argues that automation bears part of the blame. It’s hard to compete with a computer.

But our rush to automate imposes other penalties. Even the smartest and most diligent workers can be dumbed down by the digital tools meant to assist them. Carr tells of a British research study of radiologists who used software to scan mammograms for evidence of breast cancer. The doctors were good at spotting obvious tumors, because the software picked them right up. But they were lousy at spotting subtler cases, apparently because they’d come to rely too much on the computer, and not enough on their own eyes and brains.

And of course, there’s aviation, one of the most automated professions on earth. Today’s planes can literally fly themselves, and very often do; the pilots just watch the gauges, ready to take over if something goes wrong.

But are they ready? Carr takes a look at two commercial airline crashes in 2009 that cost the lives of nearly 300 people. Both were blamed on pilot error. Each highly automated plane had experienced a significant but manageable malfunction. But each flight crew had reacted in exactly the wrong way — as if they’d forgotten how to fly. Excessive reliance on automation may have taken a deadly toll.

Carr declares himself a fan of automation; his gripe is that we’re automating the wrong things. Too many systems take total control of the most crucial parts of a task, leaving people to do little more than stare at blinking lights. “Automation weakens the bond between tool and user,” Carr writes, “not because computer-controlled systems are complex but because they ask so little of us.”

He suggests that automated systems should require humans to participate in vital activities. An aircraft autopilot might require the pilot to manually change the plane’s course, altitude, and speed; a medical diagnostic program might run regular quizzes to teach radiologists to spot unusual cancers. And once self-driving vehicles arrive, we might require their human owners to take the wheel every now and then.

Of course, this kind of automation with a human face would be more costly and time-consuming, making it less likely that businesses will race to embrace it. More likely, we’ll have to tolerate a world of ever smarter machines, operated by ever less capable humans. Not a cheerful prospect, but we can’t say we weren’t warned.

Hiawatha Bray can be reached at hiawatha.bray@globe.com.