That’s not the half of it.

A few years ago, this kind of technological development would have been treated as unadulterated good news: an opportunity to improve the nation’s health and standard of living while perhaps even reducing health care costs and achieving a leap in productivity that would cement the United States’ pre-eminent position on the frontier of technology.

But a growing pessimism has crept into our understanding of the impact of such innovations. It’s an old fear, widely held since the time of Ned Ludd, the perhaps apocryphal Englishman said to have smashed two mechanical knitting machines in the late 18th century. His name was later adopted by the Luddites, whose early-19th-century revolt against the machinery of the textile industry was among the first organized protests against technological change.

In its current incarnation, though, the fear is actually very new. It strikes against bedrock propositions developed over more than half a century of economic scholarship. It can be articulated succinctly: What if technology has become a substitute for labor, rather than its complement?
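To make the distinction concrete, in standard textbook notation rather than anything the economists quoted here use, one can write output as a function of machines and labor; the two are complements when adding machines raises labor’s marginal product, and substitutes when adding machines lowers it:

```latex
% A minimal sketch in conventional production-function notation
% (an illustration in standard economics terms, not a formulation
% taken from this article). Output is Y = F(K, L), where K is the
% stock of machines and L is the labor employed.
\[
\frac{\partial^{2} F}{\partial K\,\partial L} > 0
\quad \text{machines and labor are complements;}
\qquad
\frac{\partial^{2} F}{\partial K\,\partial L} < 0
\quad \text{machines substitute for labor.}
\]
```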

As J. Bradford DeLong, a professor of economics at the University of California, Berkeley, wrote recently, throughout most of human history every new machine that took over work once performed by a person’s hands and muscles increased the demand for complementary human skills, the kind supplied by eyes, ears or brains.

But, Mr. DeLong pointed out, no law of nature ensures this will always be the case. Some jobs, such as nannying or waiting tables, may always require lots of people. But as information technology creeps into occupations that have historically relied mostly on brainpower, it threatens to leave many fewer good jobs for people to do.