None of this is to say that automation and AI aren’t having an important impact on the economy. But that impact is far more nuanced and limited than the doomsday forecasts suggest. A rigorous study of the impact of robots in manufacturing, agriculture, and utilities across 17 countries, for instance, found that robots did reduce the hours of lower-skilled workers—but they didn’t decrease the total hours worked by humans, and they actually boosted wages. In other words, automation may affect the kind of work humans do, but at the moment, it’s hard to see that it’s leading to a world without work. McAfee, in fact, says of his earlier public statements, “If I had to do it over again, I would put more emphasis on the way technology leads to structural changes in the economy, and less on jobs, jobs, jobs. The central phenomenon is not net job loss. It’s the shift in the kinds of jobs that are available.”

McAfee points to both retail and transportation as areas where automation is likely to have a major impact. Yet even in those industries, the job-loss numbers are less scary than many headlines suggest. Goldman Sachs just released a report predicting that autonomous cars could ultimately eliminate driving jobs at a rate of 300,000 a year. But that won’t happen, the firm argues, for another 25 years, which is more than enough time for the economy to adapt. A recent study by the Organization for Economic Cooperation and Development, meanwhile, predicts that 9 percent of jobs across 21 countries are under serious threat from automation. That’s a significant number, but not an apocalyptic one.

Granted, there are much scarier forecasts out there, like that University of Oxford study. But on closer examination, those predictions tend to assume that if a job can be automated, it will be fully automated soon—which overestimates both the pace and the completeness with which automation is actually adopted. History suggests that the process is much more uneven than that. Take the ATM, a textbook example of a machine designed to replace human labor. First introduced around 1970, ATMs hit widespread adoption in the late 1990s. Today there are more than 400,000 ATMs in the US. But, as economist James Bessen has shown, the number of bank tellers actually rose between 2000 and 2010. That’s because even though the average number of tellers per branch fell, ATMs made it cheaper to open branches, so banks opened more of them. True, the Department of Labor does now predict that the number of tellers will decline by 8 percent over the next decade. But that’s 8 percent—not 50 percent. And it comes 45 years after the machine that was supposed to replace them made its debut. (Taking a wider view, Bessen found that of the 271 occupations listed on the 1950 census, only one—elevator operator—had been rendered obsolete by automation by 2010.)

Of course, if automation is happening much faster today than it did in the past, then historical statistics about simple machines like the ATM would be of limited use in predicting the future. Ray Kurzweil’s book The Singularity Is Near (which, by the way, came out 12 years ago) describes the moment when a technological society hits the “knee” of an exponential growth curve, setting off an explosion of mutually reinforcing new advances. Conventional wisdom in the tech industry says that’s where we are now—that, as futurist Peter Nowak puts it, “the pace of innovation is accelerating exponentially.” Here again, though, the economic evidence tells a different story. In fact, as a recent paper by Lawrence Mishel and Josh Bivens of the Economic Policy Institute puts it, “automation, broadly defined, has actually been slower over the last 10 years or so.” And lately, the pace of microchip advancement has started to lag behind the schedule dictated by Moore’s law.

Corporate America, for its part, certainly doesn’t seem to believe in the jobless future. If the rewards of automation were as immense as predicted, companies would be pouring money into new technology. But they’re not. Investments in software and IT grew more slowly over the past decade than in the previous one. And capital investment, according to Mishel and Bivens, has grown more slowly since 2002 than in any other postwar period. That’s exactly the opposite of what you’d expect in a rapidly automating world. As for gadgets like Pepper, total spending on all robotics in the US was just $11.3 billion last year. That’s about a sixth of what Americans spend every year on their pets.