Some argue that it’s because today’s technologies are not nearly as impressive as we think. The leading proponent of that view, Northwestern University economist Robert Gordon, contends that compared with breakthroughs like indoor plumbing and the electric motor, today’s advances are small and of limited economic benefit. Others think productivity is in fact increasing but we simply don’t know how to measure things like the value delivered by Google and Facebook, particularly when many of the benefits are “free.”

Both views probably misconstrue what is actually going on. It’s likely that many new technologies are used simply to replace workers rather than to create new tasks and occupations. What’s more, the technologies that could have the greatest impact are not yet widely used. Driverless vehicles, for instance, are still absent from most roads. Robots are rather dumb and remain rare outside manufacturing. And AI remains a mystery to most companies.

We’ve seen this before. In 1987 MIT economist Robert Solow, who won that year’s Nobel Prize for his work on the role of innovation in economic growth, quipped to the New York Times that “you can see the computer age everywhere but in the productivity statistics.” Within a few years that had changed, as productivity climbed through the mid- and late 1990s.

What’s happening now may be a “replay of the late ’80s,” says Erik Brynjolfsson, another MIT economist. Breakthroughs in machine learning and image recognition are “eye-popping”; the delay in deploying them reflects just how much change putting them to work will entail. “It means swapping in AI and rethinking your business, and it might mean whole new business models,” he says.