One of the old jokes in computing is that what the hardware giveth, the software taketh away.

The biblical vernacular is meant to convey the commandment-like certainty of a set order of things. And the implication of the old saw is that chip-based progress is torrid and the engine of computing innovation, while the messy, unpredictable process of humans writing code is the laggard — the caboose of the innovation train.

In reporting a Sunday Week in Review piece, I was pointed to research that sharply contradicts the conventional wisdom. It did not find its way into the more general article, but the research was intriguing, I thought, and its implication not widely appreciated.

A report by an independent group of science and technology advisers to the White House, published last December, cited research showing that performance gains in computing tasks that come from improvements in software algorithms often far outpace the gains attributable to faster processors.

The rapid improvement in chips, of course, has its own “law” — Moore’s Law, named after the Intel co-founder Gordon Moore, who in 1965 observed that the density of transistors on integrated circuits was doubling regularly, a pace commonly cited as every 18 months or so. Physics, along with ingenuity and investment, has kept that forecast of routine performance-doubling accurate so far.
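As a rough illustration (mine, not the report's), the compounding implied by an 18-month doubling period is easy to sketch. The 15-year horizon below is chosen to match the production-planning study discussed next:

```python
# Moore's Law as stated above: transistor density doubles every 18 months.
DOUBLING_PERIOD_YEARS = 1.5

def density_growth_factor(years: float) -> float:
    """Multiplicative growth in transistor density after `years` of steady doubling."""
    return 2.0 ** (years / DOUBLING_PERIOD_YEARS)

# Fifteen years is ten doubling periods: 2**10 = 1024, i.e. roughly a 1,000x gain.
print(density_growth_factor(15))  # -> 1024.0
```

That factor of about 1,000 is exactly the hardware contribution the study attributes to faster processors over its 15-year span.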

There are no such laws in software. But the White House advisory report cited research, including a study of progress over a 15-year span on a benchmark production-planning task. Over that time, the speed of completing the calculations improved by a factor of 43 million. Of the total, a factor of roughly 1,000 was attributable to faster processor speeds, according to the research by Martin Grötschel, a German scientist and mathematician. The remaining factor of 43,000 was due to improvements in the efficiency of software algorithms.
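A quick back-of-the-envelope check (mine, not part of Grötschel's analysis) shows how those cited factors compose: speedups from independent sources multiply rather than add, and 1,000 times 43,000 is exactly 43 million:

```python
# Figures as cited from the benchmark production-planning study.
total_speedup = 43_000_000   # overall improvement over 15 years
hardware_factor = 1_000      # attributable to faster processors
software_factor = 43_000     # attributable to better algorithms

# Speedups from independent sources compose multiplicatively.
assert hardware_factor * software_factor == total_speedup

# Expressed as equivalent annual compound rates over the 15-year span:
years = 15
hw_rate = hardware_factor ** (1 / years) - 1
sw_rate = software_factor ** (1 / years) - 1
print(f"hardware: ~{hw_rate:.0%}/yr, software: ~{sw_rate:.0%}/yr")
```

Put in compound-growth terms, the software side improved at nearly twice the annual rate of the hardware side over the same period.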

The rate of change in hardware captured by Moore’s Law, experts agree, is an extraordinary achievement. “But the ingenuity that computer scientists have put into algorithms has yielded performance improvements that make even the exponential gains of Moore’s Law look trivial,” said Edward Lazowska, a professor at the University of Washington.

The rapid pace of software progress, Mr. Lazowska added, is harder to measure in algorithms performing nonnumerical tasks. But he pointed to the progress of recent years in artificial intelligence fields like language understanding, speech recognition and computer vision as evidence that the story of the algorithm’s ascent holds true well beyond more easily quantified benchmark tests.

What explains the extraordinary pace of software improvement? The answers are doubtless many and far more complex than I’m able to address here. But I’d suggest two thoughts as high-level explanations.

First, software’s perceived weakness is a strength. It can be messy and chaotic, because it is pure abstraction — a building material without material constraints. That messy, chaotic structure allows more points of entry for innovation.

Second, my first point may be true, but you don’t get hyper-speed innovation in software without plenty of rapid improvement in the underlying hardware as well. There is a lot to the notion of the yin and yang of computing, software and hardware inextricably linked — even if the old saw misstates the relationship between the two.