McDonald’s and Walmart, the two biggest private-sector employers in the U.S., don’t pay their workers much. This more or less eternal truth is making one of its increasingly frequent appearances in the news this week. McDonald’s is catching flak for a “sample monthly budget” for employees that sets aside $20 a month for health insurance and no money at all for heat. (Hey, it’s July.) Walmart, meanwhile, is threatening to cut back on plans to open stores in Washington, D.C., after the D.C. council voted to impose a “super minimum wage” of $12.50 an hour on big retailers.

For decades, most discussions of pay levels and income disparity in the U.S. have been accompanied by a pronounced economic fatalism. Pay is set by the market and the labor market has gone global, the reasoning goes — and when a Chinese or Mexican worker can do what an American can for less, wages have to go down. In explaining what’s happened to autoworkers, say, that story makes some sense (although it doesn’t explain why German autoworkers have for the most part kept their high pay and their jobs while Americans haven’t).

But McDonald’s burger-flippers and Walmart checkout clerks can’t be replaced by overseas workers. Instead, both companies were able to build low pay into their business models from the beginning — McDonald’s because so much of its workforce was made up of living-at-home teenagers who did not in fact have to pay for heat, Walmart because of its roots in small Southern towns where wages were low and “living wage” laws unheard of. Now McDonald’s is increasingly staffed by grownups (teens have gone from 45% of its workforce in the 1990s to 33% recently), while Walmart is trying to conquer the big cities of the North. Both companies have been understandably loath to depart from their low-pay traditions, so conflict and criticism are pretty much inevitable. Which is an extremely healthy development.

That’s because it’s becoming clear that pay levels aren’t entirely set by the market. They are also affected by custom, by the balance of power between workers and employers, and by government regulation. Early economists understood that wage setting was “fundamentally a social decision,” Jonathan Schlefer wrote on HBR.org last year, but their 20th century successors became fixated on the idea of a “natural law” that kept pay in line with productivity. And this idea that wages are set by inexorable economic forces came to dominate popular discourse as well.

Since 1980, though, overall pay and productivity trends have sharply diverged in the U.S. And since the 1990s, research on the impact of minimum wage laws has demonstrated that there clearly is some distance between the textbook version of how wages are set and how they are set in reality. It’s not that minimum wage laws work miracles, but they also don’t have nearly the downward effect on employment levels that a pure supply-and-demand model would predict. Not to mention that decades of research at the organizational and individual level have shown the link between pay and on-the-job performance to be extremely tenuous.

If pay levels at Walmart, McDonald’s, and elsewhere are at least to some extent a societal choice rather than the natural outcome of economic law, it raises a lot of interesting questions. One is whether the doldrums the U.S. economy has found itself in since the early 2000s might have been inflicted, at least in part, by corporate executives committed to keeping labor costs down. In 1914, Henry Ford famously more than doubled wages at his factories, mainly to fight attrition but also so that Ford’s assembly line workers could afford to buy the cars they were making. By that standard, McDonald’s and Walmart are doing okay — their workers can afford to buy their (remarkably inexpensive) products. But Ford’s workers could buy a lot of other things, too, and they and their counterparts at other automakers went on to form the bulwark of a giant new American middle class that helped drive economic growth for decades.

Economic analysis of Ford’s decision has focused on the efficiency gains of paying higher-than-market wages — less turnover and more-productive workers led to higher profits and higher market share, the reasoning goes. That in itself is a big deal. But the even bigger argument, that by raising wages Ford might have led a shift in societal norms that put more money in average Americans’ pockets, thus boosting consumer spending and economic growth, hasn’t had much appeal to mainstream economists in the U.S.

In fact, most of the interest has instead been in how hyperefficient operations like McDonald’s and Walmart boost living standards by delivering their products to consumers at ever-lower cost. A few years ago, Jason Furman, recently tapped to become Chairman of President Obama’s Council of Economic Advisers, argued that Walmart was a “progressive success story” because it had driven retail prices down so much. “Even if you grant that Wal-Mart hurts workers in the retail sector — and the evidence for this is far from clear,” he wrote, “the magnitude of any potential harm is small in comparison.”

It’s a provocative argument, and it might even be right. But it’s unlikely that it’s the whole story. For all its productivity innovations, Walmart has also been a key player in a “race to the bottom” that has tamped down wages and dismantled worker protections in the U.S. in recent decades. It’s at least worth asking if the economy would be better off with a race in the opposite direction.

The most outspoken and visible (visible to me, at least) proponent of this view over the past couple of years has been, interestingly enough, Business Insider editor-in-chief Henry Blodget. As he put it on May Day this year, corporate America’s penchant for putting short-term shareholder interests above those of workers “is actually starving the rest of the economy of revenue growth.” Can Blodget prove this? No. Is it a valid topic for economic research and political debate that ought to be getting more attention? You betcha.