The Politically Incorrect Guide to the Great Depression and the New Deal

Since late 2007, more and more commentators have drawn parallels between our current financial crisis and the Great Depression. Nobel laureates and presidential advisors confidently proclaim that it was Herbert Hoover’s laissez-faire penny pinching that exacerbated the Depression, and that the American economy was saved only when FDR boldly ran up enormous deficits to fight the Nazis. But as I document in my new book, The Politically Incorrect Guide to the Great Depression and the New Deal, this official history is utterly false.

Let’s first set the record straight on Herbert Hoover’s fiscal policies. Contrary to what you have heard and read over the last year, Hoover behaved as a textbook Keynesian after the stock market crash. He immediately cut income tax rates by one percentage point (applicable to the 1929 tax year) and began ratcheting up federal spending, increasing it 42 percent from fiscal year (FY) 1930 to FY 1932.

But to truly appreciate Hoover’s Keynesian bona fides, we must realize that this enormous jump in spending occurred amidst a collapse in tax receipts, due both to the decline in economic activity and to the price deflation of the early 1930s. This combination led to unprecedented peacetime deficits under the Hoover Administration—something FDR railed against during the 1932 campaign!

How big were Hoover’s deficits? Well, his predecessor Calvin Coolidge had run a budget surplus every single year of his own presidency, and he held the federal budget roughly constant despite the roaring prosperity (and surging tax receipts) of the 1920s. In contrast to Coolidge—who was a true small-government president—Herbert Hoover managed to turn his initial $700 million surplus into a $2.6 billion deficit by 1932.

It’s true, that doesn’t sound like a big number today; Henry Paulson handed out more to bankers by breakfast. But keep in mind that Hoover’s $2.6 billion deficit occurred because he spent $4.6 billion while taking in only $2 billion in tax receipts. Thus, relative to the rest of the budget, the 1932 deficit was astounding: Hoover borrowed $1.30 for every dollar he collected in taxes. Scaled up to federal receipts in 2007, that would translate into a $3.3 trillion deficit (instead of the actual deficit of $162 billion that year). For another angle, I note that Hoover’s 1932 deficit was 4 percent of GDP, hardly the record of a Neanderthal budget cutter.
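For readers who want to check the scaling, here is a back-of-the-envelope sketch in Python. The 1932 figures come from the text; the FY 2007 receipts figure (roughly $2.57 trillion) is my own assumption, used only to reproduce the comparison:

```python
# Back-of-the-envelope check of Hoover's FY 1932 deficit.
# 1932 amounts in billions of dollars (from the text);
# 2007 receipts in trillions (an assumed figure, ~$2.57T).

spending_1932 = 4.6   # federal outlays
receipts_1932 = 2.0   # federal tax receipts

deficit_1932 = spending_1932 - receipts_1932          # ~$2.6 billion

share_of_spending = deficit_1932 / spending_1932      # deficit as share of outlays
ratio_to_receipts = deficit_1932 / receipts_1932      # borrowed $1.30 per $1 collected

receipts_2007 = 2.57  # assumption: FY 2007 federal receipts, in trillions
scaled_deficit_2007 = ratio_to_receipts * receipts_2007

print(f"1932 deficit: ${deficit_1932:.1f} billion")
print(f"Share of 1932 spending: {share_of_spending:.0%}")
print(f"Scaled to 2007 receipts: ${scaled_deficit_2007:.1f} trillion")
```

Running the numbers, the deficit comes to more than half of all 1932 spending, and scaling the deficit-to-receipts ratio by 2007 receipts lands at roughly $3.3 trillion.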

The real reason unemployment soared throughout Hoover’s term was not his aversion to deficits, or his infatuation with the gold standard. No, the one thing that set Hoover apart from all previous US presidents was his insistence that big businesses not cut wage rates in response to the economic collapse. Hoover held the faulty notion that workers’ purchasing power was the source of an economy’s strength, and so it seemed to him that businesses laying off workers and slashing paychecks in response to slackening demand would only set a vicious cycle in motion.

The results speak for themselves. During the heartless “liquidationist” era before Hoover, depressions (or “panics”) were typically over within two years. Yes, it was surely no fun for workers to see their paychecks shrink quite rapidly, but it ensured a quick recovery and in any event the blow was cushioned because prices in general would fall too.

So what was the fate of the worker during the allegedly compassionate Hoover era, when “enlightened” business leaders maintained wage rates amidst falling prices and profits? Well, Econ 101 tells us that a higher price leads to a smaller quantity purchased, and the wage is simply the price of labor. Because workers’ “real wages” (i.e., nominal pay adjusted for price deflation) rose more quickly in the early 1930s than they had even during the Roaring Twenties, businesses couldn’t afford to hire as many workers. That’s why unemployment rates shot up to an inconceivable 28 percent by March 1933.
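To make the real-wage mechanism concrete, here is a small illustration; the wage and price numbers are hypothetical, chosen only to show how frozen paychecks plus deflation raise the real cost of labor:

```python
# Hypothetical illustration of real wages under deflation.
# Numbers are made up for the example, not historical data.

nominal_wage = 100.0       # dollars per week, held fixed per Hoover's policy
price_index_start = 1.00   # price level before the deflation
price_index_later = 0.80   # prices have fallen 20 percent

real_wage_start = nominal_wage / price_index_start   # 100.0
real_wage_later = nominal_wage / price_index_later   # 125.0

# The paycheck is unchanged, yet labor is now 25% more expensive
# in real terms, so employers demand fewer workers.
increase = real_wage_later / real_wage_start - 1
print(f"Real wage increase: {increase:.0%}")
```

The employer sees the same $100 paycheck, but since everything he sells now fetches 20 percent less, that paycheck costs him 25 percent more in real terms.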

“This is all very interesting,” the skeptical reader might say, “but it’s undeniable that the huge spending of World War II pulled America out of the Depression. So it’s clear Herbert Hoover didn’t spend enough money.”

Ah, here we come to one of the greatest myths in economic history, the alleged “fact” that US military spending fixed the economy. In my book I relied very heavily on the pioneering revisionist work of Bob Higgs, who has shown in several articles and books that the US economy was mired in depression until 1946, when the federal government finally relaxed its grip on the country’s resources and workers.

Sure, unemployment rates dropped sharply after the US began drafting men into the armed forces. Is that so surprising? By the same token, if Obama wanted to reduce unemployment today, he could take two million laid-off workers, equip them with arm floaties, and send them to fight pirates. Voila! The unemployment rate would fall.

The official government measures of rising GDP during the war years are also misleading. GDP figures include government spending, and so the massive military outlays were lumped into the numbers, even though $1 million spent on tanks is hardly the same indication of true economic output as $1 million spent by households on cars.

On top of that distortion, Higgs reminds us that the government instituted price controls during the war. Normally, if the Fed prints up a bunch of money to allow the government to buy massive quantities of goods (such as munitions and bombers, in this case), the CPI would go through the roof. Then when the economic statisticians tabulated the nominal GDP figures, they would adjust them downward because of the hike in the cost of living, so that “inflation adjusted” (real) GDP would not look as impressive. But this adjustment couldn’t occur, because the government made it illegal for the CPI to go through the roof. So those official measures showing “real GDP” rising during World War II are as phony as the Soviet Union’s announcements of industrial achievements.

If the Keynesians rely on bad economic theory, and misleading history, to justify their calls for huge government deficits, the Chicago School monetarists are hardly better when they call for interest rates at zero percent (or even negative!) and blame the Depression on the Fed’s lack of willpower.

In doing research for the book, I noticed something striking: from the time the New York Fed opened its doors in November 1914 all the way through 1931, its record-low rates came at the very end of that period. The “discount rate” was the interest rate the Fed banks would charge on collateralized loans made to member banks. The New York Fed’s rate had bounced around since its founding, but going into 1929 it had never been higher than 7 percent and never lower than 3 percent.

This changed after the stock market crash. On November 1, just a few days after Black Monday and Black Tuesday—when the market dropped almost 13 percent and then almost 12 percent back-to-back—the New York Fed cut its rate by a full percentage point, from the 6 percent it had been charging going into the Crash. Then, over the next few years, the New York Fed periodically cut rates, down to a record low of 1½ percent by May 1931. It held the rate there until October 1931, when it began hiking to stem a gold outflow caused by Great Britain’s abandonment of the gold standard the month before. (Worldwide investors feared the US would follow suit, so they started dumping their dollars while the American gold window was still open.)

So far my story doesn’t sound unusual. “Everybody knows” that the Fed is supposed to slash rates to ease liquidity crunches during a financial panic. It helps to ease the crisis, and provides a softer landing than if the supply of credit were fixed.

But guess what? Throughout the period we are considering, the highest the New York Fed ever charged banks was 7 percent. And the only time it did that was smack dab in the middle of the 1920-1921 depression.

Although you’ve probably never heard of it, this earlier depression was quite severe, with unemployment averaging 11.7 percent in 1921. Fortunately it was over fairly quickly; unemployment was down to 6.7 percent in 1922, and then an incredibly low 2.4 percent by 1923.

After working on these issues for my book, I suddenly saw the obvious answer: The high rates of the 1920-1921 depression had certainly been painful, but they had cleaned the rot out of the structure of production very thoroughly. The US money supply and prices had roughly doubled during World War I, and the record-high discount rate starting in June 1920 was a pressure washer on the malinvestments that had festered during the war boom.

Going into 1923, the capital structure in the United States was a lean, mean, producing machine. In conjunction with Andrew Mellon’s incredible tax cuts, the Roaring ’20s were arguably the most prosperous period in American history. It wasn’t merely that the average person got richer. No, his life changed in the 1920s. Many families acquired electricity and cars for the first time during this decade.

In contrast, during the early 1930s, the Fed’s rate cuts “for some reason” didn’t seem to do the trick. In fact, they sowed the seeds for the worst decade in US economic history.

It’s actually easier to see what’s going on if you forget about a central bank, and just pretend that we were living in the good old days when banks would compete with each other and there was no cartelizing overseer. Now in this environment, when a panic hits and most people realize that they haven’t been saving enough—that they wish they were holding more liquid funds right this moment than their earlier plans had provided them—what should the sellers of liquid funds do?

The answer is obvious: They should raise their prices. The scarcity of liquid funds really has increased after the bubble pops, and their price ought to reflect that new information. People need to know how to change their behavior, after all, and market prices mean something.

But in more modern times, thanks not just to Keynes but more importantly to Milton Friedman, central bankers now think that during a sudden liquidity crunch, they are supposed to shovel their product out the door. But in order to do that, of course, they have to water down its potency. It’s as if a wine dealer suddenly has a rush of customers for a rare vintage of which he has only three bottles, and his response is to put the vintage on sale but then dilute it with 9 parts water to 1 part wine. That way he can sell to all the eager customers and not pick their pockets at the same time.

Let’s try a different example: If the owner of a trucking company experiences a huge rush for his services, he might decide to postpone essential maintenance on his fleet, to take advantage of the unprecedented demand. But during this period he will be charging record shipping prices to make it worth his while to deviate from the normal, “safe” way of running his business. He will only be willing to bear the extra risk (either to the safety of his drivers or just the long-term operation of the trucks) if he is being compensated for it.

The same is true for the banks. Just as every other business during a recession wants to bolster its cash reserves, so too with the business that rents out cash reserves. If there’s a hurricane, the stores selling flashlights and generators should raise the prices on those essential items, to make sure they are rationed correctly. The same is true for liquidity, the moment after the community realizes they are in desperate need of it.

Regards,

Robert P. Murphy

for The Daily Reckoning