Life After Debt

Over the past few weeks, the world’s public debt crisis, simmering for months, has come to the boil. When the problems were confined to small countries such as Greece and Ireland, it was assumed that any fallout could be contained. Now, however, the crisis has threatened to engulf nearly everyone. The high-wire confrontation over the debt ceiling in the U.S. Congress raised the prospect of a default by the world’s biggest borrower. At the same time, the markets turned their attention to Italy, the eurozone’s third-largest economy and the world’s fourth-biggest debtor, threatening to raise its borrowing costs to unaffordable levels, or even to cut off its access to funds. The two crises differed in many ways — not least in that America’s borrowing costs fell while Italy’s rose — but the outcomes were similar in one respect: both countries have enacted plans to sharply cut their budget deficits.

How different it seems from two years ago. In the wake of the 2008 financial crisis, the ideas of John Maynard Keynes, the early-20th-century British economist who came to fame during the Great Depression, reigned supreme. It was almost universally accepted that his prescription of massive doses of deficit spending constituted the only possible cure for the global economic collapse. But although large-scale government stimulus programs averted economic catastrophe, apparently justifying Keynes’s theories, it now seems that the Keynesian plan to rescue the global economy is being left half-finished. His ideas are being abandoned even though unemployment remains far above pre-crisis levels and the economic recovery is stalling. Keynesian economists and politicians may describe their austerity-minded opponents as turkeys voting for Christmas, but they appear to be losing the battle.

Is this just a moment of collective folly, a wilful blindness to the lessons of the past? To Keynesians, after all, the historical record is clear. Misguided attempts to balance the budget in the wake of the 1929 crash turned a nasty recession into the Great Depression. It was only when the government started to run a substantial deficit from 1932 onwards that the slump abated and the economy recovered, aided by President Franklin D. Roosevelt’s devaluation of the dollar in 1933. But the recovery was aborted while unemployment was still high, as a result of the premature withdrawal of fiscal and monetary stimulus in 1937. A sharp and unnecessary second recession followed in 1938, and full employment was only restored by the massive additional stimulus provided by war spending, after which Keynesian economic doctrines produced a period of almost uninterrupted growth that lasted until the 1970s.

The recession of 1938 is a pivotal event in this historical narrative, because it seems to parallel so closely the present situation. By attempting to balance the budget before the economic recovery is fully established, the West risks a double-dip recession, just as occurred in the 1930s.

Yet this story is not quite as simple as Keynesians would like to think — and the events of 1938 are not the only historical example that can be brought to bear on current events. If Keynesians can point to the impact of wartime spending on the economy, austerity advocates can point to the retreat from it, after both world wars. In 1918 and 1945, both the United States and Britain found themselves with very high public debts and economies that had been artificially boosted during the war as a result of deficit spending and loose monetary policies. Their average budget deficit in the last year of war was 25 percent of gross domestic product (GDP). Yet within two years after the end of the wars, both countries had returned not just to sustainable levels of deficit, but to surplus. This was a far greater level of fiscal tightening than anything contemplated nowadays, and it was achieved exclusively through spending reductions.

The outcomes of these post-war retrenchments are instructive. In three out of the four cases of British and American post-war adjustment, the economies initially shrank, but then started a period of strong and sustained growth with low unemployment. (The exception is Britain after World War I, which entered a decade-long economic depression in many ways as severe as America’s in the 1930s. The difference here is in monetary policy: While the United States countered post-war inflation with interest rate hikes that brought prices back to 1919 levels but no lower, Britain made a concerted attempt to deflate prices to pre-war levels so as to get back onto the gold standard at the old parity. In other words, it attempted an "internal devaluation" like the one now being prescribed for the uncompetitive peripheral countries of the eurozone — and the result was disastrous.)

The post-war experience appears to offer some comfort for America and Britain — if not for the eurozone. It seems that even extreme fiscal contractions can be pursued without long-term harm as long as monetary policy is kept easy and deflation is avoided. After the wars there was an inevitable period of difficult adjustment as the economy underwent a change in focus, reducing its dependence on military spending. But once that adjustment was endured, economies rebounded rapidly. This was all the more remarkable because, between them, the Allies accounted for close to half of the world’s GDP, so there was no hope of exporting to some "consumer of last resort."

But there are two big reasons to think austerity will not work in 2011 the way it did in 1919 and 1945. The first is political. In the aftermath of the world wars there was a near universal acceptance of the need to dismantle the wartime economy, even if it involved short-term costs. After 1945, military spending was replaced as the largest drain on public resources by welfare spending. Unlike the warfare state, the welfare state has established deep roots and myriad stakeholders; there is little consensus about whether, let alone how, to cut it back.

The second reason is economic: The world’s debt portfolio today looks almost nothing like it did then. Public debt may have risen dramatically during both wars, but at the same time private sector debt shrank — not least because private borrowers were excluded from the capital markets for the duration of the war. The effect was most pronounced in World War II, during which private-sector debt shrank from 130 percent of GDP in 1940 to a 20th-century low of 65 percent in 1945. That decline completely offset the rise of public debt from 60 percent to 125 percent of GDP over the same period. With such a low level of private debt, it was scarcely surprising that the post-war economy was primed for a recovery, as pent-up demand for consumer goods provided a ready market for industries retooling from war production.

But private-sector debt is so immense today that it is almost inconceivable that such a benign economic outcome will unfold in the face of fiscal austerity. By 2007, private debt in the United States had reached an astonishing 300 percent of GDP — two to three times what it was before the Great Depression and from the 1950s to the 1980s.

The Keynesian prescription for the current economic crisis entails the government keeping the economy afloat until the painful process of deleveraging is accomplished, and consumers — their debt obligations reduced — can once again take their place as the engine of growth. If this means that government debt rises, that is fine, as long as the rise is offset by a comparable fall in private sector debt. So far, this has occurred in the current crisis much as it did during the world wars: Public sector debt has risen by 30 percent of GDP since the end of 2007, while private sector debt has fallen by a similar amount.

This process should be allowed to continue. After all, the federal debt held by the public is currently 65 percent of GDP — far lower than 1945 levels. But is such an escalation of public debt safe, or even possible? Private sector debt still stands at a near-record 270 percent of GDP. How far does it need to fall to be considered stable? No one knows. We have been living through, and are now probably witnessing the end of, an era with no historical parallels: what might be described as the "great debt experiment."

There have been three ingredients in this experiment. The first was the introduction of Keynesian economics itself. Before that time, there were only two accepted reasons for public borrowing: financing wars and, starting in the 19th century, financing infrastructure projects, in particular railways. These kinds of deficit spending could be variously justified as a necessary response to a national emergency or as a good investment. Keynesian economics not only increased the amount of public debt over time, but also introduced a completely new rationale for running deficits that eroded the traditional disapproval of borrowing to finance pure consumption. In the new jargon, public deficits were not really borrowing at all, but merely an internal accounting procedure to boost purchasing power.

The heyday of Keynesian economics came to an end in the stagflation of the 1970s. But curiously, budget deficits actually grew after Keynesianism fell from favor — not only in the United States, but throughout the Western world. The explanation lies partly in a covert acceptance of deficit spending even by governments nominally hostile to Keynesian doctrine, but also in part in the increasing pressures on public spending created by the second ingredient in the great debt experiment: unfunded long-term financial promises to voters.

The post-war era witnessed not only the triumph of Keynesian economics, but also the establishment of public pensions throughout the Western world. Almost all these pension plans were set up on a pay-as-you-go basis that provided high rates of return to the first generation of pensioners (which, perhaps not coincidentally, was the generation that voted them into existence) at the cost of an unfunded commitment to later generations. Public pension plans are the biggest element in the off-balance-sheet obligations of states, which also include unfunded health-insurance liabilities and the 2008 guarantees to the banking system. In most countries these "implicit" public debts dwarf their traditional obligations traded in the bond market. In the United States, the total long-term commitments for Social Security, public sector pensions, and Medicare have been estimated at over 300 percent of GDP on the basis of current policies.

The third ingredient in this intoxicating debt brew has been the extraordinary rise in private-sector borrowing since the 1980s. This growth was made possible by the interaction of a number of widely held beliefs. The first was the notion that everyone should have access to credit, partly in the interests of social justice and partly in the interests of general economic prosperity. Such credit could be made available because of the belief that, properly structured, the debts of traditionally uncreditworthy borrowers were as sound as anyone else’s. This idea was part of a broader conviction that advances in financial technique were increasing the amount of debt that the economy could safely sustain. A new generation of mathematically trained bankers — quants — was dreaming up instruments such as structured credit vehicles and complex derivatives that contained and dispersed risk. And a new generation of wise central bankers had learned the lessons of the deflationary 1930s and the inflationary 1970s and was now able to guide the economy into the "great moderation" of low and stable inflation and interest rates.

By the early 2000s, it seemed that the solution for almost every problem in the Western world, personal as well as macroeconomic, was to borrow. Not anymore. The message received from both markets and voters in Europe and in America is that the era of ever-higher debt is over.

Indeed, one of the most striking aspects of the eurozone crisis is that bond markets have not discriminated between causes of excessive debt. Greece was denied credit and had to go begging to Brussels for a bailout, not because it had taken part in the real estate bubble but because it had abused its entry into the eurozone to enjoy a public borrowing spree. Ireland was denied credit because, even though its public finances were in solid shape, it had allowed its banks to overwhelm them. Italy is perhaps the most remarkable case of all. It is now threatened with loss of credit, not because of any post-euro borrowing, nor because of its current budget deficit (which is not much higher than Germany’s). Rather, it is being punished for sins committed in the 1980s and early 1990s, when it built up its public debt to levels that the markets have suddenly decided are unsustainable. What we are seeing, in other words, is a wholesale revision of the rules about debt that have held true for decades.

The markets have highlighted a fundamental shortcoming in Keynes’s ideas: He assumed that governments would always be able to borrow. If they cannot, then Keynesian economics is dead in the water. In the European periphery, the markets have preempted the austerity debate by refusing to lend. But even where markets have not forced the issue, voters have been taking matters into their own hands. Germany may have enacted stimulus measures in 2008, but it followed that by adopting a stringent deficit reduction program in 2009, including a balanced budget amendment to the constitution. Britain’s Conservatives, who had been wandering in the political wilderness for more than a decade, seemed out of the mainstream when they proposed cutting the deficit in 2009. The following year, the party was voted into power amid a debate that changed the political climate so dramatically that all parties were soon proposing relatively similar austerity programs. The United States, the last holdout of full-throated deficit spending under President Barack Obama, lurched sharply rightward in the 2010 congressional election. Now, as in Britain, what originally seemed like a minority opinion in favor of fiscal austerity has become accepted policy, with congressional Republicans bent on slashing the budget at any cost and the White House’s Keynesian voices now drowned out by the administration’s chorus of deficit hawks.

It is not clear that the bond markets or voters who have called an end to Keynesianism have a clear vision of what lies ahead. The former know only that they are no longer willing to lend, the latter that they are no longer willing to borrow. The great debt experiment has left the Western world with a problem that has no easy solution. The outcome is hard to predict, but in the end it is likely to involve the reduction of both private and public debts to levels that the markets consider sustainable — whether by debt write-offs or through inflation. One thing seems clear: For the first time in decades, borrowing will not form part of the solution.