
This text is an excerpt from It’s Not Over: Learning From the Socialist Experiment, published in February 2016 by Zero Books. It’s Not Over is a study of the 20th century’s attempts to go beyond capitalism in order to understand today’s world and find a better world tomorrow.

Financial bubbles followed by crashes have long been a part of capitalism. To note some of the better-known examples: “tulip mania” consumed the Dutch in the 1630s, speculation fueled by the first futures contracts; uncontrolled speculation in the 1710s in the English South Sea Company and the French Company of the Indies led to the collapse of stock in both, a bubble in which short selling was born; an 1830s bubble in US real estate burst when banks suspended specie payments; and an 1870s bubble inflated by speculation in railroads and construction in North America and Europe burst when the Vienna stock market crashed, followed by waves of bank failures.

Bubbles occur when financial speculation becomes more profitable than manufacturing products. The amount of money that flows upward to the rich becomes larger than they can use for personal luxury consumption or investment in their businesses; these torrents of money are diverted into increasingly risky pure speculation. Too much money comes to chase too few assets, rapidly bidding up prices until there is no possible revenue stream that can sustain the price of assets bought at inflated levels.

An economic collapse on the scale of what occurred in 2008 is the culmination of a long sequence of events, and this one has roots in the early 1970s, when neoliberal policies began to be adopted. Keynesianism, the belief that capitalism is unstable and requires government intervention in the economy when private enterprise is unable to spend enough to lift it out of a slump, had been in the ascendancy among economists since the 1930s. Sustained organized unrest during that decade caused governments such as the Roosevelt administration to fear the possibility of revolution and, in response, massively increase social spending to dampen that unrest. The unprecedented government spending required to win World War II pulled the West out of the Great Depression, and the United States government continued spending to rebuild Europe and Japan through the Marshall Plan, successfully expanding markets for US products.

The Keynesian compromise was not necessarily what capitalists would have wanted; it was a pragmatic decision — profits could be maintained through expansion of markets and social peace bought. This equilibrium, however, could only be temporary because the new financial center of capitalism, the US, possessed a towering economic dominance following World War II that could not last. When markets can’t be expanded at a rate sufficiently robust to maintain or increase profit margins, capitalists cease tolerating paying increased wages.

Ironically, just as the conservative Richard Nixon was declaring “We are all Keynesians now,” the tide turned against that school of economic thought. The rebound of Western Europe and Japan eroded US manufacturing dominance, squeezing corporate profits and intensifying competition; US manufacturers responded by moving production overseas. The steady loss of well-paying jobs became a hammer held above the heads of working people, and industrialists found it easier to squeeze their employees.

By the early 1970s, the Nixon administration believed that the Bretton Woods monetary system put in place during World War II no longer sufficiently advantaged the United States, even though the dollar’s centrality within that system cemented US economic suzerainty. Under Bretton Woods, the value of the US dollar was fixed to the price of gold, and the value of all other currencies was pegged to the dollar. Any government could exchange the dollars it held in reserve for US Treasury Department gold on demand. Rising world supplies of dollars and domestic inflation depressed the dollar’s value, leaving the Treasury price of gold artificially low and making the exchange of dollars for gold at the fixed price an excellent deal for other governments. Rather than adjust the value of the dollar, the Nixon administration in 1971 pulled the dollar off the gold standard by refusing to continue exchanging foreign-held dollars for gold on demand. Currencies would now float against each other on markets, their values set by speculators rather than by governments, leaving all but the strongest countries highly vulnerable to financial pressure.

The world’s oil-producing states dramatically raised oil prices in 1973. The Nixon administration eliminated US capital controls a year later, encouraged oil producers to park their new glut of dollars in US banks and adopted policies to encourage the banks to lend those deposited dollars to the South. Restrictions limiting cross-border movements of capital were opposed by multinational corporations that had moved production overseas, by speculators in the new currency-exchange markets that blossomed with the breakdown of Bretton Woods and by neoliberal ideologues, creating decisive momentum within the US for the elimination of capital controls. Private banks quickly became the center of international finance in place of central banks, leading to international dominance of the US and British financial systems and US financial institutions.

Margaret Thatcher in Britain and Ronald Reagan in the US ascended to office determined to tighten this domination, a project that would require both deregulation and lower standards of living for working people. It is no accident that the Thatcher administration’s first move upon taking office in 1979 was to eliminate British capital controls, further stimulating financial speculation; it later maneuvered to break the miners’ union, striking a decisive blow against working people’s ability to defend themselves. Similarly, when Reagan took office at the start of 1981, he deregulated US banking and broke the air traffic controllers’ national strike. (The ability of Thatcher and Reagan to break those strikes was strongly augmented by the lack of solidarity from workers in other industries, whose reward for their silence would be to come under attack themselves, further eroding living standards.)

Although Thatcherites provided ideological ballast for Reaganites, it would be the far larger US, and Reagan administration policy, that would be decisive. The Reagan administration severely tightened monetary policy to squeeze out inflation; gave huge tax cuts to the rich, thereby providing a correspondingly large boost to the financial industry (because the windfalls of the rich would inevitably be put to use in speculation); and pursued a policy of a highly valued dollar. The extraordinarily high interest rates offered by US banks attracted foreign capital, financing the Reagan administration’s deficit spending and military buildup; in turn the US applied pressure on other countries to loosen their capital controls to enable this flow of funds into the US. At the same time, oil was paid for in dollars internationally; a combination of high oil prices and a highly valued dollar triggered the debt crisis of the 1980s. Latin American payments to service debt increased from less than a third of the value of the region’s exports in 1977 to almost two-thirds in 1982, a graphic illustration of the grip of finance capital.

Mid-twentieth-century Keynesianism depended on an industrial base and market expansion. A repeat of history isn’t possible because the industrial base of the advanced capitalist countries has been hollowed out, transferred to low-wage developing countries, and there is almost no place remaining to which to expand. Moreover, capitalists who are saved by Keynesian spending programs amass enough power to later impose their preferred neoliberal policies. A vicious circle arises: Persistent unemployment and depressed wages in developed countries, and the inadequate purchasing power of underpaid workers in developing countries, lead to continuing under-consumption, creating pressure for still lower wages from capitalists who can’t sell what they produce and seek to cut costs further because they have no incentive to invest in new production.

With no apparent way out of ongoing stagnation, the governments of the world’s advanced capitalist countries have had no answer other than to prop up the financialization that led to the collapse. That is also a measure of how powerful the financial industry has become: post-collapse government spending has mostly gone to bailing out financiers rather than to investment.

Financial companies, having extracted immense sums of bailout money, have leveraged their power to become even bigger through consolidation, thereby enabling them to divert more capital from productive use. But even during the “boom” portion of business cycles financiers are destructive to an economy by rewarding manufacturers for mass layoffs, moving production to low-wage developing countries with few or no effective labor laws, and setting up subsidiaries overseas and using creative accounting to shift profits offshore to avoid paying taxes. Financiers provide rewards for such behavior in the form of rising stock prices, and those stock prices in turn provide top executives a rationale to give themselves stratospheric pay packages because they “enhanced shareholder value.”

Underlying this ideology is the fact that industrialists want to stop paying pensions and financiers want larger pools of money with which to play. This process creates the illusion that everybody’s interest is tied to the stock market, even when a company’s stock has risen in price precisely because its employees have just lost their jobs. These illusions can be pushed to the point where employees are said to be in charge of their company. In reality, the dynamics of capital accumulation continue unperturbed even when formal “ownership” is handed over to so-called “employee stock-ownership plans.”

The Tribune Company provides an excellent example of the hollowness of capitalist “ownership society” ideology, of finance capital’s “efficiency” in extracting money, and of how financiers and industrialists work together to enrich themselves at the expense of working people. One of the largest media conglomerates in the United States, Tribune was bought by a real estate tycoon in December 2007 for more than $8 billion, but he put up only $315 million of his own money. The purchase was financed by using the company’s employee stock-ownership plan to buy all of Tribune’s stock. That did not mean the supposed new “owners” actually controlled anything.

“That made the employees the titular owners of the company, but they had no say in the matter and have no control over its management,” a New York Times report said. “They were promised the possibility of gaining access to their shares and cashing them in several years in the future, but in a bankruptcy, share values are often wiped out.” The company’s debt was tripled to finance this scheme, causing it to file for bankruptcy within a year. Holders of common stock — in this case, the stock plan — are wiped out in a bankruptcy because they are, legally, the owners of the company. To pay off the debt resulting from the sale to the tycoon, Tribune’s management cut back on its pension contributions.

Who benefited from this deal? The tycoon structured the deal so that he would be in line with the company’s creditors for repayment, despite the fact that he used other people’s money to buy the company. Two major investment banks were paid $72 million for “advising” Tribune’s board of directors, which voted to sell the company to the tycoon and authorized the raid on the employee stock plan. The same two investment banks were among those sharing another $47 million in fees for lending the employee stock-ownership plan money that the tycoon used to finance the purchase. A third investment bank was paid $10 million for issuing an “opinion” that the deal was “fair.” Finally, the Tribune chief executive was paid $41 million to give his approval of the sale, after which he left the company.
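As a quick sanity check, the payouts named above can be tallied in a few lines of Python. This is purely illustrative arithmetic; every figure comes from the text (in millions of dollars), and the round $8 billion purchase price is a conservative rounding of “more than $8 billion.”

```python
# Payouts named in the Tribune deal, in millions of US dollars (from the text).
fees = {
    "advisory fees to two investment banks": 72,
    "lending fees shared by investment banks": 47,
    "fairness opinion from a third bank": 10,
    "payment to the CEO for approving the sale": 41,
}
total = sum(fees.values())
print(f"Total fees and payouts: ${total} million")  # $170 million

# The tycoon's own stake versus the purchase price:
own_money = 315        # millions, per the text
purchase_price = 8000  # millions; "more than $8 billion", rounded down
print(f"Own money as share of purchase: {own_money / purchase_price:.1%}")  # about 3.9%
```

In other words, the banks and the departing chief executive together took out more than half as much in fees as the tycoon put in as equity.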

In the first two years of the tycoon’s control, thousands of employees were laid off and the bankruptcy announcement was delivered with a statement to employees that “all ongoing severance payments … have been discontinued.” Tribune’s management then decided it should pay its 700 top managers $66 million in “retention” bonuses, $9.3 million of which would go to a mere 23 executives — these bonuses would consume one-sixth of the company’s operating cash flow.
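The bonus figures above imply a couple of numbers worth spelling out. The following back-of-the-envelope sketch uses only the figures given in the text; the implied cash-flow number is a simple inference from the stated one-sixth proportion, not a reported figure.

```python
# Back-of-the-envelope arithmetic on the retention bonuses (millions of dollars).
total_bonuses = 66.0  # paid to 700 top managers, per the text
top_slice = 9.3       # share going to just 23 executives
executives = 23

# Average bonus for each of the top 23 executives:
avg_top = top_slice / executives  # roughly $0.40 million each

# If $66 million was one-sixth of operating cash flow, the implied
# cash flow is six times that amount:
implied_cash_flow = total_bonuses * 6  # $396 million
print(f"Average top-executive bonus: ${avg_top * 1_000_000:,.0f}")
print(f"Implied operating cash flow: ${implied_cash_flow:.0f} million")
```

So while severance payments to laid-off employees were discontinued, each of the top 23 executives stood to receive roughly $400,000 from a company in bankruptcy.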

It is very difficult to imagine these would have been the results if the employees actually owned this company.
