The headlines are alarming. The New York Times panicked that Americans are “Running in Debt” and just a few years later warned that Americans were “Borrowing Trouble.” Business Week asked, “Is the Country Swamped with Debt?” and U.S. News and World Report worried that “Never Have So Many Owed So Much.” Harper’s even expressed fear that “Debt Threatens Democracy.”

A labor leader bemoaned the improvidence of America’s consumers: “Has not the middle class its poverty? Very few among them are saving money. Many of them are in debt; and all they can earn for years, is, in many cases, mortgaged to pay such debt.”

An academic report concluded that consumers’ promiscuous borrowing had “lured thousands to ruin,” encouraging people to buy what they could not pay for and making debt “the curse of countless families.” And not merely the poor and improvident were lured into ruin, but upstanding middle-class families as well, as they engaged in a heated rivalry of conspicuous consumption with their neighbors.

An indictment of our times? Not exactly. The first headline from The New York Times and the labor leader’s concerns both date from 1873; the second Times headline ran in 1877. The academic report appeared in 1899 and criticized the availability of installment credit, or the practice of buying consumer goods “on time.” Thorstein Veblen voiced his concerns about “conspicuous consumption,” and Americans’ willingness to go into hock to fund it, in 1899. The Business Week and U.S. News and World Report headlines ran in 1959. And Harper’s fretted that “Debt Threatens Democracy” in 1940.

As these evergreen headlines suggest, four facts of American life appear constant: First, consumer credit is ubiquitous in America; second, at least some Americans have always gotten in over their heads with credit; third, an omnipresent chorus wails that other people are using consumer credit excessively to buy things that they shouldn’t want or can’t afford; and fourth, every era has complained that everybody was thriftier in “the old days,” a mindset that author Lendol Calder has called the “myth of lost financial virtue.” The massive credit-induced bubble in the real estate market over the past decade and the subsequent crash have led to a reprise of these time-tested themes–and a predictable move toward more government regulation.

And, indeed, there was undoubtedly a credit-driven bubble in home prices that has popped with catastrophic effect. But exploding home prices and an expansion of risky real estate lending should be distinguished from trends in consumer credit. Even during the bubble years, over 80 percent of home mortgage debt was for home purchase, home improvements, or other residential real estate, with only about 7.7 percent going for the purchase of goods and services, according to a 2009 report in the Federal Reserve Bulletin. Conventional wisdom holds that this growth in mortgage lending was just part of a larger growth in promiscuous consumer borrowing in recent years. But the reality is more complex and interesting. In fact, nonmortgage consumer lending illustrates an evolutionary trend that reaches back decades, rather than a revolutionary change in recent years.

The story of consumer credit in America is one of relentless competition and innovation as the forces of creative destruction have swept away older forms of consumer credit and replaced them with newer types. Central to this story in the second half of the twentieth century is the rise of credit cards. Many commentators see credit cards as uniquely pernicious innovations that have led to disastrously high levels of consumer indebtedness. To understand why this is not the case, it is essential to look back at the use of consumer credit in America.

Consumer Credit in Early America

In pre-Civil War America, most Americans were farmers living outside major population centers. Gold and silver coins were scarce. Personal credit, however, was not, and farmers relied on credit to smooth investment and consumption across the crop-harvesting season. Credit, as much as the Conestoga wagon, conquered the West.

After the war, a tide of immigrants swept into America and built the great cities. Largely unskilled blue-collar workers with unpredictable employment and income, they relied on the consumer credit industry to cope with those uncertainties. In time the emerging American middle class became homeowners and home furnishers through mortgages and consumer installment credit. Overall, late-nineteenth-century households sought financial assistance from five major credit sources: pawnbrokers, illegal small-loan lenders, retailers, friends and family, and mortgage lenders. In post-Civil War New York City, for instance, two-thirds of the city’s total consumer lending came from small-loan agencies, including loan sharks and “wage assignment” lenders, forerunners to today’s payday lenders. Pawn shops proliferated–in some neighborhoods virtually the entire population had a pawn ticket at all times, holding as many as 12 in the winter when factories typically closed down, Calder writes. These various lenders charged interest rates approaching 300 percent annually–comparable to the rates charged by modern payday lenders–and resorted to embarrassing and aggressive collection practices to enforce repayment of these illegal debts. Counterproductive usury regulations made operations unprofitable for legitimate lenders, former Federal Reserve Chairman Alan Greenspan has pointed out, driving many urban consumers into the hands of illegal lenders. In 1911 an estimated 35 percent of New York City’s employees owed money to illegal loan sharks, a situation Greenspan described as “virtual serfdom.”

The most important source of short-term credit for lower-income Americans, however, has long been friends and family. Even today, a recent survey of households in low- and moderate-income areas of Los Angeles, Chicago, and Washington found that 53 percent of respondents would rely on friends or family if they needed to borrow $500 for three months. Another survey, of low-income women in Boston, found that 93 percent had borrowed money from friends and family in the past, and many had lent money to friends and family as well; 10 percent of those surveyed had borrowed only from friends and family. But friends and family obviously are not a reliable source of credit.

Consumer credit expanded following World War I. Credit unions, small local savings banks, and a national network of licensed consumer finance companies, such as the Beneficial Industrial Loan Corporation and the Household Finance Corporation, provided consumer loans. These installment loans obliged the consumer to repay a fixed sum plus interest over a fixed period in equal installments.

Beginning with Singer sewing machines, installment credit soon spread to furniture, pianos, household appliances, and finally to automobiles. By the 1930s most sales of household furniture, appliances, radios, cameras, and jewelry were credit sales, as were a substantial percentage of sales of rugs, hardware, sporting goods, and books (such as encyclopedias and other book sets). Financing these purchases through credit made it possible to acquire and use the goods immediately, rather than having to save for long periods to afford them. Between 1900 and 1939 total consumer nonmortgage installment debt quadrupled in real dollars, increasing two and a half times during the 1920s alone.

Consumer debt exploded in the 1940s and 1950s during the postwar migration to the suburbs as consumers used credit to buy new cars and to fill their new homes with new furniture and appliances. The ratio of consumer credit to household assets rose from about 1 percent to over 3 percent from 1945 to 1960, where it has hovered ever since.

Today’s concerns about credit cards echo similar paternalistic comments about the spread of installment credit. Installment selling allegedly induced overconsumption by American shoppers, Calder notes, especially among supposedly vulnerable groups such as “the poor, the immigrant, and the allegedly math-impaired female.” By the same token, rapacious installment sellers supposedly extended credit to unworthy borrowers, leading them to purchase unnecessary products and run up overwhelming debts. Department stores were criticized for “actively goad[ing] people into contracting more debt.” Critics called installment selling a “menace” that trapped Americans in “a morass of debt” and the “first step toward national bankruptcy.”

Moreover, although most Americans believed that installment selling was a “good idea” in general and were confident in their own ability to use it responsibly, three out of four also thought that their neighbors used installment credit excessively–a judgment mirrored in modern surveys of consumers about credit card use.

Overall, most Americans use credit cards responsibly. Fewer than half of credit card owners carry a balance, and the median value of revolved balances is about $3,000, with a mean of $7,300. Thus, the typical credit card user carries no balance, and most of those who do carry only a modest one, especially compared to their mortgages, auto loans, or student loans. The fact that some people misuse credit cards–just as some misused installment credit in the past–does not justify reducing access and raising costs for the millions who use their cards responsibly.

Early Credit Cards

The dawn of the age of credit cards was an evolution of this trend, not a break from it. Although department stores, gas companies, and hotels had used crude versions of credit cards even before World War I, the modern age of credit cards began with the introduction of the Diners Club card in 1949. Unlike its predecessors, Diners Club was a third-party card honored by many merchants, and it was Diners Club, not the merchant, that bore the risk of nonpayment. In return for this assured payment and convenience, participating merchants paid a 7 percent fee on each transaction.

But universal third-party cards took off slowly. Retail store credit cards dominated the consumer credit market through the 1970s, primarily because usury laws restricted certain types of consumer lending. Usury regulations generally produce three types of unintended consequences. First, they encourage lenders to “re-price” other terms of their credit contracts to try to offset the inability to charge market rates of interest, such as requiring larger down payments, higher upfront fixed fees or annual fees, shorter grace periods, or myriad other terms. Second, usury regulations lead to product substitution, such as switching to less-preferred types of credit like pawn shops or payday lenders. Third, to the extent re-pricing and switching are not fully possible, some borrowers may be unable to get any legal credit on any terms. All three phenomena appear to have resulted from the usury regulations imposed in the 1970s.

A rapid rise in underlying interest rates in the 1970s, combined with usury caps, made credit card operations unprofitable for banks, so bank-type credit card operations remained modest. Banks avoided some of the restrictions by altering other terms of the cardholder agreement or bundling lending with other services. Banks in states with strict usury regulations restricted their hours of operation, reduced customer service, tied their lending operations to other products and services not restricted in price (such as requiring checking or savings accounts), or imposed higher service charges on demand deposit accounts or checking account overdrafts. Most important, to evade usury regulations credit card issuers imposed annual fees, usually ranging from $30 to $50. (Because this fee was assessed alike on revolvers, who carry balances from month to month, and transactors, who pay in full, it effectively resulted in transactors subsidizing lower interest rates for revolvers.)
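The cross-subsidy that a flat annual fee creates can be sketched with back-of-the-envelope arithmetic. The dollar figures below are hypothetical, chosen only to fall within the $30–$50 fee range of that era:

```python
# Illustrative sketch: a flat annual fee shifts costs from revolvers
# (who carry balances) toward transactors (who pay in full each month).
# All figures are hypothetical, for illustration only.

def effective_annual_cost(avg_balance, apr, annual_fee):
    """Total yearly cost of holding the card: interest plus the flat fee."""
    return avg_balance * apr + annual_fee

# A transactor carries no balance, so the annual fee is the entire cost.
transactor_cost = effective_annual_cost(avg_balance=0, apr=0.12, annual_fee=40)

# A revolver carrying an average $2,000 balance pays interest on top of the fee.
revolver_cost = effective_annual_cost(avg_balance=2000, apr=0.12, annual_fee=40)

print(transactor_cost)  # 40.0  -- paid despite borrowing nothing
print(revolver_cost)    # 280.0 -- $240 interest plus the $40 fee
```

Because the fee raises the transactor’s cost from zero while adding only marginally to the revolver’s, the issuer can set a lower interest rate than it otherwise would–the subsidy described above.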

Issuers adjusted other terms of the credit contract to compensate for the inability to charge a market rate of interest, including adjusting grace periods and using alternate methods for calculating interest charges. Credit card issuers also rationed credit card privileges to only the most creditworthy consumers, forcing others to turn to less-attractive types of credit.

Credit-issuing department stores had an even more effective way of evading usury restrictions: They could simply bury the credit losses in the price of the goods they offered and sell the bundled product. For instance, prices on major appliances, typically sold on credit, were significantly higher in states with the strictest usury caps. Retailers in these states also reduced their services to consumers. Usury laws also provided large retailers with a substantial comparative advantage over smaller competitors who could not afford to establish and maintain their own credit operations.

Credit Cards Today

In 1978 the Supreme Court effectively deregulated interest rates on credit cards by holding that the applicable rate ceilings for nationally chartered banks would be those of the issuing bank’s home state, rather than those of the consumer’s home state (Marquette National Bank v. First of Omaha Service Corp.). The results have been dramatic. In 1970 only 16 percent of American households had a general-purpose bank-type card; today 71 percent do.

By effectively eliminating usury regulations, Marquette eliminated the incentives to engage in term re-pricing. Beginning in the early 1990s, credit card issuers dropped annual fees on standard cards, making pricing more efficient and more consumer-friendly and enabling consumers to hold multiple cards simultaneously. The result has been heated competition, which has brought lower interest rates and a proliferation of card benefits.

Credit cards have grown at the expense of the layaway and installment-purchase plans that were important to sales volume at many retail stores in earlier decades–and, more broadly, at the expense of other unsecured credit products.

While pawn shops, layaway plans, payday lenders, check cashers, personal finance companies, retail store credit, rent-to-own, loan sharks, and friends and family have all served as important sources of consumer credit in American history, those who use these high-priced and inconvenient lending products today do so because they are unable to get credit cards at all or have reached their credit limits.

Beware Well-Intentioned Regulations

As this brief history suggests, falling prices and growing consumer choice over time have defined the dynamic of consumer credit. Consumers today are no longer captives of local banks or pawnbrokers. Instead, they can choose from over 6,000 issuers of credit cards operating in a national market. Instead of being forced to buy their new stereo or television from the local department store just because that is the place that happens also to offer credit, consumers can buy appliances at small boutiques, through a catalogue, or online, and use their general bank card to pay for them.

As a consequence of the general tightening of credit markets over the past year, however, consumers and small businesses have lost some access to the lower costs and more flexible terms of credit cards. According to news reports, the response has been a migration by middle-class borrowers and small businesses toward alternative types of credit, such as pawnshops, layaway plans, and payday lenders. Choking off access to credit card credit will roll back the clock, reviving old forms of credit long thought abandoned.

Historically, though, the greatest threat to the modernization of consumer credit has been the heavy hand of government regulation. Like usury laws, the so-called Credit Cardholders’ Bill of Rights, passed earlier this year, can be expected to have many unintended consequences. For example, it prohibits issuers from raising rates “retroactively” on outstanding credit card balances. This prohibition, however, ignores the fact that, unlike traditional installment credit, a credit card loan amounts to a new loan every month–hence the name “revolving” credit. Consumers, meanwhile, can pay off balances with no prepayment penalty simply by switching to a new, lower-interest card. The result is an asymmetry: consumers can always reduce their interest rate by switching cards, but issuers are prohibited from raising rates when economic conditions change. Issuers will therefore be reluctant to offer low rates on the front end, which will mean less flexibility and higher rates for all consumers.

Once again we’ll see that the Law of Unintended Consequences can’t be repealed.