

Like the loop of Brady Bunch episodes ubiquitously playing across the 500-channel wasteland of cable TV, the decade of the 1970s is a specter haunting the American popular imagination. It is the specter of cheese culture, the jelly-headed amiability of smiley faces, white polyester three-piece suits, and the most recent return of the repressed, Starsky & Hutch, set to a mind-numbing soundtrack of Barry White, Kiss, and “Stairway to Heaven.” Yet the arguably tackiest American decade is also when much of what is termed the postmodern condition began to emerge. The 1970s have been studied from a range of perspectives. Liberals like Boston University historian Bruce J. Schulman assert that the lessons to be learned from the decade ultimately attest to the inalienability of individual rights. Conservatives like former Bush administration speechwriter David Frum point to the need for moral values as the moral of the story. The erstwhile social historian and now bioethicist Francis Fukuyama professes a more academic perspective, suggesting a broader view be brought to bear in understanding how trends leading up to and beyond the period demonstrate the social-scientific notion of the mutability of social forms. But a closer look at the period reveals major economic and social shifts that these readings neglect.
Schulman’s study The Seventies: The Great Shift in American Culture, Society, and Politics (Free Press, 2000) sees the 1970s as the start of a renaissance in American entrepreneurialism and the move to a more libertarian political agenda. These developments were rooted primarily in changes in the nation’s population demographics, following the migrations to the Sunbelt, and a greater emphasis on individual self-reliance in the face of the apparent breakdown of traditional institutions, such as government and the family. 
The nation emerged from the 1970s with an admittedly more conservative orientation; however, the decade saw many progressive political, social, and cultural achievements. For example, women’s rights were brought into the political and social mainstream, as exemplified by Roe v. Wade in 1973. School busing and other affirmative-action initiatives were also undertaken. Martin Scorsese, Francis Ford Coppola, and other Hollywood auteurs made their most aesthetically successful films during the decade. And there was a creative explosion in popular music with punk and new wave challenging corporate rock and disco. In making the transition into the 1980s, Schulman asserts, liberalism did not “sell out,” that is, morph from “yippie” to “yuppie,” so much as pragmatically change with the times while retaining its core values of self-actualization. The generation that came of age in the 1970s used market-based strategies to further their notions of progress, including workplace flexibility made possible by decentralized computing and increased opportunities for self-expression through new forms of “lifestyle” consumption.
Frum, in How We Got Here: The 70’s: The Decade That Brought You Modern Life, For Better or Worse (Basic, 2000), portrays the 1970s as the time when the excesses of the 1960s came home to roost. Vietnam, desegregation, and inflation converged to reveal the limitations of the modern bureaucratic state, which failed in administering public and private life. Among the ill-fated initiatives: a monetary policy that attempted to cover government budget shortfalls by artificially increasing the money supply, sowing the seeds of lingering economic malaise, and political reforms that discouraged work, undermined traditional mores and values, and otherwise rent the nation’s social fabric. 
The problems were made even worse by the self-absorbed myopia of the liberal elite, who turned a blind eye to the impending chaos of social disintegration their ill-conceived ideas had set in motion. The resulting crisis revealed the inadequacy of liberalism and its various social welfare schemes for having taken for granted the security provided by the state in pursuing romantic notions of individual rights and personal liberation. Fortunately, Americans responded to the alarm: “Out of the failure and the trauma of the 1970s,” Frum writes, “they emerged stronger, richer, and—if it is not overdramatic to say so—greater than ever.” As the decade turned, the ostensibly popular revolt against taxation, the move toward deregulation, and the “creative destruction” of business innovation set the stage for the prosperity of the 1980s and 1990s that was to come.
Where Schulman and Frum limit their accounts to America in the 1970s, Fukuyama surveys a larger context and a longer period of time, the industrialized world from the mid-1960s to the late 1990s. In The Great Disruption: Human Nature and the Reconstitution of Social Order (Touchstone-Simon & Schuster, 1999), Fukuyama argues that rising rates of crime, drug use, unwed pregnancies, and other types of social turmoil occurred not only in the United States but in Canada, Western Europe, and Japan as well. For Fukuyama, these statistics were indicative of what he terms “The Great Disruption” in the social order, seismic repercussions of the restructuring from the industrial to the informational mode of organization. The Great Disruption was essentially a manifestation of the dialectic of Gemeinschaft und Gesellschaft, community and society, the process of social evolution that sociologists have long identified as a characteristic of modernization. 
According to Fukuyama, in the move from industrial to informational society, intellectual ability came to be more highly valued than physical capacity, changing the nature of work and resulting in the need to mobilize larger pools of workers, particularly women. This affected men in lower-skill occupations in particular, driving down their average earnings as labor competition increased primarily in non-manufacturing sectors and contributing to the erosion of the traditional male-head-of-household model. Lower fertility rates and higher divorce and single-motherhood rates also reflected increased pressures on the family as women entered the workforce in greater numbers. Higher earnings among women promoted their independence in terms of the traditional marriage contract, further eroding the social norm of male responsibility that easy access to birth control also threatened. At stake in this period of transition was “social capital,” which Fukuyama defines as “a set of informal values or norms shared among members of a group that permits cooperation among them.” The Great Disruption in Fukuyama’s reading was an expenditure of social capital, the start-up cost of a new social contract. The new order was based on the “high-trust” participatory systems of the flattened corporate structures of the informational form of work, which were replacing the “low-trust” models of the Taylorist micromanaged hierarchies of industrial production. The Great Disruption would gradually subside as this new order became increasingly secured at all levels of society. Fukuyama’s statistics show a decline after the mid-1990s in the measurements of disruption, suggesting that the social capital of cooperation was once again starting to accrue by the time his book was published on the eve of the new millennium.
Both Schulman and Frum reference the energy crisis, inflation, and other economic upheavals in American society in the 1970s. 
Frum observes that the rising price of energy was not the primary cause of inflation, but an effect of it and the volatile international currency exchange rates that were its result. Fukuyama does not cite individual economic events; yet his analysis of the period, premised on the stresses caused by world-historical changes in the work-based social order, affirms then Council of Economic Advisers Chairman Alan Greenspan’s statement, made in the mid-1970s, that capitalism was at the time in a state of “crisis.” The crisis was not one of energy or the natural rhythms of Gemeinschaft und Gesellschaft, but of capital accumulation. Simply put, the rate of economic growth of the postwar era in America and elsewhere in the industrialized world could not be sustained under existing structural circumstances. In the 1970s, a number of events took place that helped set this reorganization in motion. Analyzing this process from the vantage point of America, as the first nation of the first world, has value in terms of understanding the mechanisms of transnational capital and its effects on global culture, but it must be done with what Immanuel Wallerstein calls the “modern world-system” always in view. In the transition from industrial to informational society, Fukuyama notes, “many new high-skill jobs were created and many low-skill jobs began to disappear.” But by confining his analysis to the industrialized world, Fukuyama obscures a significant aspect of the Great Disruption: in roughly the period he surveys, the number of “low-skill” manufacturing jobs worldwide actually increased more than 30 percent according to Organisation for Economic Co-operation and Development/World Bank statistics. Starting in the 1970s, manufacturers in advanced sectors of the global economy began to disaggregate their organizational structures, offloading certain functions (primarily manufacturing operations) to lower-cost zones, typically in lesser-developed regions of the world. 
The 1970s were when Nike was officially incorporated, from the very beginning using Asian suppliers to manufacture its products. This is also when the Big Three American automobile companies began to invest heavily in Mexico. Changes on the production side coincided with new demands on the part of consumers. Schulman, Frum, and Fukuyama all note the rising tide of individualism in America and the rest of the industrialized world during the 1970s, a demiurge of self-sufficiency in a society perceived to be in flux. The cultural changes of the 1970s were in one sense a culmination of the critique begun after the Second World War of the “outer-directed” individual, the person whose self-definition was derived from the authority of external social associations, such as corporate affiliation and nationality. (The outer-directed thesis was initially articulated in such 1950s classics as The Lonely Crowd by David Riesman with Nathan Glazer and Reuel Denney, White Collar by C. Wright Mills, and The Organization Man by William H. Whyte.) This rejection of authority took the form of new patterns of consumption, embracing innovative products then coming onto the market. It was expressed in the surge of interest in physical culture and individual bodily health, which manifested itself in regimens of self-discipline for men and women alike. The demand for healthy-lifestyle accessories was satisfied by new athletic apparel companies such as Nike. Another new demand was for natural fibers in clothing, in part a rebellion against the standardized ideals of mass-manufactured modernity and the “top down” forms of bureaucratically administered society they seemed to represent. The market for natural fibers was especially dependent on the reorganization of the cycle of transnational production and consumption and its new economics. 
After the Second World War, man-made fibers such as Dacron, Orlon, and other wash-and-wear fabrics emerged in the consumer marketplace as examples of modern-day convenience made possible by the marvels of technology. In addition to ease of maintenance, synthetic fibers offered value and style, the latter due to the material’s ability to accept and retain high-chroma dyes without fading. Anthropologist Jane Schneider notes that the market penetration of polyester synthetics peaked coincidentally with the first energy crisis in 1973, by which time they had become “commoditized,” i.e., the perceived difference among products in the consumer’s mind was strictly a function of price. Synthetic fiber production is capital intensive, requiring large physical plants for chemical compounding and manufacturing, which conflicted with the crisis of accumulation then under way. Many of the world’s lesser-developed regions are located in warmer climates more suitable for the production of natural fibers, such as cotton, flax, and silk. Natural fiber production does not require expensive facilities other than fertile land. The management of production is easily decentralized under the expanded information and communications technologies available to advanced industrialized nations. Lesser-developed regions also had large pools of low-cost labor awaiting mobilization. The natural fiber clothing that began to appear in the 1970s featured value-added attributes, such as ornamental stitching and form-fitted tailoring, made possible by the labor-intensive handwork of these low-wage workers. The alignment of production with consumer demand can be readily seen in one of the emblematic product innovations of the 1970s, designer jeans. 
Originally created for practical purposes by Levi Strauss & Company in the second half of the nineteenth century, blue-denim jeans were first taken up as fashion items in the 1920s by members of the Santa Fe artists’ colony in New Mexico to express their identity as cultural prospectors of the American Scene. Jeans as anti-fashion emerged after the Second World War to symbolize the rebellious independence of the Beat Generation against the gray-flannel conformity of outer-directed society. In the 1960s, jeans gained broad popularity as quintessentially American, representing a classless society where labor and leisure were equally valued and where material comfort prevailed. The individualistic connotations of blue denim stayed intact in the 1970s, when designer jeans appeared as part of a larger movement in consumption toward what may be termed the democratization of distinction, the birth of what philosopher Jean Baudrillard calls the “commodity-sign.” In addition to the value-added features of other natural fiber clothing, designer jeans carried the premium marker of haute couture branding, but at relatively affordable prices when compared to traditional luxury goods. The rise of designer jeans was facilitated by the introduction in the 1970s of computer-assisted methods, known as “geodemographic clustering systems,” that could compile and analyze large amounts of data on the location of wealth, the distribution of population by race, age, gender, and educational level, and other indicators of purchasing power and patterns of behavior. These techniques presented information at finer levels of detail than had been previously available to purveyors of the mass market. This enabled clothing manufacturers, among others, to identify and respond more quickly to market opportunities, which low-cost, labor-intensive offshore producers could supply without needing otherwise prohibitive capital investment. 
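The core idea behind such geodemographic clustering can be illustrated with a toy sketch. Everything below is hypothetical (real systems of the period drew on dozens of census variables, not two), but it shows the basic operation: group areas with similar demographic profiles so that a marketer can address segments rather than an undifferentiated mass.

```python
# Minimal k-means sketch of the clustering idea behind geodemographic systems.
# The data and the two-feature profile (median income, median age) are
# hypothetical illustrations, not actual census figures.

def kmeans(points, k, iters=20):
    """Plain k-means on 2-D points; returns the final centroids."""
    centroids = points[:k]                      # naive initialization
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest centroid (squared distance)
            i = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2
                                + (p[1] - centroids[c][1]) ** 2)
            groups[i].append(p)
        # recompute each centroid as the mean of its group
        centroids = [
            (sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
            if g else centroids[i]
            for i, g in enumerate(groups)
        ]
    return centroids

# Hypothetical ZIP-code profiles: (median income in $1,000s, median age)
zips = [(18, 24), (21, 26), (19, 23), (62, 41), (58, 44), (65, 39)]
print(kmeans(zips, 2))  # separates a young/low-income from an older/affluent segment
```

Once areas are grouped this way, a jeans maker can, for instance, direct its premium line at the affluent cluster only, which is the finer-grained targeting the paragraph above describes.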
The designer jean consumer was upscale (or at least aspired to be), paying double the average price of a pair of Levi’s. While fewer units were sold, due to the smaller pool of potential buyers, profit margins were far higher than those available in the traditional mass-market model. At the same time that these smaller market segments were being uncovered, the ability to accommodate design changes was enhanced, again due to the economies of low-cost labor engaged in handwork. This served to shorten the fashion cycle, accelerating the process of product obsolescence. During this same period, polyester was pushed down market, relegated to the lower-margin segments of mass consumption. Designer jeans worked alongside the new trademarked running shoes, logo-imprinted t-shirts, and other branded items to help a reconstituted capitalism straddle the globe with renewed vigor. As part of the process, branding became a kind of system of symbolic exchange, a re-enchantment of the world, a mechanism for channeling consumption into new forms. Facilitating the flow between consumer desire and value-added products were other changes that undermined the worldly asceticism traditionally associated with America’s Protestant heritage. In The Cultural Contradictions of Capitalism, first published in 1976, Fukuyama’s mentor Daniel Bell noted a shift in American society that amounted to what he termed the “death of the bourgeois world-view.” Bell argued that through the market operations of consumerism the normative values of the Protestant ethic were being demolished, leaving capitalism without a moral center. While concerned most directly with the effects of this development on culture as a shared system of beliefs (i.e., Fukuyama’s social capital), Bell acknowledged a more fundamental change within the economic system. In the 1970s, the contradictions Bell noted were being resolved, but not by returning to the austere ways of the Puritan. 
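The margin logic at work here can be made concrete with back-of-the-envelope arithmetic. All figures in this sketch are invented for illustration; the point is simply that a smaller, higher-margin segment can out-earn a high-volume, low-margin one.

```python
# Hypothetical numbers showing why a niche, high-margin product can out-earn
# a mass-market one despite selling far fewer units.

def profit(units, price, unit_cost):
    """Total profit = units sold x per-unit margin."""
    return units * (price - unit_cost)

# Mass-market jeans: say $15 retail, $12 cost, one million units (illustrative).
mass = profit(1_000_000, 15.0, 12.0)    # $3 margin per unit

# Designer jeans: double the price, modest extra cost for handwork,
# only a fifth of the volume (all figures hypothetical).
designer = profit(200_000, 30.0, 14.0)  # $16 margin per unit

print(mass, designer)  # the designer segment out-earns despite 5x fewer units
```

The same arithmetic explains why polyester, once “commoditized,” was pushed down market: with price the only differentiator, its per-unit margin had nowhere to go but down.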
The answer to the crisis of capitalism was to destabilize the notion of thrift. Schulman and Frum cite some of the effects during the 1970s of Regulation Q, which since 1934 had given the Federal government authority to set interest rates paid on savings deposits held by banks. Regulation Q was supposed to manage inflation by providing incentives or deterrents to save, depending on market conditions. In inflationary times, the theory held, keeping interest rates low would discourage savings, thereby reducing funds on deposit. This would limit the supply of available credit, in turn reducing consumption and bringing inflation into line. But several financial innovations of the 1970s rendered Regulation Q ineffective. (It was repealed in 1980.) Money-market and mutual funds were introduced, which skirted the interest-rate limitations of Regulation Q and substituted higher risk and the potential for capital gains in place of regulated guaranteed returns on insured deposits. The Visa credit card network was incorporated, providing a universal, standardized mechanism for obtaining and using unsecured revolving credit. The Equal Credit Opportunity Act was passed, giving married women legal access to credit independent of their husbands. The stigma of debt was undercut by relaxation of the bankruptcy laws. Theories also emerged to promote the use of credit as a sound financial strategy. An example can be found in an article, “The Hidden Prosperity of the 1970s,” written by Harvard sociologist Christopher Jencks and published, perhaps not ironically, in The Public Interest in 1984. In order to refute U.S. Census Bureau data suggesting the American standard of living had declined in the 1970s for the first time since the Second World War, Jencks proposed that well-being should be measured by factors other than income. 
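The incentive problem that undid Regulation Q can be seen in a quick real-return calculation. The rates below are illustrative, not historical figures: a deposit rate capped below the inflation rate produces a negative real return, which is precisely what pushed savers toward the new uncapped money-market instruments.

```python
# Illustrative sketch (hypothetical rates, not historical data): a capped
# deposit rate below inflation loses purchasing power every year, while an
# uncapped money-market yield can keep pace -- the arbitrage that made
# Regulation Q ineffective.

def real_return(nominal, inflation):
    """Real return via the Fisher relation: (1 + n) / (1 + i) - 1."""
    return (1 + nominal) / (1 + inflation) - 1

capped_passbook = real_return(0.05, 0.10)  # regulated savings account
money_market = real_return(0.11, 0.10)     # uncapped fund (hypothetical yield)

# The capped saver loses roughly 4.5% of purchasing power per year;
# the money-market holder roughly breaks even.
print(f"{capped_passbook:.4f} {money_market:.4f}")
```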
Using benchmarks such as physical health, housing, transportation, and food, Jencks concluded that the standard of living for the average American actually improved in the 1970s when compared to the 1950s and 1960s. Some of the reasons he cited were increases in longevity, average home size, number of automobiles per hundred persons, and the amount and type of food consumed. Jencks even asserted that due to the effects of appreciation, inflation, and preferential tax treatments, monthly principal and interest payments on home mortgages were in actuality a form of savings. Jencks also reported on the proliferation of appliances and other household possessions in the 1970s, but failed to explain how these goods were acquired in a period of falling real incomes. In contrast to the Great Depression of the 1930s, consumer demand during the troubled 1970s did in fact remain at relatively high levels and actually increased over the decade according to government statistics. Besides the subsidized payments of the welfare state, this activity was sustained by two factors: the rise in household incomes due to the entry of more women into the workforce and the growth of consumer credit exclusive of home mortgages. U.S. Census data from the period show that the number of working women doubled, although their representation among the working class (in occupations such as clerical, retail, and non-production manufacturing) outpaced their representation in the managerial segment by nearly thirteen times. Also during the 1970s, installment debt increased annually at double-digit rates. Higher debt ratios were managed with lower down payments and extended repayment terms, which increased the cost of goods purchased with unsecured credit. According to the feminist scholar Jane S. 
Smith, the impetus to consume was further fed by the transformation of American life, from the suburban migration and the need for automobiles and other durables of the family economy to ideological changes that excused men from their traditional roles, leaving large numbers of female-headed households in their wake. In his 1976 New York magazine article “The ‘Me’ Decade,” Tom Wolfe asserted that postwar prosperity had made possible on a broad social level the self-idealization that had come to characterize the period, all but rendering the term “proletarian” obsolete. “Many millions of middling folk,” as he called them, could afford the luxuries of material pleasure and self-determination previously only available to the well-to-do. In retrospect, Wolfe was decidedly behind the curve. (In fairness, Schulman, Frum, and Fukuyama also missed the broader currents, even with the benefit of decades of hindsight.) The global economy was increasingly coming to operate on a model of first-world consumption and third-world production. Two-income households became an ever more common way to manage family financial obligations. Consumer credit exploded. The so-called American Dream was increasingly being purchased with the promise of future earnings. Thus the position of average workers did indeed change enough to bring their proletarian status into question. Instead, a new and improved form of indentured servitude appears to have been introduced. It is with all of this in mind that the question of the postmodern may now be entertained. In Farewell to an Idea: Episodes from a History of Modernism, art historian T. J. Clark writes about modernism’s two overarching desires: “It wanted its audience to be led toward a recognition of the social reality of the sign [. . 
.]; but equally it dreamed of turning the sign back to a bedrock of World/Nature/Sensation/Subjectivity which the to and fro of capitalism had all but destroyed.” The first desire was to have been realized by understanding that all our representations of the world were cultural constructions, that “we,” that is, humanity, individually and collectively, were ultimately responsible for conceiving the social, historical, political, and economic reality in which we lived. The second desire would be resolved by being able to see beyond the fetish of the commodity (which increasingly penetrated every nook and cranny of our lives) to the finite and necessary conditions of being-in-the-world. It was essentially the opposition of avant-garde and kitsch, the resistance to the commodification of all aspects of existence under the logic of capital versus the acquiescence to it. For Clark, the abandonment of socialism, which coincided with the abandonment of modernism, put the realization of these desires out of reach for the time being at least. Postmodernism embraces the notion that the experienced world is itself “culture all the way down,” as ethnographer Clifford Geertz expresses it, abandoning hope that there might be another, more authentic world to get back to. In Marxist terminology, if modernism was concerned to expose the base upon which capitalism is erected, then postmodernism would appear happy enough to reside within its superstructure. It embraces the fetish of the image, the mystical force of special effects first introduced in the latter half of the 1970s by the likes of George Lucas and Steven Spielberg. From this perspective, postmodern culture can be seen as the most recent expression of the power relations of the global capitalist system atop which America currently sits. The postmodern condition is akin to the graphical user interface (GUI) of a personal computer. 
Like the desktop metaphor of virtual file folders, documents, and wastebaskets, it is a layer of representations hiding the machine’s inner workings from view. It is the logical outcome of capitalism’s perpetual flexibility, the world-historical process of all that was once solid melting into air. “Solid modernity,” as sociologist Zygmunt Bauman terms the social system based on the mass-manufacturing economy, was liquidated in the 1970s. The postmodern turn that unfolded over the decade tracked the shift from vertically integrated assembly line to outsourced sweatshop, production to consumption, commodity to brand, equity to debt, concrete to spectacle. In this light, it makes sense that America’s latest and possibly lamest situation comedy of manifest destiny, a misbegotten, horrific pastiche of All in the Family, M*A*S*H, and Fantasy Island, should have been scripted by the cast and crew brought back to prime time from the Nixon-Ford Administration. It’s yet another rerun of that ’70s show.
Vince Carducci spent the American Bicentennial unemployed. He is waiting out the current economic downturn by working on a Ph.D. in sociology at the New School for Social Research in New York City.

