A half-hour east of Seattle, not far from the headquarters of Microsoft, Amazon, and other icons of the digital revolution, reSTART, a rehab center for Internet addicts, reveals some of the downsides of that revolution. Most of the clients here are trying to quit online gaming, an obsession that has left their careers, relationships, and health in shambles. For the outsider, the addiction can be incomprehensible. But listening to the patients’ stories, the appeal comes sharply into focus. In a living room overlooking the lawn, 29-year-old Brett Walker talks about his time in World of Warcraft, a popular online role-playing game in which participants become warriors in a steampunk medieval world. For four years, even as his real life collapsed, Walker enjoyed a near-perfect online existence, with unlimited power and status akin to that of a Mafia boss crossed with a rock star. “I could do whatever I wanted, go where I wanted,” Walker tells me with a mixture of pride and self-mockery. “The world was my oyster.”

Walker appreciates the irony. His endless hours as an online superhero left him physically weak, financially destitute, and so socially isolated he could barely hold a face-to-face conversation. There may also have been deeper effects. Studies suggest that heavy online gaming alters brain structures involved in decision making and self-control, much as drug and alcohol use do. Emotional development can be delayed or derailed, leaving the player with a sense of self that is incomplete, fragile, and socially disengaged—more id than superego. Or as Hilarie Cash, reSTART cofounder and an expert in online addiction, tells me, “We end up being controlled by our impulses.”

Which, for gaming addicts, means being even more susceptible to the complex charms of the online world. Gaming companies want to keep players playing as long as possible—the more you play, the more likely you’ll upgrade to the next version. To this end, game designers have created sophisticated data feedback systems that keep players on an upgrade treadmill. As Walker and his peers battle their way through their virtual worlds, the data they generate are captured and used to make subsequent game iterations even more “immersive,” which means players play more, and generate still more data, which inform even more immersive iterations, and so on. World of Warcraft releases periodic patches featuring new weapons and skills that players must have if they want to keep their godlike powers, which they always do. The result is a perpetual-motion machine, driven by companies’ hunger for revenues, but also by players’ insatiable appetite for self-aggrandizement. Until the day he quit, Walker never once declined the chance to “level up,” but instead consumed each new increment of power as soon as it was offered—even as it sapped his power in real life.

On the surface, stories of people like Brett Walker may not seem relevant to those of us who don’t spend our days waging virtual war. But these digital narratives center on a dilemma that every citizen in postindustrial society will eventually confront: how to cope with a consumer culture almost too good at giving us what we want. I don’t just mean the way smartphones and search engines and Netflix and Amazon anticipate our preferences. I mean how the entire edifice of the consumer economy, digital and actual, has reoriented itself around our own agendas, self-images, and inner fantasies. In North America and the United Kingdom, and to a lesser degree in Europe and Japan, it is now entirely normal to demand a personally customized life. We fine-tune our moods with pharmaceuticals and Spotify. We craft our meals around our allergies and ideologies. We can choose a vehicle to express our hipness or hostility. We can move to a neighborhood that matches our social values, find a news outlet that mirrors our politics, and create a social network that “likes” everything we say or post. With each transaction and upgrade, each choice and click, life moves closer to us, and the world becomes our world.

And yet … the world we’re busily refashioning in our own image has some serious problems. Certainly, our march from one level of gratification to the next has imposed huge costs—most recently in a credit binge that nearly sank the global economy. But the issue here isn’t only one of overindulgence or a wayward consumer culture. Even as the economy slowly recovers, many people still feel out of balance and unsteady. It’s as if the quest for constant, seamless self-expression has become so deeply embedded that, according to social scientists like Robert Putnam, it is undermining the essential structures of everyday life. In everything from relationships to politics to business, the emerging norms and expectations of our self-centered culture are making it steadily harder to behave in thoughtful, civic, social ways. We struggle to make lasting commitments. We’re uncomfortable with people or ideas that don’t relate directly and immediately to us. Empathy weakens, and with it, our confidence in the idea, essential to a working democracy, that we have anything in common.

Our unease isn’t new, exactly. In the 1970s, social critics such as Daniel Bell, Christopher Lasch, and Tom Wolfe warned that our growing self-absorption was starving the idealism and aspirations of the postwar era. The “logic of individualism,” argued Lasch in his 1978 polemic, The Culture of Narcissism, had transformed everyday life into a brutal social competition for affirmation that was sapping our days of meaning and joy. Yet even these pessimists had no idea how self-centered mainstream culture would become. Nor could they have imagined the degree to which the selfish reflexes of the individual would become the template for an entire society. Under the escalating drive for quick, efficient “returns,” our whole socioeconomic system is adopting an almost childlike impulsiveness, wholly obsessed with short-term gain and narrow self-interest and increasingly oblivious to long-term consequences.

This new impulsiveness is most obvious in the business world, where an increasingly fanatical and self-justifying emphasis on quarterly earnings, share price, and executive bonuses has led to a pattern of self-serving, high-risk strategies. This “short-termism,” as economists call it, helped bring down financial markets in 2008—and it continues to destabilize the economy and the job market and undercut the future of the middle class. French economist Thomas Piketty blames rising inequality on capital’s natural tendency to replicate faster than the overall economy. But the more immediate culprit may be the institutionalization of an economic model so focused on quick, self-serving rewards, and so inured to long-term social costs, that it is destroying the economic foundations on which real prosperity depends.

This industrial-scale impulsiveness isn’t confined to the business world. The media, academia, nonprofits, and think tanks—the very institutions that once helped counter the individual pursuit of quick, self-serving rewards—are themselves obsessed with the same rewards. Most troubling, our political institutions, once capable of mobilizing resources and people to win wars, solve problems, and drive real progress, now settle for rapid wins while avoiding complex, perennial challenges, such as education reform, climate change, or preventing the next financial meltdown. The worst recession in three quarters of a century should have led us to rethink an economic model based on automatic upgrades and short-term gains. Instead, we’ve continued to focus our economic energies, entrepreneurial talents, and innovation on getting the biggest returns in the shortest time possible. Worse, we’ve done so even though fewer and fewer of us can afford to keep up with the Sisyphean pursuit of ever-faster gratification—a frustration expressed in the angry populism now paralyzing our politics. From top to bottom, we are becoming a society ruled by impulse, by the reflexive reach for quick rewards. We are becoming an Impulse Society.

To truly understand our predicament, we must realize that this crisis is a consequence not of our failures but of our extraordinary successes. Over the past century, and especially the past four decades, we have created a sophisticated, self-feeding socioeconomic system that is marvelously efficient at catering to our desires. Even as the economy has split between the haves and the have-nots, the miracles of cost-reducing business strategies and powerful personal technologies mean that all but the poorest among us have nonetheless gained an extraordinary measure of self-gratifying power. Indeed, in many respects, our economic system now indulges our desires with such speed and efficiency and personalized precision that it’s getting harder to know where we stop and the market begins. I don’t merely mean that clever marketers have gotten inside our heads or that our smartphones now feel like body parts—although both are true. I mean that our preferences, attitudes, and identities have become so intertwined with the offerings of the marketplace that we have internalized many of the market’s values and reflexes—particularly the market’s relentless drive for ever greater, ever faster, more efficient returns. Put another way, the marketplace and the self, our economy and our psychology, are fusing in ways we’ve never before experienced.

If we could step back a century, before the rise of the consumer economy, we would be struck not only by the lack of affluence and technology but also by the distance between people and the economy, by the separation of economic and emotional life. People back then weren’t any less wrapped up in economic activities. The difference lay in where most of that activity took place. A century ago, economic activity occurred primarily in the physical world of production. People made things: they farmed, crafted, cobbled, nailed, baked, brined, brewed. They created tangible goods and services whose value could be determined, as often as not, by the measurable needs and requirements of their physical, external lives.

That relationship changed with the rise of the consumer economy. Sophisticated, large-scale industrial systems assumed the task of making many of the things we needed, and also began to focus on the things we wanted. As the consumer economy matured, an ever-larger share of economic activity came from discretionary consumption, driven not by need but by desire, and thus by the intangible criteria of people’s inner worlds: their aspirations and hopes, identities and secret cravings, anxieties and ennui. As these inner worlds came to play a larger role in the economy—and, in particular, as companies’ profits and workers’ wages came to depend increasingly on the gratification of ephemeral (but conveniently endless) appetites—the entire marketplace became more attuned to the mechanics of the self. Bit by bit, product by product, the marketplace drew closer to the self.

For most of the 20th century, this merger proceeded at a gradual pace. But starting in the 1970s, the convergence was kicked into overdrive by two powerful shocks. The first was the collapse of America’s postwar economic boom in the face of high oil prices, inflation, and rising foreign competition. As corporate profits fell, it was clear that many U.S. firms had grown too complacent and inefficient to prosper in a faster, more global economy. With company shares trading at historic lows, activist investors launched an economic coup. They bought struggling companies, broke them up, and sold the pieces, often for substantial profit. As takeover fever spread (encouraged by the parallel deregulatory fever then sweeping Washington), even healthy companies embraced defensive strategies to boost profits and share prices and keep investors happy. Companies laid off workers and began moving operations overseas. As important, they began paying their executives in company stock, thereby ensuring that those executives would do whatever was necessary to keep share prices high. Corporate strategy and investor desire were now effectively fused.

The “shareholder revolution,” as Wall Street dubbed it without irony, was a shock to the nation’s business psyche. For more than half a century, corporate America, heavily pressured by labor and an openly interventionist government, had hewed to a paternalistic capitalism, under which a large share of profits was reinvested in everything from worker training to community charities. But those times were over, for according to many conservative economists, it was partly such misplaced corporate generosity that had weakened U.S. companies in the first place. For these critics, the only way American companies could help society was, paradoxically, to jettison the older notion that business had any separate, social obligations other than making maximum profit. As Milton Friedman, an icon of conservative economics, argued in a seminal New York Times essay, “There is one and only one social responsibility of business—to use its resources and engage in activities designed to increase its profits.” And the best way for government to help this happen was to turn companies loose, by cutting taxes and regulations, and thereby allow the efficiencies of the marketplace to find the most direct route back to wealth.

The shareholder revolution would have upended American society in any case. But its effects were magnified by a second shock—a potent new technology, the microprocessor, which made computing vastly more powerful and much cheaper. By the 1980s, computer speed was doubling, and computer costs were halving, every two years—a trend, known as Moore’s Law, that quickly transformed every sector of the consumer economy. Business processes, from design to marketing, could now be supercharged and accelerated. (In Detroit, for instance, the time needed to bring a new car from drawing board to showroom fell from four years to 18 months.) By cutting the time between investment and profit, computers gave business a potent new tool to generate the faster returns that Wall Street was demanding—but also to deliver the gratifications consumers were now coming to expect.

The steep curve of Moore’s Law perfectly reflected the accelerating pace of the merger of market and self. Computers helped businesses not only cut costs (from 1970 to 1990, consumer prices fell 26 percent, in real terms) but also produce and distribute a much larger variety of goods. For instance, whereas a 1950s-era supermarket might have sold 3,000 different products, or “stock-keeping units” (SKUs), by 1990 the number had risen to more than 30,000. In everything from cars and clothing to interior decoration and music, consumers could choose from among a nearly infinite menu of products and services to craft a consumption experience that perfectly suited their individual tastes.

Customization was becoming not just a consumer choice but an approach to life. By the 1990s, many Americans were moving to cities and neighborhoods for the same reasons they might have used to pick the cars they drove or the clothes they wore. As Bill Bishop, author of The Big Sort, observes, whereas we once moved to be near work or family, we were increasingly relocating “for a whole set of ‘life-style’ reasons,” such as cultural amenities, proximity to shopping malls and sports stadiums, and, especially, politics. “People have become very calculating,” Bishop told me. “In ways our parents never even thought of, people are parsing the choice of where to live as if they were going through a menu at a restaurant.”

Even if our faltering incomes present an obstacle to personalization, the new economy has offered fixes here, too. As early as the 1980s, digital technologies meant banks and other companies could not only approve credit in minutes, instead of days and weeks, but also more easily sell loans to Wall Street investors and use the proceeds to make still more loans. As the supply of credit has risen, banks have grown much more creative, and aggressive, in marketing loans for home improvement, college education, vacations, boats, debt consolidation, even cosmetic surgery. These days, it’s hard to think of a product, a service, a lifestyle choice, or even an identity that can’t be financed. Whatever our income level and aspiration, a consumer proposition lets us pursue experiences that deliver the biggest emotional or aspirational “return on investment”—a level of personal efficiency that we have embraced as avidly as the corporate world embraced computers and financial engineering.

But this new efficiency has had serious downsides—not least in the mismatch between the self-gratifying power available to consumers and consumers’ ability to manage it all. Humans, it’s safe to say, were not designed for a world of such easy gratification. Decades of research suggest that our brains, adapted for a prehistoric world of scarce resources and infrequent opportunities, are wired to prioritize immediate rewards and costs and to disregard rewards and costs that occur in the future. This natural bias against the future, so essential for our ancestors, is an Achilles’ heel in a modern economy built around immediate pleasure and deferred pain. Nearly every consumer proposition today, from credit to fast food and entertainment to social media and online shopping, capitalizes on our anti-future bias: in all cases, we’re provided immediate pleasure, while any costs, whether financial, physical, or emotional, are deferred so seamlessly that they vanish from our perception. It’s almost as if the evolving marketplace, once a force for personal discipline and deferred gratification (think of the Protestant work ethic), has flipped. Now, we’re urged to focus only on the present moment, and on maximizing pleasure and minimizing pain in that moment. The notion of future consequences, so essential to our development as functional citizens, as adults, is relegated to the background, inviting us to remain in a state of permanent childhood.

Consider the rising number of consumer products that encourage gratification at someone else’s expense: the mobile technologies that encourage us to violate social norms—taking calls in theaters or posting videos of others’ misfortunes. Or the products that cater to our inner bully: the stereo subwoofers marketed for their capacity to deafen the entire neighborhood and the retina-searing high beams engineered, according to the ads, to fit “your aggressive driving style” (that is, to blind oncoming drivers). Or, for that matter, the SUVs that are not simply massive and overpowered but explicitly designed for an aggressive appearance, with tank-like panels and front ends modeled on predatory animals. Industry critics have argued that such features encourage SUV drivers to drive faster, causing more accidents and, thanks to the extra mass, wreaking more damage on other drivers. Yet this liability is part of the vehicles’ appeal—and part of automakers’ efforts to reach what Clotaire Rapaille, a marketing specialist who worked closely with Detroit, calls “the reptilian brain,” or the set of ancient neural programs that seek to maximize each individual’s survival and reproduction. The reptilian brain doesn’t care about the safety of other motorists. Rather, to the reptilian brain, every other motorist is a potential adversary. As Rapaille told journalist Keith Bradsher in a moment of appalling candor, “The reptilian [mind] says, ‘If there’s a crash, I want the other guy to die.’ ”

The SUV is an extreme example of the way the market encourages the pursuit of narrow, short-term self-interest. But it aptly illustrates both the aggressive character of our consumer culture and the defensiveness and even paranoia that emerge in a world of all-too-easy gratification. For to personalize is to reject the world “as is” and instead insist on bending it to our preferences, as if dominance were our only mode. But humans aren’t meant only for dominance. We’re also meant to adapt to something larger. Our big brains are specialized for cooperation and for compromise—with other individuals and with the broader world, which for most of history did not cater to our preferences or “likes.” Daily survival depended on our ancestors’ capacity to conform themselves and their expectations to the world as they found it. It was only by enduring adversity, disappointment, and delayed gratification that humans gained the strength, knowledge, and perspective that are essential to sustainable mastery. Many traditional cultures regarded adversity as inseparable from, and essential to, the formation of strong, self-sufficient people. Yet the modern conception of character now leaves little space for discomfort or real adversity. To the contrary, consumer culture in the Impulse Society does everything in its power to convince us that difficulty has no place in our lives (or belongs only in discrete, self-enhancing moments, such as really hard ab workouts). Discomfort, anxiety, suffering, depression, rejection, delays, uncertainty, or ambiguity—in the Impulse Society, these aren’t opportunities to mature and toughen. Instead, they represent errors and inefficiencies, and thus opportunities for correction—nearly always with more consumption and self-expression.

So rather than wait a few days for a package, we have it overnighted. Or we pay for same-day service. And as the system gets faster at gratifying our desires, we’re less and less likely even to consider the possibility that we might find deeper satisfaction by enduring a delay or some other challenge to our personalized existence. The efficient consumer market cannot abide delay or adversity or, by extension, the strength of character that might be cultivated by delay or adversity. To the efficient market, character is itself an inefficiency to be squeezed from the system. Once some new increment of self-gratifying or self-promoting capability is made available—a faster phone, more powerful car, quicker delivery service—the assumption of the consumer culture is that it must be put to use, whatever the consequences. The intensity of our self-centeredness is now being determined not by conscious decision but by the market.

Such market-driven narcissism is a chilling confirmation of how the convergence of market and self has accelerated recently, as technology, ideology, and economics have supercharged the economy. That was one of the stories behind the Great Recession, which was the logical conclusion to the union of self and market—and demonstrated how this corrupting pattern of aggressive, narrow self-interest has jumped, like a virus, from the individual to our fundamental economic institutions.

This corruption is most evident in the corporate emphasis on short-term profits and share price. Because share price is heavily influenced by a company’s quarterly earnings, today’s supremely motivated managers (who now receive more than half their compensation in shares) go to ever-greater lengths to boost “quarterlies”—even to the point of hurting their companies’ long-term prospects. That short-termism played a part in the rising incidence of accounting fraud: from 1992 to 2005, the number of U.S. firms issuing earnings “restatements”—essentially, admissions that previously reported earnings were bogus—jumped from six a year to nearly 1,200 a year.

More damaging, however, are the entirely legal strategies to boost share price. As business learned long ago, one of the fastest ways to lift quarterly earnings is to cut costs, and since labor is always the largest cost, employees have borne much of the pain of the shareholder revolution and the rush for capital efficiency. Where company managers in the postwar era followed what economist William Lazonick calls a “retain-and-reinvest” strategy—plowing a large share of corporate profits back into the company via new facilities and higher wages—their post-1970s counterparts have hewed to a strategy of “downsize-and-distribute.” That is, today’s managers cut everything possible and pass along the savings to shareholders (and themselves) in the form of higher dividends and faster stock price appreciation. And if such a harsh strategy began as a legitimate effort to squeeze out postwar corporate waste and inefficiency, it has since become the standard way for executives to show the stock market that they will do whatever is necessary to keep share prices high—even if it means upending the economy. Efficiency, says Lazonick, has become a socially acceptable rationalization for giving senior managers “totally free rein to get rid of the labor force and make other changes that [previously] would have been politically difficult—and get paid handsomely for doing it.”

Lazonick’s point underscores the economic paradox at the heart of the Impulse Society. The historic drive for better efficiencies—more output at a lower cost—has been inseparable from human progress. But efficiency has also become an ideology, imposing serious social costs that are evident in everything from income inequality to the dysfunction of our politics to the narcissism of our popular culture and private lives. Even as our “returns to capital” soar, our social returns are falling toward zero, and with them our prospects for a stable middle class or long-term prosperity.

What emerges is a massive and accelerating feedback loop, in which narrow self-interest, corporate or individual, undermines prosperity and fosters economic and social insecurity, which then encourages even more narrowly self-serving behavior. Day by day, there seem to be fewer reasons to follow the rules or think beyond oneself or the present moment. Not so long ago, we told our children that success required sustained effort, a willingness to delay gratification, and the capacity to control impulses. Children today, however, see their patient, hard-working parents and grandparents tossed aside like old furniture—while investment bankers and reality TV stars seem to easily make huge amounts of money. Little wonder that cheating is now endemic in high school and college. Or that college and high school kids now routinely film themselves in extraordinarily compromising situations in the hopes of converting millions of “views” into megabucks. Ask a 20-year-old how to get rich, says Keith Campbell, a psychologist and expert on the subject of narcissism, and you’ll get three answers: “‘I can either be famous on reality TV, or I can go start a dot-com company and sell it to Google in about a week, or I can go work for Goldman Sachs and just steal money from old people.’ I mean, those are the three models of wealth. There just isn’t a good model of hard work getting you somewhere anymore.”

In an ideal world, our social institutions temper our myopic, narrowly self-serving reflexes. That has been the script for most of human history. Society’s taboos, customs, and institutional arrangements, such as family, marriage, capitalism, and democracy, have discouraged impulsiveness in favor of long-term commitment or investment. But today, these institutional bulwarks are under the same myopic spell as individuals are. Bottom-line capitalism squeezes out socially responsible commerce as inefficient. Community and family are undermined by our consumer culture of individual gratification. Worse, our political system, the traditional arbiter between public and private interests, has been colonized by the same bottom-line impulse. Political parties boil their philosophies down into extreme brands designed to provoke target audiences and score quick wins. Voters are encouraged to see politics as another venue for personalized consumption. We’ve lost the idea that politics is the means to build consensus and an opportunity to participate in something larger than ourselves.

We know the result: a national political culture more divided and dysfunctional than any in living memory. All but gone are centrist statesmen capable of bipartisan compromise. A democracy once capable of ambitious, historic ventures can barely keep government open and seems powerless to deal with challenges like debt reduction or immigration, which Washington should be grappling with but isn’t.

Where do we go from here? How do we revive an economy, a culture, and a collective future with people and institutions seemingly locked in the pursuit of ever-narrower self-interest?

Paradoxically, it is in our dysfunctional political realm where we can glimpse a way forward. Consider the unlikely promise of younger Americans, who for all their impulsiveness and reputed disengagement from formal politics show signs of coming alive as a political force. Surveys find that although Millennials don’t vote nearly as regularly as their older peers, they are more actively involved in other ways (such as volunteering) and are far less enamored of their elders’ extreme politics. It isn’t just the young who are tired of our current political atmosphere. After years of partisan gridlock, we may be witnessing the emergence of a new middle. Recent surveys show large numbers of voters, Republican, Democratic, and independent, who agree on a range of issues—from abortion rights and background gun checks to the minimum wage and the separation of church and state. Although these new centrists hardly vote as one, notes Kathleen Parker, a right-of-center columnist at The Washington Post, they “share a disdain for ideological purity.” And if our political leaders are not ready to repudiate rigid ideology, they are painfully aware of voters’ growing impatience. Since the 2012 election, when Tea Party extremists dragged down the Republican Party and then shut down the government, many mainstream Republicans have become desperate to move the party’s brand back toward the political center.

In the aftermath of the 2013 government shutdown, it became possible to see another kind of politics. After months of partisan brinksmanship, lawmakers were forced to step back, however briefly, from the short-term playbook and craft some small but crucial compromise funding bills. No one expected the peace to last—the discord machine was merely idling before the 2014 midterms. (Just ask Eric Cantor.) But it was enough to show that what political players need, first and foremost, is a way to make space between themselves and reflexive politics—space to think, to deliberate, to choose a course of action rather than having one dictated by the momentum and algorithms of impulse politics.

In that brief moment, Washington, capital city of the Impulse Society, was showing the rest of us how that society might be disarmed. On a purely political level, it was possible to see how lawmakers might, by demonstrating even modest bipartisanship, begin to chip away at the cynicism that now encourages self-centered behavior among voters. One could imagine how even a single political success story—real campaign finance reform, say, or meaningful financial regulation—might inspire voters and energize a broader reform movement, both inside and outside politics, to replace our craving for immediate self-gratification with the more deliberate, community-minded attitude that so many of us say we want.

And the shutdown’s aftermath offered another, more fundamental lesson: how important it is to stand up to the forces of the Impulse Society—to push back against the momentum and values of a bottom-line culture so that a more deliberate, human set of values can again flourish.

A revolt against the values of the Impulse Society is already under way. In almost any community, you can find people working to separate themselves from a system that puts efficiency ahead of other values. It might be the household down the street that unplugs its smartphones and social networks to reclaim some family time. It might be the Wall Street trader who quits to become a math teacher. Or the political junkie who swears off Fox or Daily Kos because the media echo chamber is destroying his or her belief in democracy. And it’s people like Brett Walker, setting themselves free from the digital underworld. This uprising against the Impulse Society may be undeclared, but it’s happening anywhere people have recognized that they are losing something essential and irreplaceable.

Yes, such efforts have thus far been halting, disparate, and disorganized, thwarted by inertia and a fear of change. We don’t want to lose the rewards the Impulse Society lavishes on us—the steady upgrades from one level of gratification to the next. And, certainly, we fear trying something different in an economy that has become punitive—a wariness that afflicts our entire culture, from the lone shopworker afraid to leave a soul-crushing job to CEOs unwilling to offend their shareholders. Some of this is simple pragmatism, but there is also a large element of capitulation: many of us have grudgingly accepted that everything happening today is the logical, inevitable outcome of an efficient socioeconomic evolution that, by definition, is producing the maximum possible good. It’s as if we have convinced ourselves that the Impulse Society is what social progress looks like now.

But such a conclusion is manifestly and demonstrably false. Other possible economic outcomes exist, with different social and cultural consequences. We could point to alternative models in northern Europe and parts of Asia—Germany, say, or Singapore—where societies have different expectations for their economic systems, and less tolerance for the excesses and indignities Americans seem to regard as unavoidable. For that matter, Americans need look no further than our own history to see how a people can choose to make the economy produce more of the essential outputs—transportation infrastructure, say, or energy research or education—that are necessary for individuals and for society as a whole. Conservatives often dismiss such alternative economic scenarios, whether from abroad or from our own past, as case studies in liberal overreach and unwarranted government intrusion into the marketplace—and these complaints are not entirely unfounded. “Commanding” the economy has risks. But the basic argument—that it is possible and therefore necessary to take steps to produce more sustainable, equitable, humane economic outcomes—is neither flawed nor particularly liberal.

From the beginnings of the Industrial Revolution, it was understood that “commercial society,” as Adam Smith called capitalism, needed constant prodding to ensure that its massive efficiencies benefited as wide a public as possible. As Smith wrote in The Wealth of Nations, “No society can surely be flourishing and happy, of which the far greater part of the members are poor and miserable.” Today, conservatives routinely invoke Smith and his invisible hand to argue for unfettered markets. But Smith recognized that markets needed occasional fettering: he favored, among other things, a progressive tax on the wealthy and, especially, hefty regulation of finance to prevent the consolidation of economic power in the hands of the few. Such regulatory intrusions, Smith readily acknowledged, were “in some respect a violation of natural liberty” of bankers and others with economic power. But this measured reduction in the economic power of the few was essential if a nation truly wanted to protect “the security of the whole society.” As Dutch economic essayist Thomas Wells puts it, “Commercial society was for Smith an ethical project whose greatest potential benefits had to be struggled for.” The success of that project, Wells notes, “was not predetermined, but had to be worked for.” The question is, toward what end? What is the “output” we hope to achieve from a post-Impulse economy, and how might we begin to get there?

Since the end of the Cold War, it has been unthinkable to call for any alternative to capitalism, or even to imagine that such an alternative might exist. But shouldn’t we at least retain the prerogative to choose the sort of capitalism we want? Or to demand that our capitalism produce things of real value and be capable of sustaining a society that is equitable and deliberate? We are rightly skeptical of the heavy-handed, top-down government style in China, Brazil, India, and Indonesia. But these societies, at the very least, have tried to make their economies take them in specific directions, as opposed to simply following the ideology of efficiency. More fundamentally, they have defined economic success and wealth in social terms. One needn’t agree with those terms to recognize that our terms—the way we measure progress in much of the First World—are no longer sustainable. We badly need a new measure for economic success that goes beyond earnings per share.

What are we waiting for? Brett Walker, the former digital junkie, saved himself by going clean—by breaking away from the continuous gratifications of the gaming economy long enough to discover that he was happier without that degree of self-indulgence. Perhaps we should take note. Like Walker, our society has let the expectation of self-serving gratification drive us into a social and economic crisis so deep we haven’t yet recovered. But unlike Walker, we still haven’t broken free. To the contrary, our solution has been to revive the same self-centered economy that caused us such grief in the first place, and will only do so again.

But we needn’t stick to that plan. Were we serious about interrupting our self-driven downward spiral, we would start by recognizing the limits—social and personal as well as economic—of an ideology that prioritizes immediate gratification and efficient returns over all other values. We’ve made these sorts of shifts before. Not too long ago, our society took on large, complicated problems, such as economic depression and racial injustice, which required collective discipline, deferred gratification, and long-term commitment. We can certainly do so again. In some respects, the challenge we face today is more difficult. But the alternative is no longer an option.