On the college campus where I have been living, the students dress in a style I do not understand. Continuous with what we wore fifteen years ago and subtly different, it is both hipster and not. American Apparel has filed for bankruptcy, but in cities and towns across the US the styles forged a decade ago at the epicenters of bohemia still filter out. Urban Outfitters is going strong. In Zürich, on the banks of the Limmat, elaborate tattoos cover the bodies of the children of Swiss bounty. The French use Brooklyn as a metonym for hip. In this context, in such saturation, hipster can no longer stand for anything, except perhaps the attempt or ambition to look cool. But since coolness venerates its own repudiation most of all, every considered choice bears hipster’s trace. Hipster is everything and nothing—and so it is nothing.

Yet even before hipster petered out, confusion dogged its meaning. Starting in 2009, Mark Greif and his colleagues at n+1 undertook the most serious attempt to date to understand and situate the hipster in context. This realized itself in essays and panel discussions and ultimately a book, What Was the Hipster? Admirable as these efforts were—and Greif’s essay of the same name remains the high-water mark in hipster criticism—something elusive always troubled the boundaries of the concept. As Rob Horning wrote for PopMatters after one such panel, “The participants never really made much of an effort to establish a stable definition of what a hipster is,” a failure that may reflect the impossibility of the task.

Still, if hipster eludes strict definition, one can nonetheless diagnose the confusion that vexed its discussion and, in so doing, back one’s way into an understanding of the phenomenon. The problem always arose in the incongruity between the use of the term and the reality of the type. The word meant to describe the figure, of course, but since the word always carried a pejorative connotation—since those recognized as hipsters would never so self-designate—no one could ever achieve clarity on what, if anything, made up hipster’s authentic core. The term registered inauthenticity. But did it describe latecomers and poseurs, second-wave adopters who appropriated an authentic style (in which case first-wave hipsters might employ it themselves as a term of abuse), or was it always an outgroup epithet for something viewed as exclusionary and pretentious (in which case first-wave hipsters were its object)? This uncertainty repeated itself in a second ambiguity: Did hipster begin as an authentic style, later co-opted by outsiders, or was it always at heart a style of co-optation, as many have argued (tracing its appropriative sweep to punk, queer, skater, hip-hop, and working-class fashions)?

Unpacking the discrepancies between the history of the term and the history of the type sheds some light on these confusions. It also drives at deeper questions about what separates a subculture from a style, and what role a subculture plays in the culture writ large. In its hazy career from alternative lifestyle to disembodied fashion to commodified adjunct in the dominant stylistic wash, the subculture discloses the mechanism of cultural appropriation in action. Its status as critique may come down to the distinction Ben Davis draws, in a chapter on “hipster aesthetics” in 9.5 Theses on Art and Class, between “a norm that is passively inhabited” and a “territory to be claimed.” Like the artists who moved to SoHo in the ’70s or the downtown scene that occupied the Lower East Side in the ’80s, the hipster rose up, in the ’90s, around a claim on new urban territory. But whether the scope of this territory was merely geographic, or if it indeed possessed a spiritual dimension, remains the open question that seems largely to decide the politics of the figure for critics.

Regardless of where one lands on the question of style versus substance, the death and life of the great American hipster offers an alternative history of culture over the last quarter century, one that, for the notable failure of any movement or style to supplant hipsterism, helps to explain the stagnation we find ourselves mired in today.

Genealogy of the Term

The word hip, like its sibling hep, has no known origin. Controversy attends its most attractive etymology: that it derives from hipi (also transliterated xipi or xippi) in the West African language Wolof, meaning to “open one’s eyes” or “be aware.” Lexicographers dispute this derivation, whose chief virtue rests in the relative mediocrity of other explanations, most of which involve the body part: that one lies on one’s hip while smoking opium, that one carries a flask by one’s hip. Hip remains an example of lexical polygenesis: a word that has multiple proposed etymologies and may even have multiple, confluent origins.

The earliest attestations of hip and hep come from 1902–04. Common citations include a 1903 article in the Cincinnati Enquirer for hep and George V. Hobart’s 1904 novel Jim Hickey: A Story of One-Night Stands for hip. It took until the 1930s for the term hepcat to emerge as jazz-scene argot, followed by hipcat and ultimately hipster, which gained prominence in the ’40s, displacing hepcat and other variants. During the ’40s, the approbatory term gathered a critical edge. What began as an evocation of jazz-milieu cool had come, by the time of Anatole Broyard’s 1948 essay in Partisan Review, “A Portrait of the Hipster,” to describe a black figure in Greenwich Village, possessed of distinctive mores and slang and admired by his disaffected white counterparts. Bohemians and jazz-club habitués, per Broyard (himself a mixed-race émigré from New Orleans “passing” as white in New York), idolized the hipster’s insurgent energy and cast him as a demotic visionary: “the great instinctual man, an ambassador from the Id.”

Broyard’s central insight concerned what he called “a priorism”: the sense embedded in the hipster ethos of a proprietary knowledge, neither taught nor received, to which the subgroup alone laid claim. This knowledge realized itself in jive, hipster slang; it meant “knowing the score.” By 1948, Broyard saw decay in the spirited movement. Its spontaneous and subversive character had corroded to a hollow formalism “more rigid than the institutions it had set out to defy.” It had become “moribundly self-conscious,” “a boring routine.” Like any efflorescence of hip, by the time critics and writers awoke to it, it looked tame and assimilated, the province of latecomers and imitators.

The growing significance of jazz for an urban white milieu changed the resonance of hipster in the 1950s. Norman Mailer taxonomized this strain of late adopters in his 1957 essay for Dissent, “The White Negro: Superficial Reflections on the Hipster,” contending that in the face of nuclear annihilation and European genocide a segment of white youth had cut ties with society, turned its back on its roots, and struck out “into the rebellious imperatives of the self.” The goal for this new breed of “American existentialist” was to live wholly in the orgiastic present tense that jazz music, and some vague idea of intuitive black wisdom, embodied.

Mailer revised Broyard’s “a priorism”: It was now white Americans, alienated from their culture, who found a surrogate ethos in the black culture’s response to a kindred, if deeper, alienation. Broyard’s black hipster had adapted to a mainstream culture where he didn’t fit by refashioning his placelessness as a standalone idiom. Mailer’s white hipster took the constant proximity of “instant death” as an excuse, even an imperative, to refuse to sublimate or compromise his desires, adopting instead a spirit of permission and immediacy he associated with the wisdom of marginalization. Like Broyard, Mailer saw the origins of this figure in places like the Village, where black, bohemian, and delinquent elements mixed.

To what extent Mailer’s “hipster” was simply a “beat” isn’t clear. The beat poet Allen Ginsberg invokes “angelheaded hipsters” near the beginning of “Howl,” written during 1954–55, but Mailer’s essay never uses the word beat. By the late 1950s, when hipster achieved its greatest cultural penetration, hippie (or hippy) started appearing as a diminutive or derogatory variant. Hippie shows up in articles and songs from the early 1960s, but its association with the specific countercultural type—the flower child—does not solidify until around 1967. Before that, one can’t rigorously distinguish hippie from hipster or beatnik, and the first writing to use hippie in its modern sense—articles on Haight-Ashbury by Michael Fallon that ran in the San Francisco Examiner in late 1965—employed “beatniks,” “hippies,” and “heads” interchangeably.

This history matters because it emphasizes the semantic crux of hipster, which like hippie always worked as an outsider designation. Those within the group did not self-reflexively adopt the term, except perhaps ironically. This dictated an overwhelmingly negative usage. To call someone a hipster or hippie meant to dismiss or deride that person, and so everything the term evoked—not just individuals, but the paraphernalia and fashion by which such individuals were classified—took on a negative cast. What fell under the “hipster” umbrella was ipso facto inauthentic, lame.

The idiom of hip bifurcated in the 1980s, first attaching to the burgeoning hip-hop movement in the South Bronx. Hipster, which begins its slow resurgent ascent in the second half of the ’90s, peaking in 2004, appears to represent a different, distinctly pejorative spur. Articles on the revitalization of Brooklyn’s Williamsburg from 2000, in the New York Times and Time Out New York, describe “bohemians” and “arty East Village types,” but neither, tellingly, uses hipster. Just three years later, when Robert Lanham’s The Hipster Handbook appears, the term has found its way into widespread use and some consensus on its meaning has emerged.

Genealogy of the Type

The figure of the hipster may well be an example of polygenesis too. No coherent origin story has emerged, and as with any significant current in culture and fashion, multiple tributaries appear to flow together. One could, for instance, envision hip-hop and punk influences in the Lower East Side; Hispanic and skater culture in East L.A.; an Americana element in South Austin; queer and surfer aesthetics in the Bay Area; and suburban irony in East Portland and Capitol Hill, Seattle. Take each of these inflections and weave them together, as people migrate and mass media bring news of the latest styles, and one can imagine a composite fashion as liberally appropriative as hipster emerging in the late ’90s from several decades of subcultural style preceding it.

This story leaves out much nuance, but—obvious though this may be—it reminds us that the style and type precede the rehabilitated term. A new fashion or subgroup necessarily exists before the culture gives it a name and, in so doing, fixes it in the mind as something that can be thought about and discussed. The picture always gets more complicated after the name emerges, since the name introduces a meta layer—the understanding of the thing—which overlaps imperfectly with the thing itself and inaugurates a secondary discourse around authenticity. To name a thing is not necessarily to kill it, but to spark a never-ending tussle between the reality and the concept.

The ur-hipster—the turn-of-the-millennium character outfitted in aviator glasses, “wife beater” undershirt, and trucker hat—looked like your typical ironic urban scrounger at the moment when ’70s and ’80s “white trash” leftovers dominated thrift and vintage stores. The birth of hipster has always been indistinguishable from the advent of contemporary gentrification. As Greif notes, hipsterism marked the turning of a tide when, after a period of white flight to the suburbs, the children of those who had left returned to low-rent (but attractively situated) city neighborhoods that hung on as minority and working-class enclaves. For “mysterious reasons to the participants,” writes Greif, the trappings of ’70s suburban whiteness “suddenly seemed cool for an urban setting.”

But one might probe more deeply whether nostalgia in fact lay behind the aesthetic and whether the new logic of cool truly mystified its exponents. You can certainly argue that hipsters resurrected the iconography of their childhoods out of a disaffected nostalgia, ironic rather than romantic, but the ultimate catholic reach of their stylistic foraging places a certain weight on opportunism. If the style drew force from retro and referential gestures, what hipsters chose to curate their lives with may simply have reflected what broader society had cast off: literally, what showed up in secondhand shops and family storage. Fifties-inspired nerd chic had little to do with the ’70s-porn look, after all, although both fed into hipsterism. One side of the style evoked the grainy, sepia-tone aesthetic of the Beastie Boys’ 1994 video for “Sabotage”; another entirely showed up in Weezer’s “Buddy Holly,” also from 1994, whose music video used footage from Happy Days, a ’70s sitcom set in the ’50s.

In such cultural artifacts one sees the first stirrings of the new hipster. By 1996, in recordings such as Beck’s Odelay (and the video for “Where It’s At”) and Wes Anderson’s film Bottle Rocket, an element of kitschy Americana had joined the mix. (Richard Linklater’s 1993 film Dazed and Confused evinced affection for the ’70s filtered through the prism of the ’90s South Austin flâneur.) The ground for hipster was effectively laid.

But artifacts of this era also remind us of a time before hipster. Kevin Smith’s 1994 film Clerks presents characters who several years later would only make sense as hipsters, but in Smith’s movie, as in Nick Hornby’s novel High Fidelity (1995), the bored slacker shopkeepers possess traits of frumpy, chatty pseudo-intellectualism not just different from but opposed to those of the hipster. Ryan Schreiber, a figure who seemed to leap straight from the imaginations of Smith and Hornby, founded Pitchfork Media in 1996 while clerking at a record store and living with his parents in the suburbs of Minneapolis. According to a history of Pitchfork written by Michael Gillis for Newcity Music, Schreiber, just out of high school at the time, registered Pitchfork using money from his record store job. His initiation into contemporary music had begun several years earlier, as a teenager watching MTV, after which he discovered the city’s alternative papers and underground music press, what Gillis calls the “legendary fanzines for which Minneapolis is known.”

More than any direct nostalgic recollection of the ’50s, ’60s, and ’70s, the new hipster owed an unrecognized debt of inspiration to MTV and the music video, whose heyday and emergence as a genre coincided more precisely with the childhood and adolescence of those who would become hipsters. Just as rap sampled and collaged the music that had come before it, the music video relished referencing and reworking the tropes of visual culture (especially TV and movies) from the preceding decades. Since the music video couldn’t deviate from its song, it had to rely on visual motifs to advance its own story and artistic vision. The easiest way to establish an immediate context, given these limitations, was to refer back to shared cultural touchstones and styles. Sometimes this meant specific works of art and entertainment, but just as often it meant the texture and character of the media themselves, as these came to evoke eras and decades. The very quality of image and sound, even more than the content, engaged a pleasant, knowing recognition, familiar and ironic, and, most profoundly, wordless.

MTV launched in 1981, shortly after the music video emerged as a recognizable form. The developments in eidetic technologies (film, video, photography, audio recording) in the twenty years before and after meant not only that small variances in the sensory particulars of an image or sound could synecdochically invoke micro-eras as never before, but more powerfully still that such technological nuance infiltrated the very character of subjective memory. Hipster fascination with analog and démodé media—Polaroids, vinyl, arcade games, early Nintendo, boxy computers, 8mm film, and home videos—emerged not from nostalgia directly but from the music video’s wordless confirmation that the artifacts of aesthetic and technological memory translated naturally into a contemporary idiom of hip and, in this marriage, formed a uniquely potent amalgam. The laconicism characteristic of the hipster—which briefly flowered in that most hipster of genres, mumblecore—had everything to do with the unstated irony in these wordless evocations of the past: the past not as words, writing, or speech, but as the successive obsolescence of material technologies and the mediating textures of these technologies. Either you got the joke, in its sublinguistic irony and faintly emo poignancy, or you didn’t.

This also meant that hipster was always a style about style, a fashion about fashion: Its ethos gloried in the tacit recognition, the logic of silent appraisal. This superficiality complicated the terms of its critique. What distinguished early hipsters from other subcultures, perhaps more than political apathy, was that on some level they always wanted to impress, to look good. Many were attractive. A segment seemed to possess, if not to prize, the ethereal otherworldliness of ’90s fashion models like Kate Moss and their aesthetic of “heroin chic.” If one imagines hipster transposed onto the truly misfit and geekoid, it is hard to envision the style rising to the prominence it did. Punk, by contrast, always seemed to make unattractiveness a central plank in its aesthetic.

Hipsters’ claim to superiority did not arise, then, simply from insider knowledge (“a priorism”), but also from being good-looking. Or perhaps it was their being good-looking, their never being true weirdoes, that made the normal processes of subcultural snobbery so unforgivable in their case. Seeming cool has always involved vapidity, but something about hipsters’ lead-eyed, half-ironic worship of hierarchies of taste felt especially repellent once the failure of history to end had scuttled late-’90s apolitical nihilism. The same credo—what we might call ahistorical materialism—showed up in Gap clothing, the TV show Friends, and Puff Daddy’s wealth rap (to cite three contemporaneous examples), but these never attracted the same opprobrium as hipsters because, on the one hand, they made such easy targets and, on the other, they never implicated their detractors. Secretly—or maybe not so secretly—no critic of the hipster mystique remained entirely unbeguiled by it.

Embarrassment dictated the scathing appraisal of the hipster. This was, in one sense, the embarrassment all principled people feel before their own unsupportable aesthetic judgments. As superficial as fashion trends may be, when it comes to trying to look good (and judging others accordingly), not even critics can resist their sovereignty. That hipster “cool” had established itself as a spontaneous inner response disturbed those who believed themselves above such vacuities; it called for public exorcism. The subsequent handwringing over what implicated the community of handwringers mirrored the anguish over gentrification expressed by a community of gentrifiers. With their three colonizing tools—coffee shops, bike stores, and dive bars—hipsters had taken over depressed and minority areas of the city, but instead of appropriating minority culture like Mailer’s hipster, they reappropriated four previous decades of white culture. Gentrification proceeded by disowning the principle while continuing the practice. So hipsters endured by denigrating the attitude while cultivating the fashion.

When, with hipster or gentrifier, does the pejorative first make sense? A gentrifier does not become a gentrifier until migration to a neighborhood reaches a critical mass. Since it is the latecomers who effectively turn the early migrants into gentrifiers, one can understand why the first wave resents subsequent arrivals, no matter what they mean for property values. Priority becomes the basis of a new pecking order. So it went with hipster: kids who had caught wind of the new style after a summer in the city or from older siblings brought it back with them to college and established a new “a priorism” along the same lines, the logic of “I got there first.”

Those who had felt the sting of being uninitiated now turned the tables in college, where—fashion apostles—they brought the good stylistic news. Waves of subsequent adoption continued in this manner for several years, with ever finer distinctions, an increasing narcissism of petty differences, and more anxious and uncertain airs of superiority, until at last everyone was partly hipster and everyone partly hated hipsters (now seen as the earlier adopters who looked down their noses at you). By the time any respectable hipster felt obliged to decry hipsterism (just as any respectable gentrifier felt obliged to decry gentrification), taking stylistic steps away from hipster fashion made as strong a stylistic statement as taking steps toward it had made a decade before. In the crosscurrent of this tidal reverse, “normcore” was born, which, for its anti-hip ethos of hip, quickly became just another outgrowth of hipsterism’s hydra-headed late stage, as unconquered today as it is undefinable.

Hipster Goes Mainstream

Thus, by the mid-aughties, the core hipster conundrum was in place. Not only could we never say precisely what a hipster was; more confusing still, everyone involved in the discussion possessed some percentage of hipster DNA, and everyone, according to his purity or impurity, felt entitled to judge everyone else as either overly or insufficiently hipster. In this way hipster started to look like a metadiscourse on the very etiquette of hip: How much was too much? When were distinctions valid and when vulgar? Like any fashion, hipster operated on a subconscious level too, and—when it came to dating, say—even those who had no problem excoriating the hipster as superficially judgmental could hardly help evaluating the coolness or attractiveness of potential partners through a lens fashioned in the hipster imagination. This insincerity is only the insincerity of life, but it makes the self-righteousness of hipster’s critics look a bit silly in retrospect.

Of course, to be fair, the scope and meaning of a cultural moment are impossible to see in the midst of it, and even the sense of retrospective clarity may be nothing more than conversancy with retrospective myths. Because hipster ethnography emerged after the subculture became, in effect, the culture, and long after it seemed possible to track down hipster’s authentic source, the literature always ran into the problem of treating the mainstreaming of a subculture as the subculture per se. Greif identifies the “near-death” of the hipster in 2003 and the figure’s unexpected transformation and resurgence in 2004 as a new type he calls the “Hipster Primitive”: a “green” hipster to succeed the “white.” This eco-hipster traded the appurtenances of industrial America for the iconography of the family vacation to a national park: a crunchy and vaguely feminized version of the earlier figure, who discarded the trucker hat and ’beater for skinny jeans and a flannel shirt.

But was this the renaissance or the death of hipster? If the term indicates a spontaneous style or subculture, exclusive and not yet wholly commodified, then the crucial window of 2003–04 marks less a pivot than a dissipation—the moment when urban migrants to a neighborhood reach a critical mass, the inexorable forces of gentrification set in, and craft-beer bars and farm-to-table restaurants suddenly litter the main drag.

Hipster went mainstream in 2004, just as the radical shift in style took place. Two data points help fix the moment precisely. The clothing company Urban Outfitters, founded under the name Free People in Philadelphia in 1970 and incorporated in 1976, went public in 1993. For its first ten years as a public company, it traded at one or two dollars a share; in December 2000 it traded for less than a dollar, and in February 2003 its stock price stood at $2.20. Over the next two and a half years, however, the company expanded massively, its share price reaching $37 by September 2005 and remaining in this range ever since. (None of this tracks with broader trends in the market or major indexes.) Simultaneously, in fall 2004, per Michael Gillis, “the hit count suddenly exploded for Livestock World, an online purveyor of Arabian horses and Highland cattle.” The connection? Livestock World owned the URL “pitchfork.com,” which Pitchfork Media, the music website founded by Ryan Schreiber, would not take over until 2007.

The meteoric rise of Urban Outfitters and Pitchfork in 2004 suggests a swarm of new entrants into the hipster market—people who, unlike the style’s originators, did not have days to spend thumbing through the racks at thrift stores and attending small-venue concerts to cultivate an obscurantist aesthetic. They wanted the ready-made version, and it was hard to blame them. Being a protohipster snob in 2001, before the advent of any support industry, seemed like a full-time job. On the other hand, the emergence of this industry marked the end of hipster as a subculture. It shed the baggage of any ethos and proceeded to become a fashion, pure and simple.

The fashion now evolved too. In “The Fitted Shirt” (2001), Spoon, that most archetypal of hipster bands, sang, “I long for the days / They used to say / Ma’am and Yes Sir / For now I’m going to find / Buttons for my / Dad’s old used shirt.” This foraging ethos of retro appropriation from which hipster emerged in the late ’90s—which exalted no store of fashion more than the family closet, and which held together, if anything could, the disparate strands of hipster, whether suburban or hip-hop, ’50s or ’70s, hippie or neo-swing—caved in the early aughties to hipster’s commodification. There were attempts to manufacture anew what had gained currency as an item of salvage and reclamation (e.g., the distressed look of the “graphic tee”), but the style also swerved. A lacing of queer and Eurotrash fashion appeared. The term metrosexual came into widespread use during 2003–04, right as distinctions needed to be made between skinny-jeans hipsters and the general mass of “shabby chic.” The environmental or “crunchy” hipster, who rose to become the style’s predominant iteration in this period—and who lives on in the locavore, fair trade, craft-beer, and artisanal-food movements—looked, on further consideration, like a more affluent hippie with twenty-first-century grooming standards. Occupy Wall Street was another face of this development. Such resurgences of natural and communitarian values have recurred periodically since at least the Romantic era, as one response to industrial and urban alienation. But this idea—that capitalistic self-seeking despoils the ecological foundations of life, social and organic—flourished in the late aughties alongside, not as a result of, hipsterism.

What did change the hipster’s tune, and effectively end the era of supercilious apathy, was the 2003 invasion of Iraq. The original hipster had been apolitical. After 2003, with mounting concern about Bush administration overreach, political indifference fell out of fashion. 9/11 brought the country together; Iraq tore it apart. The age of ’90s insouciance choked on its own blithe disaffection, and suddenly, against this new backdrop, the original hipster looked frivolous and even irresponsible. Where early hipsters had made virtues of taciturnity and unpleasantness, the new hipsters were almost painfully nice. They abandoned a jaded and sneering “cool” in favor of sensitivity, kindness, and political vigor, even while preserving the mechanics of distinction and exclusivity as useful methods of self-sorting. It wasn’t that you couldn’t judge people, only that you had to judge them according to different, and putatively better, standards.

We live in the aftermath of this swerve. Late hipster is so many things that it has practically no meaning. As Rob Horning reported in 2009, there were already rumblings about “hipster as the embodiment of postmodernism as a spent force, revealing what happens when pastiche and irony exhaust themselves as aesthetics.” But here again we run aground on the same problematic expansiveness of terminology, because postmodern, like hipster, seems to mean everything and nothing—the great grab-bag of contemporary style. In literature, visual art, and film, what could possibly supplant a postmodern ethos that permits and incorporates everything that precedes or postdates it? Most remember a time when trips to the record store involved browsing well-defined and rigorously segregated music genres. It mattered—to one’s very identity—whether or not you liked niche styles like ska, emo, indie rock, alternative, and underground rap. No one, to the best of my memory, would have been caught dead liking pop.

But today everything is pop, from country music to rap, and the same people who like to talk about how “good” TV has become make it a point of pride to love pop shamelessly and unrepentantly. Like pop, robbed of any meaning but the warm, fuzzy current of consensus, hipster lost any sense beyond gesturing at the broad territory of “all that is currently hip.” This tendency toward monoculture is the principal fact and force in contemporary culture. Where being hip once meant special access to exclusive knowledge, today’s central clearinghouse—the Internet—has permitted everyone sitting at home to “know the score.” It has meant that everything looks and feels more or less the same because it arrives via the same medium: the portable screen. This didn’t just entail a bland, flattening amalgamation of style, genre, and fashion; it killed the subculture, which always relied on the limitations of physical space—of geography, venue, and turf—to enforce its exclusivity, purify its aesthetic, and guard its cachet.

In previous decades, writers like Thomas Frank worried about the counterculture’s co-optation by market forces. But this threat of co-optation turned out, paradoxically, to be what gave subculture its vitality and established a meaningful target for its subversive energy. The subculture’s obstinate and antisocial perversity kept the mainstream honest. We may end up missing our grumpy weirdoes more than we realize. What feels at first comforting and even welcome about consensus and accessibility grows over time into the loneliness and placelessness Broyard’s original hipster sought to dispel. Belonging everywhere starts to feel, at some point, like belonging nowhere.

The End of Distinction

In a last salvo on hipsterism—a late-2010 essay for the New York Times Book Review titled “The Hipster in the Mirror”—Mark Greif reexamines the phenomenon through the lens of Pierre Bourdieu’s work on the educational and class underpinnings of taste. In Distinction: A Social Critique of the Judgement of Taste (1979), Bourdieu made the case that those aesthetic distinctions by which we demonstrate our refinement and sort ourselves socially are, far from natural or immanent, outgrowths of education and social background.

In a broad survey conducted during the 1960s, Bourdieu and his assistants gathered information on the aesthetic judgments and cultural consumption patterns of French subjects, cross-referenced against metrics for class and education, asking, for instance, what would most likely make a beautiful photograph: a landscape, a car crash, a little girl playing with a cat, a butcher’s stall, a sunset over the sea, and so on. Surprising no one, I assume, Bourdieu found that as education levels rose, a greater proportion of respondents rejected “the ordinary objects of popular admiration” and endorsed the idea that an attractive photo could “be made from objects socially designated as meaningless.” Nothing, Bourdieu concluded, so distinguished the different classes as the demand for these discrete attitudes and responses; or, as Greif has it, “The things you prefer—tastes that you like to think of as personal, unique, justified only by sensibility—correspond tightly to defining measures of social class.” The hipster, for Greif, was a local manifestation of this broader dynamic.

But are Bourdieu’s conclusions sound? His survey did not ask participants to propose a beautiful subject for a photograph, but only to choose from a list—one whose options seemed to exert a natural pressure toward the distinction he hoped to find. Those who have learned to push beyond the cliché may feel compelled to prefer the car crash to the sunset, for example, while liking neither. What the question appears most truly to draw out—the relative weight a respondent gives to compunction versus insipidity—may reflect the tendency Bourdieu noted in different groups to associate the quality of a photograph more or less closely with the character of what it depicts: that is, to distinguish (or not to distinguish) between the representation and the thing represented.

Around the time Bourdieu was conducting research for his book, Andy Warhol exhibited his Death and Disaster series (1962–63), whose most enduring work—Silver Car Crash (Double Disaster), a serigraph taken from a photo of a gruesome automobile accident—remains the most expensive Warhol sold at auction to date. Had the more educated or worldly subjects in Bourdieu’s study read about or encountered Warhol’s show, would their seeing the possibility of beauty in the photo of a car crash only reflect class consensus and taste as groupthink? In other words, is something “learned” necessarily performative or insincere?

Such broad correlational trends may indicate less than they appear to at first. They say nothing about each individual’s reason for adopting a certain taste, or idiosyncratic fingerprint of tastes, nor about the widespread variation that exists within any given group. Clearly, proximity to those who hold certain beliefs influences one’s own, but this does not always make the adopted beliefs bad or the adoption thoughtless. What style takes hold—since many don’t—is not entirely arbitrary, nor, as Bourdieu seems to have argued, only a type of signaling or social consensus with no inherent meaning. In short, even if the mythos Bourdieu deconstructed had been true—that taste derives from an inherent superiority of judgment—it is not clear that his experiments would have produced different results.

From this we might conclude that even if the commodification of a style is mercenary and cynical, it does not follow that the commodified style is meaningless or worthless itself. Likewise, even if those who appropriate a style get few points for creativity, they may still get some, since all style, like all belief, is on some level appropriative. Greif writes that “the power of Bourdieu’s statistics was to show how rigid and arbitrary the local conformities were,” such that “college teachers and artists, unusual in believing that a beautiful photo could be made from a car crash, began to look conditioned to that taste, rather than sophisticated or deep.” Whether or not Bourdieu’s statistics are as powerful as Greif claims, the trouble with this formulation is how we know what is arbitrary—how we tell when taste has been “conditioned” as opposed to learned, developed, cultivated, or refined. If “conformity” implies “conditioning,” then authentic taste requires nonconformity, which is to say diversity; but if authentic taste is found in a diversity of opinion, by what common standard could we ever measure taste’s “sophistication” or “depth”?

Perhaps neither Bourdieu nor Greif takes style seriously enough on its own terms. Every fashion that gains purchase must satisfy or resonate with an aesthetic need, some organic transpiration in human sensibility, and every vector of this new fashion—from the hippest to the lamest—is some part late-adopter and some part innovator. There is a phenomenology of style, of what fundamentally strikes us as hip or cool, I would argue, that we have never fully penetrated or described. To lay it strictly at the feet of marketers and advertisers, to reduce it all to class mechanics and social self-sorting, is to avoid the central, most interesting question: What is the true meaning of aesthetics in our lives and our hearts, whence the germ force of style prior to commodification?

For if we value at all the great sweep of aesthetic diversity around us, we owe the trailblazers and vanguards of style something, whether they look down on us or not. Whether or not we have hipsters to thank, the dominant urban style—especially in the built environment—has grown more sophisticated and refined over the last twenty years. (Only critics guilty of the same retrospective nostalgia for which they fault hipsters could deny this.) As this aesthetic grows stale and familiar itself, we will need new insurgent styles to upend the merely comfortable and derivative. This is only as it should be. Hipsters caught hell for supposedly never developing a style of their own and instead parasitically drawing on older styles and more authentic subcultures. But is it ever so different? The spontaneous emergence of a name that the culture needed and understood—even a repurposed name tinged with judgment—offers sufficient evidence in itself that a new style had emerged. Besides which, you had to be an incorrigible curmudgeon to claim that postmodern “sampling” in literature and music never created valuable or exciting art.

But if there has been a downside to the rise and diffusion of hipsterism in culture, I would suggest that it lies not in the new hipster’s birth or first flowering—not in the hipster that people loved most to hate—but in the long second life of hipster as a style without an attitude: “hipster nice.” The first new hipsters shared something with the Warhol Factory milieu: a commitment to aesthetics as a vacuous category, as hopelessly implicated in celebrity and consumerism. The ironic honesty of this stance turned it into something halfway serious; by insisting on the vacuity of fashion, it transcended vacuity. It brought dialectical grit to the culture, as perversity always does. What drove people nuts about this original hipster was never vapidity in itself, but the sacrifice of everything—even the pleasures of human warmth—to the imperative of cool. This isn’t my idea of a good life, but it involves a commitment that I respect, an intuition that social life is never wholly separate from performance, whose kernel of truth we forget at our peril in an age overconfident of its authenticity.

The great monocultural wash of pop consensus has not only devitalized subculture but failed to grapple with the harm of feel-good acceptance when it tips into an ethic that refuses necessary distinction and meaningful judgment. A culture that insists on consensus stalls like a locked-up engine. Hipsters got grief for being a subculture without political conviction or heft. But even rigidly superficial movements matter because they destabilize our convictions and remind us of the many possibilities of being. They open up space, dimensionality, in existence. The grand consensuses of modern life online—the politics of approbatory or condemnatory agreement—keep culture from renewing and reinventing itself. When hipster lost its edge and went mainstream, we entered a period of aesthetic and moral stagnation. This wasn’t hipster’s fault, and—dear god—hipster was never going to save us. It is simply what happens when we defang the subversive element in culture, even the stupidly subversive. For all the “niceness” in the air, you would be forgiven for feeling that we have grown meaner, less forgiving, and quicker to judge. In repressing our full psychological response, we have redirected its more virulent and antisocial aspects into channels that we have convinced ourselves—against all evidence—are healthy and good. The best thing about hipsters may have been the very thing we always condemned them for: They didn’t like us.

But were we really so likeable?