Over an icy weekend this past January, thousands gathered in three colossal hotels for a ritual of corporate hiving—a not unfamiliar sight for downtown Chicago. Suited, caffeinated, name-tagged, and sensibly coiffed, the crowds circulated through carpeted hallways, beige lounges, event ballrooms, and conference spaces, engaged in the customary labors of professional networking. To the casual observer this all might have looked like just another gathering of American tradespeople.

But in place of working groups on middle management and boosting trade, the digital placards outside conference rooms announced a different set of concerns: “Muslim Utopia,” “Global Pirandello,” “The Semipublic Intellectual?” The occasion was the 129th convening of the Modern Language Association, or MLA, said to be the largest professional gathering in the world devoted to the study of language and literature, and I was among the sea of fellow professors, adjuncts, lecturers, graduate students, publishers, and administrators assembled to present our most recent research and take the temperature of our increasingly embattled professions.

Judging by the session titles—there were some eight hundred this year—our casual observer would be able to ascertain that for all its business-as-usual undertones—the embossed cards, the Napa Valley wine flowing after dark—the MLA was a bit, well, off. Session 659? “Text-nology Idea Jam: Doing New and Old Things with Old and New Books.” Session 741? “Spectral Scotlands.” A predilection for cute and surprising—or cute because they’re surprising—two-word titles (Session 195: Jewish Monsters; Session 705: Stupid Dickens) contrasts with papers headed by daunting strings of key terms (“Borgesian Crossroads: Bible Dissemination, Orality, and the Latin American Postcolonial Condition/Tradition”). Hyphenated neologisms are popular (“Homo-reproductions,” “Not-yet-activism”), as is the use of “queer” as a verb. Some panel titles pun and promise humor (Session 805: In the Meme Time), some papers sound technical and dry (“Computational Tracking of Cross-Cultural and Cross-Lingual Markers in Translation Corpora”), others trendy—the term “vulnerable” and its cognates show up regularly, e.g. Session 784: “Tumblr Vulnerabilities.” And sometimes I have no idea what’s going on. What could “Remediating LidantJU fAram” be about?

Making this year’s tour through the MLA got me thinking, oddly enough, about The Jam, a mod revival group that thrived two decades after the first mods appeared, and my favorite band in early adolescence. In some ways it was a typical obsessive adoration, though through the years—today I’m forty-four—I’ve kept on listening, I think because they provided me my first experience of kinship with a subculture and of resisting the mainstream. I was a mildly privileged white American teen from New Haven, Connecticut, with little standing in the way of a comfortable youth. I wasn’t a working-class Brit seeking an alternative path to upward mobility, as the first mods generally were. No “foreman Bob” was pushing me around at work, as the narrator of “Billy Hunt,” The Jam’s great song of workplace disgruntlement, complains. In fact, I was more likely on track to become the sort of proper white-collar conference-attending sod at whom The Jam hurled invective (“And if I get the chance, I’ll fuck up your life!” frontman Paul Weller screams at the corporate cog “Mr. Clean”). I responded to mods so forcefully at fourteen because they looked cool but weren’t all that threatening; they allowed me to identify with insurrectionary power and restlessness without requiring me to relinquish the comforts of accessible and formally satisfying tunesmithing. Weller fumed and caterwauled at the microphone and his trio spat out rebellious mini-manifestoes, but the songs themselves were tightly crafted and salable pop gems. In other words, they attacked the world of adult attainment but left it delicately rather than grossly transformed.

Conferences remind me a little of those rock bands that want to thrash, yet also yearn, as they grow older, for a little respect and attention from the mainstream (if you’ve ever seen Aerosmith play “Dream On” for MTV’s 10th anniversary, backed by a full orchestra, you know what I’m talking about). The Jam, and the first-generation mods after whom the band was patterned, serves today as an intriguing analogue to the academic world of literary studies on display at the MLA: both perform balancing acts, between subversion and rebellion on the one hand and professional respectability on the other. Mods and literary academics are caught between the allure of wildness, ingenuity, and nonconformity and the desire for some sort of stability, recognition, and achievement.

Mostly working-class kids, they railed against the system because, deep down, they wanted their share of its bounty.

Who were the mods? (The name is short for moderns, as in modernists.) They’re perhaps best remembered in terms of style: dark and sharply tailored suits, black one-inch ties, short-cropped hair, sporty Baracuta G9 jackets, Fred Perry polos and Ben Sherman shirts, and—most memorably—Vespa scooters. And yet mod wasn’t really about style itself; rather, it was a stylized response to the postwar decline of British social coherence. At the end of its years of empire, and after years of economic slumping and conservative rule, Britain was transitioning away from traditional class and generational stratification into a modern consumer economy that provided avenues for social transformation. Mod style communicated this and provided a means to parody and subvert respectability yet also aspire to it. As Dick Hebdige wrote in his influential 1979 study of subcultures, mods turned the paraphernalia of middle-class urban life toward subversive ends, though they left it generally recognizable: the business suit was revised roguishly, but almost imperceptibly, into hipster wear; the sensible commuter’s scooter, subtly tricked out, became an emblem of both youthful independence and subcultural affiliation; and the amphetamine pill (mod drug of choice) was turned from coping agent to means of recreational elation.

Unlike other subcultures—skinheads, Teddy Boys, and Rude Boys in Britain; hippies, Beats, and bikers in the States—mods were something radically different: in their bespoke suits and careful haberdashery, they looked sharp because, to some extent, they were desirous of great things. As we conference-goers aim to be, they were professional. Indeed, mods had jobs (remember “Billy Hunt”?). And unlike so many later subcultures that announced themselves in absolutely oppositional terms—punks, say, whose aggressively shredded look served as a general fuck you—mods didn’t hide the fact that they shopped, and cared about what they bought. While they attacked middle-class office drudgery and lily-white respectability, they also rejected a rigid British class system that denied them access to a life of consumer luxuries and services—a life brimming with the very stuff work would enable them to purchase. Mods were, in short, a half-rebellious youth subculture that kept one eye trained on the rewards of adulthood. Mostly working-class kids, they railed against the system because, deep down, they wanted their share of its bounty.

These days, literary studies functions much like a subculture—that is, a social alliance that uses style to resist assimilation into a bland, repressive, or restrictive mainstream, a mainstream that’s found today in the steady corporatization of American higher education. Like a mod, today’s professionalized literary humanist seeks square-shouldered resistance to the forces of capital, consumption, and Mr. Clean-style business bottom-linism, which is increasingly infecting our colleges and universities, and leading to what is often, and appropriately, referred to as a “crisis in the humanities.” While we humanists typically labor against those forces—our work isn’t lucrative, it’s often obscure, if political it’s generally on the far left—we also have to be consummate professionals: business-ready; crisp and caffeinated; productive; answerable to roving deans, review committees, and the university bureaucracy itself, which requires legibility and measurable success.

How can we be successful professionals, based in accountable institutions, while also embodying the original and fundamentally non-institutional work we do?

For the mods, catalogues of dissipated living and critiques of middle-class decorum were often held in check by a considerate and more mature superego. In “To Be Someone,” The Jam’s masterpiece from the album All Mod Cons, Weller recalls a life of rock-‘n’-roll adventuring, full of cocaine and “swimming in a guitar-shaped pool,” and regrets blowing his assets on high times. It’s the head-smacking song of a thirty-something: I should’ve saved! And yet the chorus, spat out and savored by Weller, asks whether perhaps it was worth it in the end:

But didn’t we have a nice time?

Didn’t we have a nice time?

Oh, wasn’t it such a fine time?

Wait—was it worth it? He did have a “nice time,” yes…but the middle-class gloom against which The Jam pitches itself seems here to provide the very values by which the band judges its rebellion’s failures. “I realize I should have stuck to my guns,” Weller cries out later in the song, “instead shit out to be one of the bastard sons, and lose myself.” It’s a song both of celebration and regret.

To be a literary scholar is to feel a similar conflict: on the one hand, there is the adventure of literature, its beauty, its ambiguous and sometimes baffling conclusions, the deep truth that drew us all into reading so intensely in the first place. On the other hand are the more orderly responsibilities our professional identities demand. How can we be successful professionals, based in accountable institutions, while also embodying the original and fundamentally non-institutional work we do? Our job, after all, is inherently rebellious. No, we don’t find ourselves tweaked on coke and swimming in guitar-shaped pools (at least I don’t)—yet we do counter mainstream narratives, unsettle settled doctrine, work slowly with strange, unruly, sometimes bizarre texts that take work to decipher, and discover and describe that which hasn’t yet been codified. How do we reconcile the spirit of originality and resistance with the constriction and legibility that professional life demands? And when our profession—in a mounting “crisis,” as we increasingly hear—gathers every January at the MLA, should we be looking more energetically for answers to this question?

* * *

Professors in the humanities are, of course, aware of this crisis, and it’s for this reason that the cautious, coded stratagem I describe above has emerged. At the MLA, the preferred speaking style is crisply pitched and polysyllabic, hyper-articulation at a snappy pace. Artful, preconceived phrases are tossed out as though already part of our general discourse, and the absence of pauses or definitions indicates to listeners that they should know, and know quickly, what this new language is all about. One presenter this year uttered within the space of a few breaths the phrases “affective possibilities of urban space” and “Harlem as a geographical terrain is a palimtextual space.” I got his point, but it’s that rapid yoking of disparate terms—“geographical terrain” and “palimtextual space”—that serves, perhaps, as the archetypical meme of conference-speak. Typically, this is accomplished with an “as” (e.g., Session 518: Age and/as Disability), which captures the scope of conference-talk: big ideas, colliding fast.

This isn’t the way we talk in “normal life,” but here? It’s how we do.

In a groundbreaking 1955 study, the criminologist Albert K. Cohen emphasizes two key qualities of any subculture: the significant role social interaction plays in creating the subcultural group, and the group members’ sense of themselves as outsiders who are being asked to “adjust” to the culture at large but don’t wish to. People struggle to conform, Cohen says, to a consensus of norms, beliefs, and values, and “he who dissents, in matters the group considers important, inevitably alienates himself to some extent from the group.” And yet people wish to dissent, for whatever reason—they have their own ideas, thoughts, and beliefs, God love ’em—and they don’t want to adjust, as Cohen puts it, to reigning groupthink.

The “crucial condition” for the emergence of a subculture is—and Cohen emphasizes this bit with italics—“effective interaction with one another, of a number of actors with similar problems of adjustment.” That is, subcultures are born out of alienation and feed on social reinforcement; they don’t just pop up in someone’s bedroom, or on the back of a stylishly dressed teen. They require face-to-face scenes of shared maladjustment, agreement, and in-group approval, which conferences provide like nothing else in academia. So more important than that actual paper on Harlem’s geographical palimtext is what happened after it was read: the speaker received enthusiastic questions that implied the worthiness of his ideas; colleagues approached him with outstretched hands; email addresses were exchanged, smiles were flashed, and drink dates were made. It was, in short, an “effective interaction.” These are people who can’t quite adjust to a variety of things, like standard ways of discussing literature, and also who have a sense that what is at issue in literature is crucial for, and undervalued in, our commercial culture.

Conferences can still be alienating affairs. I feel both a part of our ill-adjusted cohort, conversant in and comfortable with conference-speak, and yet also very conscious of its performed, almost theatrical quality. The heightened diction is exhilarating but exhausting too. In a way it is subcultural code, a mode of speech that marks us off as a group. It requires experience (and indeed training) if one wants to follow along. It’s not merely what’s said that’s significant to subcultural identity but also that what’s said can be articulated or catalogued with such bewildering speed.

The most common gripes about literature conferences, and talk in literary studies more generally, concern style and format: densely written papers read aloud at lecterns that can be difficult—and dull—to follow; too many big ideas, yoked together by one “as” after another, colliding too fast for listeners to comprehend. (See, for instance, Nicholas Kristof’s recent Times column “Professors, We Need You!” calling for less jargon and more accessible, useful ideas from the professoriate.) A secondary gripe is that the Internet has obviated this ritual of conference-going. And most of us do recognize that these conferences don’t actually accomplish much beyond reminding us that we exist.

But such talk is also moving. It acts bravely against the threat of irrelevance that hangs over the humanities these days. I feel it. Everyone at the MLA feels it: the palpable crisis in the humanities. For one thing, undergraduate enrollment in English departments is down. Between 1970 and 2004, the percentage of English majors dropped by almost half. (By comparison, the percentage of business majors increased by almost sixty percent over the same period.) Or consider this: in 2010, the University at Albany cut its French, Italian, Russian, and classics programs, sparking a debate about dwindling institutional support for the humanities; and in 2012, the governor of Florida, Rick Scott, convened a task force to study university funding and concluded that humanities majors should pay higher tuition at state schools because these are not “strategic disciplines.” Even Rosemary Feal, the current executive director of the MLA, has pointed to “a general devaluing of the humanities” by legislators intent on slashing budgets. No one example tells the full story, but together they sketch the contours of a real problem.

Why is this happening? It’s unclear. From the conservative right we’ve been hearing for some time now—from Allan Bloom in the 1980s and John Ellis in the 1990s to Harold Bloom all the time—that the problem is with the modern humanities curriculum, which is so diverse and identity-driven that students don’t feel drawn to literature. An excessive focus on race, class, and gender, goes this argument, has turned the study of literature into a study of identity politics. It is also common now to cite the increasing attraction of STEM fields (science, technology, engineering, and math) for undergraduates aiming themselves at Google internships. In a technology-driven world, what is a humanities degree really worth? While the salability of the BA in English is routinely defended by education professionals—e.g., it teaches critical-thinking skills, it offers cultural capital attractive to all kinds of future employers, it’s great prep for law school—questions mount yearly regarding the dollar-for-dollar value of the liberal arts.

Impassioned defenses of literature, reading, and culture do crop up all the time. The New Yorker’s Adam Gopnik divided them into two camps, “one insisting that English majors make better people, the other that English majors (or at least humanities majors) make for better societies.” Both camps maintain that literature ensures a culture of civilized and productive citizens, ones who put forth “the best that has been thought and said,” as Matthew Arnold famously put it. And yet there is also the uncomfortable fact of literature’s decline as cultural capital, a point made in magisterial detail by John Guillory, who argues that universities now educate a techno-bureaucratic class uninterested in the prestige books once promised. All the canon wars—fought over what books should and shouldn’t be taught to undergraduates—overlook, said Guillory, the larger question of literature’s diminishing gravitas. Who cares about tweaking the syllabus when no one is taking the class?

The relevance of the literary humanist is itself the subject of a virtual subgenre of books, many of which argue that universities have ceded authority to commercial forces. Andrew Delbanco, an eminent Americanist and cultural critic at Columbia University, recently claimed that the notion that universities have an obligation to teach “the fulfilled life,” one that is derived at least in part from the humanities, has been steadily losing ground to the forces of capital and wealth and advancement. The liberal-humanist ideal of college as “a community of learning,” Delbanco says, in which students get a chance to engage in the intellectual activity of soul-shaping, has given way to college as a credentialing machine. And if providing professional credentials is the primary purpose of a college education then an English degree seems a dubious proposition.

College has turned its attention from the business of contemplation to just business itself.

Perhaps the spriest version of this argument is made in Mark Edmundson’s Why Teach? In Defense of a Real Education. Edmundson is galled not just with the marketing of higher education (glossy catalogues, glossy websites) but with the market ethos of his recent students, who expect to consume a lecture on Freud much in the same way they would a Hollywood film—that is, without much strenuous intellectual effort on their part. They expect a format that takes into account their inattention. “Most of my students seem desperate to blend in, to look right, not to make a spectacle of themselves,” Edmundson complains. “The specter of the uncool creates a subtle tyranny. It’s apparently an easy standard to subscribe to, this Letterman-like, Tarantino-inflected cool, but once committed to it, you discover that matters are rather different. You’re inhibited from showing emotion, stifled from trying to achieve anything original.”

It’s this atmosphere of commercialism and conformity that drives people to create subcultures. Literature and discussions about literature require all of the virtues being razed by the increasingly corporatized college campus and student body: enthusiasm, earnestness (at least to a degree), impassioned and lengthy engagement, and strange talk. On the day I wrote this paragraph, the New York Times carried a story titled “In a Buyer’s Market, Colleges Become Fluent in the Language of Business.” “Higher education,” says the Times, “is today less a rite of passage in which institutions serve in loco parentis, and more a commercial transaction between school and student.” College has turned its attention from the business of contemplation to just business itself.

Today’s literary professionals are caught between institutional professionalism and the subversive impulse that brought them to literature in the first place. Whereas the mods were defiantly stylish and mildly rebellious while longing for a greater degree of respectability, English professors are respectable and professional while longing for the passion and engagement and excitement of literature, the Thing Itself of reading. This Thing—call it insight, inspiration, or truth—whatever it was at first, it wasn’t about wanting to publish papers. It wasn’t about wanting to be right. It wasn’t about wanting to talk well or talk fast. It wasn’t about wanting an open schedule, praise for one’s own thoughts, or (though I’m not sure about this one) sex with admirers. It was about literature telling us something we weren’t getting anywhere else. We stayed up late thinking about it and it changed the way we saw the world the next morning.

Why is conference-talk so strange and unsettling? Because literature conferences are where we literary professionals go to dramatize the state of our profession, one caught between institutional acumen and anarchy. It’s not that all scholars are agitators or would-be agitators, producing willfully obscure texts. But the more one listens in on conference-speak, the more it becomes clear that the virtues of professionalism—clarity, accountability, a “takeaway” message—are in standing conflict with the desire to sound like something other than an employee with a product. To sound like something other than a base explicator. That fight, between the vibrancy of reading and the drabness of professionalism, is the dramatic stuff of conference rooms. It’s what we’re doing there. We’re fighting with ourselves.

* * *

Typically, subcultures disappear. They vanish at the very moment they fully materialize—once Urban Outfitters begins selling your look, it’s over. In fact, subcultural identity balances on the knife’s edge of mass culture, requiring a normality against which to define its own oddness. Without vanilla mallscapes, tax paperwork, and tract housing, subcultural resistance wouldn’t stand a chance. And it’s always shocking to witness the speed and proficiency with which commodity culture is able to appropriate, repackage, and thereby sanitize, any force in opposition to the monoculture. This happened, of course, to the mods. Paul Weller is now known, in Great Britain at least, by the humiliating moniker “the Modfather.” He’s become a fashion icon and online articles about him typically include links to shoe websites (many of which—full disclosure—I’ve clicked on). That’s fine for Paul Weller. It’s just that mod has become entirely dissociated from its origins as a subculture with something to say about the dissatisfactions of British postwar life. This isn’t surprising, but it rankles. Still, it happens to all subcultures.

Almost all subcultures. For if literary studies is a subculture in the ways I’ve described, then why hasn’t it been bleached out and denuded of any critical function? One simple answer is that these days it’s not nearly hip enough to be appropriated. But there was a time when English professors had their fingers on the pulse of American culture. Lionel Trilling’s 1950 book The Liberal Imagination—with dense chapters on Henry James, Mark Twain, and F. Scott Fitzgerald—sold close to 200,000 copies, a blockbuster by any standard.

Literary studies as subculture is, at times, as susceptible to mass appropriation as were the Sex Pistols. Remember deconstruction? When I was in college, it was actually cool to talk about it. You read J. Hillis Miller, you read Paul de Man. You spent whole Saturdays lying on a couch, peering into those plain-covered volumes, seeking the flames of life-altering ideas. And then, just as you figured out that “getting it” was, in a way, precisely what deconstruction was skeptical of (in fact, there was no “it”), you noticed it showing up, incorrectly used and simplified beyond recognition, in the world outside your seminar room. Sportscasters used it as a synonym for the thoughtful offenses that “deconstruct” a defense; friends used it in place of “analyze.” And then it was on NPR and in the New York Times and in vodka ads and it was embarrassing to even say, not only because it was no longer a secret, but also because the idea had become a caricature of what it once was. It’d been cut off from its source of authenticity.

But these moments are infrequent. One reason is that for every act of linguistic rebellion there is a camouflaging professionalism to square things off: sensible tailoring, reasonable-sounding job titles, tenure processes, and academic conferences. All wayward or original thought is masked by the package in which we usher it into the world.

The punks created a style that was supposedly inimical to commodification but, of course, the fashion industry got ahold of it anyway.

Also, unlike most subcultures, professional humanists aren’t selling anything—or rather, the products are pretty hard to commodify. Not every subculture tries to sell out; the mods may not have meant to hawk wares, but they did—clothes, scooters, a lifestyle. The punks created a style that was supposedly inimical to commodification but, of course, the fashion industry got ahold of it anyway (the 2013 exhibit Punk: Chaos to Couture at the Metropolitan Museum of Art made this plain enough). Subcultures are generally quite marketable; scouts are sent out to find and monetize youth cultures around the world, and this cycle, from innovation to appropriation, is a familiar one. But the products of humanistic thought are cumbersome. So we’ve been left to ourselves. And we’ve dug our heels into the stolid institutions of mainstream American culture, securing time to research, write, and talk, securing salaries, space, and sometimes respect. We’re a subculture and also the culture itself. And our uncoolness is, in effect, our preservative.

* * *

The humanities aren’t going away, but they may be going underground. The crisis is driving them there. Due to the tendency to view college education in solely practical terms—in terms of value, investment, and professional preparation—humanists will need to prioritize their professional roles and muffle the aesthetic and the intellectual. We’re in the position, these days, of having to account for ourselves. Here is Daniel Schwarz, a Cornell English professor, defending the value of the English major in an article for the Huffington Post. After being asked at a public talk what his former students have gone on to do, Schwarz responds that many of them are actually in the audience and he asks them to answer:

“I graduated from Harvard Law School and now work for the city of New York”; “I am at MOMA working on foundation relations after doing an MA in museum studies at NYU”; “I work at Christie’s as a junior specialist in European furniture, porcelain, and decorative arts, after completing a Magister Litterarum degree—accredited through the University of Glasgow—from Christie’s Education”; “I am working in hospital administration”; “I work in the financial industry”; “I am preparing to take the law boards in a few months and am working as a paralegal”; “I am an editor in a major publishing house”; “I am a professor of English at a branch of CUNY”; “I am in medical school in New York”; and so on.

“Accredited through the University of Glasgow”? We’re so desperate to claim respectability that we fall back on accreditation as a mark of self-worth. Make no mistake: it’s great that Schwarz’s former students are doing so well, and in so many fields. But with the cultural emphasis falling so heavily on the side of financial and comparable pursuits, the typical humanist defense of a liberal arts education—Literature feeds the soul! The study of art produces ethical citizens!—is something that needs to be kept quiet, or at least encoded so it’s not plain to those who are listening.

Conference-speak, and more generally the way we have learned to talk about books and art in our institutional warrens, is one way to deal with this necessary compromise. “I want nothing this society’s got, I’m going underground,” sings Weller in one of The Jam’s greatest songs. It might be the statement of any subculture—we’re doing things on our own, out of sight, where we can make noise. Subcultures exist to serve those who are having problems with what Cohen calls “adjustment” to culture at large—to provide the marginalized some kind of social solidarity. We use allusions and long words, we speak quickly, and we make long and decidedly uncool arguments about texts that are often challenging and unfamiliar.

We are operating, in many ways, underground. But it should not be forgotten that that song—“Going Underground”—celebrating withdrawal and deliberate alienation, was an instant hit. It quickly rose to number one on the UK Singles Chart, where it remained for three weeks.