Brain scan image from Libertas Academica

At Stanford University in 2012, a young literature scholar named Natalie Phillips oversaw a big project: a new way of studying the nineteenth-century novelist Jane Austen. No surprise there—Austen, a superstar of English literature and the inspiration for an endless array of Hollywood and BBC productions, has been the subject of thousands of scholarly papers.

But the Stanford study was different. Phillips used a functional magnetic resonance imaging (fMRI) machine to track blood flow in readers’ brains as they read Mansfield Park. The subjects—mostly graduate students—were asked to skim an excerpt and then read it closely. The results were part of a study on reading and distraction.

The “neuro novel” story was quickly picked up by the mainstream media, from NPR to The New York Times. But the Austen project wasn’t merely a clever one-off—the brainchild, so to speak, of one imaginatively interdisciplinary scholar. And it wasn’t just the result of ambitious academics crossing brain science with “the marriage plot” in unholy matrimony simply to grab headlines. The Stanford study reflects a real trend in the humanities. At Yale University, Lisa Zunshine, now a literature scholar at the University of Kentucky, was part of a research team that studied modernist authors using fMRI, also in order to better understand reading. Rather than working in a cramped office or library carrel, the researchers used the Haskins Laboratory in New Haven, with funding from the Teagle Foundation, to carry out their project, in which twelve participants were given texts of higher and lower complexity and had their brains monitored.

Duke and Vanderbilt universities now have neuroscience centers with specialties in humanities hybrids, from “neurolaw” onward: Duke has a Neurohumanities Research Group and even a neurohumanities abroad program. The money is serious as well. Semir Zeki, a neuroaesthetics specialist—that is, neuroscience applied to the study of visual art—was the recipient of a £1 million grant in the United Kingdom. And there are conferences aplenty: in 2012, you could have attended the aptly named Neuro-Humanities Entanglement Conference at Georgia Tech.

Neurohumanities has been positioned as a savior of today’s liberal arts. The Times is able to ask “Can ‘Neuro Lit Crit’ Save the Humanities?” because of the assumption that literary study has descended into cultural irrelevance. Neurohumanities, then, is an attempt to provide the supposedly loosey-goosey art and lit crowds with the metal spines of hard science.

The forces driving this phenomenon are many. Sure, it’s the result of scientific advancement. It’s also part of an interdisciplinary push into what is broadly termed the digital humanities, and it can be seen as offering an end run around intensifying funding challenges in the humanities. As Columbia University historian Alan Brinkley wrote in 2009, the historic gulf between funding for science and engineering on the one hand and the humanities on the other is “neither new nor surprising. What is troubling is that the humanities, in fact, are falling farther and farther behind other areas of scholarship.”

Neurohumanities offers a way to tap the popular enthusiasm for science and, in part, gin up more funding for the humanities. It may also be a bid to give more authority to disciplines that are more qualitative and thus are construed, in today’s scientized and digitalized world, as less desirable or powerful. Deena Skolnick Weisberg, a Temple University postdoctoral fellow in psychology, wrote a 2008 paper titled “The Seductive Allure of Neuroscience Explanations,” in which she argued that the language of neuroscience affected nonexperts’ judgment, impressing them so much that they became convinced that illogical explanations actually made sense. Similarly, combining neuroscience with, say, the study of art nowadays can seem to offer an instant sheen of credibility.

But neurohumanities is also the result of something else. Neuroscience appears to be filling a vacuum where a single dominant mode of thought and criticism once existed. That position was held in the American academy by critical theory, neo-Marxism and psychoanalysis. Alva Noë, a University of California, Berkeley, philosopher who might be called a “neuro doubter,” sees neurohumanities as a reaction to the previous postmodern moment. “The pre-eminence of neuroscience” has legitimated an “anti-theory stance” within the humanities, says Noë, the author of Out of Our Heads.

Noë argues that neurohumanities is the ultimate response to—and rejection of—critical theory, a mixture of literary theory, linguistics and anthropology that dominated the American humanities through the 1990s. Critical theory’s current decline was somewhat inevitable, as all intellectual movements erode over time. This was exemplified by the so-called Sokal affair in 1996, in which a physics professor named Alan Sokal submitted a hoax theoretical paper on science to Social Text, only to unmask himself and lambaste the theorists who accepted and published his piece as not understanding the science. Another clear public repudiation was the harsh Times obituary in 2004 of the philosopher Jacques Derrida, who was dubbed an “abstruse theorist”—in the obit’s headline, no less. But as critical theory’s power—along with that of Marxism and Freudianism—fades within the humanities, neurohumanities and literary Darwinism are stepping up, ready to explain how we live, love art and read a novel (or rather, how the cortex absorbs text). And while much was gained as “the brain” replaced “individual psychology” or social class readings, much has also been lost.

Critical theory offered us the fantasy that we have no control, making a fetish of haze and ambiguity and exhibiting what Noë terms “an allergy to anything essentialist.” In neurohumanities, by contrast, we do have mastery and concrete, empirical ends, which has proved more appealing, even as (or perhaps because) it is highly reductive. At least since George H.W. Bush declared the 1990s the decade of the brain, the media have been flooded with simplistic empirical answers to many of life’s questions. Neuroscience is now the favored method for explaining almost every element of human behavior. President Obama recently proposed an initiative called Brain Research Through Advancing Innovative Neurotechnologies, or BRAIN, to be modeled on the Human Genome Project. The aim is to create the first full model of brain circuitry and function. Scientists are hoping that BRAIN will be as successful (and as well funded) as the Human Genome Project turned out to be.

* * *

There are things that neuroscience is useful for, in terms of understanding behavior, but there are also things it is not all that useful for, like understanding the nuances of our reactions to poetry.

Literary studies before the advent of the neurohumanities tended to rest on murkier categories than science likes—categories such as subjectivity and interpretation. In a novel like Mansfield Park, for instance, the heroine Fanny Price is cornered into servility by both her social class and her feminine role. The judgmental Fanny smiles upon conservative social formations and condemns most others. This has led to vastly different interpretations. Lionel Trilling famously wrote that the novel’s “praise is not for social freedom, but for social stasis,” but it has also been read as feminist: a “bitter parody of conservative fiction,” in the words of Princeton University Austen scholar Claudia Johnson.

That is not to say that all neurohumanities scholars are insensitive to nuance and ambiguity. Some, like Lisa Zunshine, combine neuroscience with original interpretations of consciousness and multiple points of view in modernist novels. But other neuroaestheticians offer blunt accounts of areas of study that have long been appreciated for their complexity, such as the meaning of art or aesthetics as a means of transmitting politics and interpretation. In other words, some underlying principles of neuroscience are useful when applied to the humanities, but the field needs to understand its limits.

Neuroaesthetics, an au courant mix of art history and cognitive science, asks whether our brains are structured so that paintings and precious objects move us in one way or another: one neuroimaging study, conducted at University College, London, set out to explain how we experience beauty in visual art. Ten people were shown 300 paintings while their heads were in an fMRI machine. They were asked to label the paintings as neutral, beautiful or ugly. The paintings they thought were beautiful led to increased activity in their frontal cortex, while the ugly paintings led to a similar increase in their motor cortex.

Professor Semir Zeki at UCL was responsible for this study, which he conducted through the Institute of Neuroesthetics in London and UC Berkeley. The center sets out a bold claim on its website: “the artist is, in a sense, a neuroscientist exploring the potentials and capacities of the brain, though with different tools.” Zeki’s latest paper? “The Neural Sources of Salvador Dali’s Ambiguity.”

* * *

But are multiple—and politically minded—meanings possible in the land of the neuro novel or neuro-aesthetics? Is neurohumanities, like “neuromarketing,” simply trying to help us understand and then produce cultural artifacts that will have the best effect for readers, writers and artists—political and historical context be damned?

The response to this question depends on whom you ask. Some are suspicious of what has been called “neuro-reductionism.” Jonathan Kramnick, a Rutgers University English professor who wrote a provocative essay, “Against Literary Darwinism,” for the journal Critical Inquiry, notes the rise of books with titles like The Art Instinct. “There’s an attention to the fine grain of a text that neuroscience can’t get at,” he says.

“Humanists are unwilling or unable to evaluate the science, so we just take scientists’ word for it, without following up on the evidence or knowing these claims are highly contested within their community,” says Todd Cronan, a professor of art history at Emory University. “‘Mirror neurons’ are highly debatable, yet art historians now just apply them to artworks. I think it’s worrying. And when there’s a ‘call for collaboration’ between art scholars and neuroscientists, we just marshal the scientists’ evidence.”

Jennifer Ashton, an English professor at the University of Illinois, wrote a takedown of neuroaesthetics in the academic journal Nonsite in 2011. She put it like this: “How your brain is firing won’t tell you if something is ironic, metaphorical or meaningful or if it is not.”

What does it mean, Cronan wonders, “if Matisse uses a lot of red and a neuro person says, ‘Red produced neuronal firing’?”

Literary Darwinism, another route by which the language and analytical frame of science have entered the humanities, can have an even more formulaic aspect. In one study, Jonathan Gottschall, an evolutionary lit scholar, compared 1,440 folktales from nearly fifty cultures in order to counter feminist critics and the assumption “that European tales reflect and perpetuate the arbitrary gender norms of western patriarchal societies.” His finding: there are biosocial norms that all cultures perpetuate—i.e., the feminists are wrong.

Critics also point out that neurohumanities scholars prefer formally conservative artists. Such artists are more likely to help them make general points about beauty or the act of reading: Austen and Michelangelo, for instance, were both animated by classical values like order and symmetry. And while neuroimaging may help us understand what our mind does when we read quickly or with more careful attention, these data sets tell us next to nothing about the actual literature, nor do they give us a political understanding of a text.

It’s not hard to imagine a future when neurohumanities and neuroaesthetics have become so adulated that they rise up and out of the academy. Soon enough, they may seep into writers’ colonies and artists’ studios, where “culture producers” confronting a sagging economy and a distracted audience will embrace “Neuro Art” as their new selling point. Will writers start creating characters and plots designed to trigger the “right” neuronal responses in their readers and finally sell 20,000 copies rather than 3,000? Will artists, and advertisers who use artists, employ the lessons of neuroaestheticism to sharpen their neuromarketing techniques? After all, Robert T. Knight, a professor of psychology and neuroscience at Berkeley, is already the science adviser for NeuroFocus, a neuromarketing company that follows the engagement and attention of potential shoppers. When neuroaesthetics is fully put to use in these ways, it may do as Alva Noë said: “reduce people and culture to ends, simply to be manipulated or made marketable.”

And he has a point. Today, there’s the sudden dominance of so many ways to quantify things that used to be amorphous and that we imagined were merely expressive or personal: Big Data, Facebook, ubiquitous surveillance, the growing use of pharmaceuticals to control our moods and minds. In other words, neurohumanities is not just a change in how we see paintings or read nineteenth-century novels. It’s a small part of the change in what we think it means to be human.