I know it’s been a long time since I blogged—really blogged, you know, in the style of that form—for three reasons. First, because I’m talking about blogging in the first sentence, and second because I’m sending you here to read the prerequisites for this post. You’ll want to read the linked piece and as many of the subsequent pieces linked therein as you have time/tolerance for before reading this, because I’m going to do that bloggish thing in which I don’t summarize anything but assume that you’ve been in the flow of the conversation all along. And third, because I’m not going to make an argument here. Instead I’m going to leave a series of semi-related notes on the matter, annoyingly framed in terms of a number of isms.

Ludologism

This Errant Signal video offers a great overview of the dreaded ludology/narratology debate in game studies. But watching it made me want to revisit an observation I once made, namely that the very idea of putting these two terms together assumes a kind of formalism in the first place. Narratology isn’t just the study of story, it’s the study of the formal structure of storytelling itself, emerging out of structuralism. Thus, Gonzalo Frasca’s call for developing a ludology for the study of games akin to the general structural science of storytelling we call narratology can legitimately be seen as a call for formalization. (The fact that the earliest narratologists were the Russian Formalists is no accident here.)

Michael Mateas and others have suggested the alternate term narrativist to represent the “real” opponent to a (formalist) ludology. The problem is, nobody really wants to fly that flag either—it only serves as a term of scorn to distinguish an opposing position. Purported narrativists like Janet Murray and Henry Jenkins were actually interested in the structure of computation more broadly (Murray) or in audience reception (Jenkins).

In any event, there’s an historical matter to consider, too. Back in the late nineties and early aughts, game studies’s primary goal was to establish itself as a stand-alone field. It’s understandable that attempting to create a new field would entail the formation of formal distinctions between the target object of study for that field and its apparent overlaps in other fields. This was always (partly) a formalist project, both in the traditionally structuralist sense of the term and in the looser, more art historical sense of distinguishing the ways different works are made and, in the case of games, operate. And in positing two equally formalist alternatives, ludology or narratology, the field-building exercise of the LvN debate succeeded, even if in part by means of the so-called ludologists tricking their “opponents” into entering a debate whose terms they didn’t actually embrace.

It’s also worth reminding ourselves that the traditions of formalism in general and narrative/literary formalism in particular find much of their origins and ongoing interest in Northern Europe, where game studies also got its start. And it’s also probably worth noting, without suggesting conclusions about it, the fact that many of the key actors in this original debate, including Frasca, Murray, and Jenkins, have left game studies almost entirely. The belief in a field of study for games (call it ludologism, maybe) turned out to be somewhat undesirable at the end of the day. I’ll come back to that matter soon enough.

Formalism

Brendan Keogh cites Susan Sontag, as follows:

…considering ‘form’ is far more interesting than being a ‘formalist’…

We could probably end here and move on. If you’d like to do that, I wouldn’t blame you. This is really the right answer, and it’s the right answer for the same reasons that poststructuralism came along to remedy the immoderacy of structuralism: being an anythingist is usually trouble, at the end of the day, but it’s particularly troublesome when making top-down, universalizing claims about things. But also remember that poststructuralism never abandoned structuralism’s obsession with formalism and structure, it just pointed out the varied and contingent natures of those structures. This is, in some very literal sense even, all that the supposed critics of the purported era of game formalism are after. They should be gratified to know that there is fifty years’ worth of thinking on the matter on their side.

A different way of thinking about Sontag’s point: form really just means material, and formalists are thus really just materialists (not in the Marxist sense, not necessarily anyway). So what are the materials games are made of? Lots of things, actually. Systems, stories, consumer goods, byte code, software-hardware confluences, loci of identity, objects of media discourse. Once I called videogames a mess (in the John Law/actor-network theory sense) for that very reason.

It’s interesting to note that elsewhere in cultural criticism, poststructuralism led to a decades-long rise of the hermeneutics of suspicion, which is now (finally) cracking a bit in the world of literary studies. Here’s the literary historian Rita Felski on the matter:

Problematizing, interrogating, and subverting are the default options, the deeply grooved patterns of contemporary thought. “Critical reading” is the holy grail of literary studies, endlessly invoked in mission statements, graduation speeches, and conversations with deans, a slogan that peremptorily assigns all value to the act of reading and none to the objects read.

Sharon Marcus and Stephen Best have coined the term “surface reading” for a new mode of literary analysis that rejects this immoderacy. It’s a term deliberately opposed to “close reading,” the currency of the realm of symptomatic hermeneutics. Surface reading is meant to address “what is evident, perceptible, apprehensible in texts,” a technique that requires an attention to the material reality of the work. Surface reading is also sometimes grouped with New Formalism, one of several new or newly-popular approaches to literary criticism focused on the phenomenal experience of reading. It’s interesting to note that in literary studies, new formalisms are seen as reclamations of the benefits of material and structuralist analysis lost to poststructuralism, new historicism, and so forth, while in game studies, formalism is seen as just the opposite.

If anything, the conflict at hand in games looks a lot more like the dispute between New Criticism and New Historicism than anything else: a focus on the construction and aesthetics of the work versus a focus on the cultural context of its appearance and use. Which is not exactly to say that there’s nothing new under the sun; but then again, there’s nothing new under the sun, indeed.

Philistinism

The above makes me think of three related things. First, insofar as it exists as a distinct discipline, game studies is also an isolated discipline, one without common methods or histories. It remixes different disciplines and forms—sociology, psychology, computing, art history, anthropology, and more—without deeply engaging with any of those domains. Isolation is the price we pay for a distinctiveness that also appears to be a failure, more or less, for reasons I’ll come back to later.

Here’s a nice tweet-sized quip on this theme from Frank Lantz’s response to the responses to his original Notes on Formalism:

People who complain about Dear Esther et al aren’t formalists they’re philistines. Were the ppl who walked out of Rite of Spring formalists?

Right. But here’s the thing: to talk about methods of creation and critique in games as if the history of art, literature, painting, cinema, and whatever else never happened, but we thought it up all on our own, this is also a kind of philistinism. And basically everyone working in games is guilty of this, because it’s been possible to get away with it. Why? Well, by and large game studies has failed to create the field it laid claim to, not so much for a lack of will or cleverness but because of the changing dynamics of field building in post-2000 media studies, and in particular the tidal wave of destruction wrought after the 2008 global financial collapse. And yet, we created enough of a field to give ourselves a place to converse and debate as if we were full-fledged. There’s much more to say about all this, but I’ll leave it for another time. Suffice it to say: just as games themselves are often seen as a cloistered and lowbrow domain, so game studies too could be seen in the same light.

Fundamentalism

The problem with replacing formalism with form is that it doesn’t seem to satisfy anyone much. If we could just agree that there are lots of materials in operation—not just physical materials, mind you, but also intangible and cultural ones—and that games enact their many forms in various ways, then perhaps everyone would be satisfied. But that’s clearly not the case.

One reason is that everyone tends to have their favorite form. There are signals for this in scholarly discourse. “The real question is…” or “What X fails to take into account…” or “But unconsidered is the role of Y…” and so forth. This isn’t just a problem in games; it’s everywhere. We are reared in a discipline and a method and our brains ossify.

But there’s also rhetorical value in doing so. Take Stephen Beirne’s explanation of his term for the supposed game formalism that is our overall subject, ludo-fundamentalism:

Ludocentricism suggests a method of looking at games that centres on their ludic parts, their ‘game’ parts.… Ludo-fundamentalism, on the other hand, connotes to me an ideological current that inflates the importance of ludic parts at the cost of non-ludic parts. It speaks of values, recited into customs (as designs); it is prescriptive of the medium through the ideas it proposes. It advances rhetoric that diminishes the role of non-ludic parts in the composition of a videogame.

You can already see why I think Sontag-via-Keogh’s form is useful here. If you simply declare that there are no non-ludic parts of the composition of a videogame, but rather that the ludic describes the entire domain of games as an act of medium-specific disciplinary claim-staking, then our eyebrows should rise every time an exclusionary gesture rears its head. Right?

Except that’s not what happens. Instead, someone (I’m not picking on Beirne here, by the way, I’m just playing out the fundamentalist logic) objects to the fact that the form being observed is in fact itself inferior or secondary to some other, more preferable form. It’s pretty easy to do:

How can you talk about computer game design or play as an abstract practice separate from an understanding of the hardware and software systems that facilitate it? How can you talk about hardware and software systems separate from the commercial processes of production that create them? How can you talk about commercial production processes separate from political economy and global capitalism?

See, in its best, most generous incarnation questions like this would invoke that messy coupling of Law’s. But in practice, these moves strive to undermine one mode in terms of another—usually the favorite discourse of the interlocutor. Again, “the real question is…” This is the standard mode of the academic conference Q&A session, in which every question amounts to something like, “But how does this relate to what I’m interested in?” And round and round we go.

It’s not all bad, by the way. Part of the reason for a field of discourse to exist is to reproduce itself via controversy and dispute. Often we have to over-produce, partly to hide the fact that resolution would give us all nothing to do. And given what I’ve already said (and will continue saying, in a moment) about game studies being on the rocks anyway, maybe another ludology/narratology “debate” (or two or three) would serve us well.

Resolutionism

In fact, it might be worse to pretend that we agree on the right, best, most pleasurable, or most aesthetically redeeming aspects of games (or anything) rather than to acknowledge that real differences in motivation, aesthetics, and political concern are at work. There are (at least) two ways to make this resolutionist error.

The first is to assume that all disagreement arises from wickedness rather than preference, tactics, familiarity, or any number of other modes, and that the resolution of said vice is imperative on moral grounds. Here’s a very gentle but still clearly resolutionist call to arms of this stripe from Austin Walker:

When people talk about games, we argued, everyone seemed to prize the interactive bits to the cost of everything else: aesthetics, narrative, music. This cultural formalism, they argued, was detrimental—and even politically problematic. We saw this hegemonic formalism at work whenever a Twine game or “walking simulator” was called a non-game, or whenever GamerGaters (and others) deployed words like “fun” and “escapism” as shields meant to deflect criticism of works lacking diversity (or filled with racism and sexism.)

Nobody wants to be accused of being part of the hegemon, and nobody (well, nobody reading this post anyway) wants to be aligned with GamerGaters. And sure, there are interlocutors who are dismissive in a manner that demands critique or even scorn. But that doesn’t make the very idea of such critiques detrimental or problematic, unless the purpose of the objection is to reframe the conversation around the my-favorite-formalism just mentioned. It also doesn’t mean the two “sides” must or even can find reconciliation! History is full of legitimate, unresolved intellectual and aesthetic disputes.

An example might be found in those who see Twine as a new means to facilitate non-traditional creators’ voices in games, and those who see Twine as an unexpected resurgence of hypertext fiction for the web. For the former group, the association with a prior (and largely white and male) tradition exerts an unwelcome colonizing force that undermines the liberationist possibilities of the platform and its practitioners. For the latter, the refusal to acknowledge said lineage signals a blinkered ahistoricism that, in refusing to answer for the shift from (e-)literature to games, posits a cultural and aesthetic move for which it has no theory or justification—a situation that might even undermine its ultimate mission. It’s possible that this conflict cannot be reconciled, at least not in the present. Would that really be such a calamity? Does it not signal the unexpected richness and intrigue of the topic, rather than suggest that one “side” is righteous and the other wicked?

Another, different incompatibility arises when we read Walker’s concern against the broader context of the media ecosystem writ large. The idea that “everyone seemed to prize the interactive bits to the cost of everything else: aesthetics [graphics, presumably –ib], narrative, music” would strike anyone on the periphery of games as a radical outlier of a position. Media today is overwhelmed by words, images, stories, and sounds. As I argued recently, today’s media ecosystem seems to preclude the ascendency of games, if games entail “interactive bits” or procedural systems instead of text, image, and moving image. Yet, within the world of games, those modes can be successfully recast as underdogs, thanks to, well, ludologism.

The second version of resolutionism is the affirmative one, the one that assumes that the mess of games implies that all specimens are equally valid and desirable under most circumstances. The Errant Signal video offers a helpful example of this take (I was transcribing, this may not be verbatim):

Games are Elegy for a Dead World, with its focus on free-form writing prose and rejection of mechanical interaction. And games are Geometry Wars, with its focus on abstract, mechanical precision and no real story. And games are everything in-between. We have to stop with this obsession of putting games and story in opposition to one another. We have to look at what each game is trying to do on its own terms, instead of making broad, sweeping generalizations about how all games should work.

Remember that this video specifically addresses ludology vs. narratology, thus the specific mention of games and story. No matter, the message is clear: can’t we all get along? Some games are great for their story, and some for their gameplay. And some for their [insert other thing], we might add. Who can argue that treating a work of art on its own terms is a bad idea?

But even if we embrace the can’t we all get along premise in principle, it inevitably breaks down in practice.

For one part, taste and aesthetics come into play. It must be possible to take a deliberate position on the virtue or vice of a specific artistic approach. Indeed, a work’s or a genre’s very existence depends on such an ecosystem of adherents and detractors, fans and critics. For Twine to be the voice of a new, diverse generation of game-inspired creators, perhaps it must also be a rehash of hypertext fiction. The one position doesn’t take away from the other; it helps reinforce and bolster it by proving that there is an alternative.

And for another part, taking individual specimens on their own terms must not also entail the blind acceptance and approval of every work on the mere basis of the fact of its existence. “There is always a prescriptive dimension to aesthetic arguments,” Lantz observes. Since the video used it as an example, I’ll reveal, perhaps unwisely, that Elegy for a Dead World strikes me as a preposterous parody of free-form writing, a coloring-book of a work that disgorges and spits out the craft and process of writing, getting away with it only because, apparently, gamers don’t read or write enough to know the difference. This is an immoderate, snobbish position that assumes an equally immoderate, snobbish take on fan-fiction, of which I’m also guilty. But it doesn’t make Elegy for a Dead World any less deserving of deep and prolonged attention!

(Aside: this need to love one’s object of study, such that criticism entails constant celebration, remains a longstanding bugbear of mine.)

Underdog Studies

Todd Harper meditated on the role power and social capital have to play in discourses like the ones linked above. The real purpose of these conversations, says Harper, isn’t to answer questions about the ontology of games. Instead, “It’s a push-pull of power and capital, primarily social capital. It’s the right to have one’s voice considered ‘legitimate.’”

Harper uses both Lantz and myself as examples of figures profoundly rich in social capital, an undeniable fact—within a certain context. But I’ll offer a corollary observation that responds to another point of Harper’s:

Being in a tenured professorship—or even a non-tenured one—carries a degree of social capital simply by existing. If you’re at a big name university, it’s enhanced. Frank Lantz works at New York University, and just having its name on your business card opens doors for you. I know this to be true personally, as I was very fortunate to spend four years as an employee of MIT. That name opens doors for you because it’s got a history of respect behind it.

All of this is true. What a name doesn’t do, however, is make things much easier on the inside. Harper should know that well, in fact, given that the group he worked for at MIT was shuttered when its funding ended, even though MIT could easily have kept it running if they’d desired.

The truth is, as a critical and pedagogical concern, game studies is hardly a powerful actor. Games are, I’m sorry to report, a joke that have managed nevertheless to eke out a place in the study of arts and culture. Within NYU’s Tisch School of the Arts, where Frank’s program lives, games hardly rise to the register of cultural respect of film or painting or theater or dance, to name some of the domains operated under that school’s shingle. Tisch’s faculty, for the record, are not tenured, but hired as “arts professors,” a longstanding method for creating “flexibility” in a marketplace rich with artists who would happily take a job more stable than “artist” in New York City.

Meanwhile, at the USC School of Cinematic Arts, Tracy Fullerton has labored for years—and sometimes in spite of her program’s obvious success—to gain a place for games in what is probably the world’s most famous film school. As for me, more than a decade on, Georgia Tech still struggles to understand media as compared to technology. And even the mighty social welfare states in the Nordic countries, where game studies first flourished, are encountering troubles. At ITU Copenhagen, for example, a new focus on professionalization and enrollments, bolstered by complex issues of linguistic nationalism, has been partly responsible for casting game studies to the wind at that institution.

Not quite fifteen years after Espen Aarseth declared Computer Game Studies, Year One (the title to which this post alludes), ours is an improbable, fledgling discipline whose future is hardly secure. It’s possible we’ve all made an error in isolating any media form from its kindred, particularly in the post-2008 era of austerity, where perhaps the only way for media studies to flourish is by teaming up, Voltron-style, and finally realizing that the overall project of making and critiquing media in culture needs a strong foundation atop which to develop medium-specific theories and approaches. And likewise, that isolating one medium from another—literature from games, games from toasters, etc.—might implicitly endorse rather than diffuse philistinism.

Or else, or in addition: we need more and greater dispute, such that the terms and principles of various schools of thought are clearly identifiable, associated with specific individuals and institutions, clearly namable for invocation, and receptive to invocation in critical and design contexts. Perhaps the worst hegemony is the hegemony of simplistic, linear progress, the hegemony of thinking that we all really do have something in common, that there is some clear and certain ruleset for intellectual discourse online and off that maximizes progress or justice or what have you—and that we ought to reconcile and resolve that commonality, boil it down to the average of its various components, and sip this decoction together, the delightful broth of games. Somehow, in the long run, I doubt that’s a plan for which anyone will be reproached, let alone praised.