
You probably remember a grade school teacher who seemed to have eyes in the back of her head. Somebody who could walk into an unruly classroom and, with just a look, quell the disorder and get everybody back into their seats. When such a teacher enters a classroom, any mischief underway is abandoned instantly. Those caught in the teacher’s direct gaze freeze or try to scramble back to their seats. Those who think they are in peripheral vision try to duck and hide. Those who believe they haven’t been seen try to flee.

This sort of teacher possesses an authoritarian eye: a way of seeing shared by certain sorts of effective teachers, drill sergeants, sports coaches and the sorts of large organizations that James Scott explored in Seeing Like a State.

The classroom example illustrates something important. Authority and responses to it are primarily about seeing and being seen, rather than doing or having things done to you.

When you know you’re being watched by an authoritarian eye, you voluntarily behave in simpler (or equivalently, more orderly) ways than when you know you aren’t.

The difference between the two regimes of behavior is social dark matter. And in today’s digital social environments, it is starting to behave in ways we don’t really understand. Because we feel watched in ways we don’t really understand, by forms of authority we have never experienced before.

The Video Version

Before we proceed, here’s a little movie about the ideas in this post. It’s about three minutes long.

The themes I am exploring here are necessarily a bit abstract, and I’ve been using a little simulation model to explore them. This movie was created using that model. It’s a poignant and somewhat campy bit of almost-abstract expressionism, if I do say so myself.

Now let’s get back to business as usual. Words and such. Computer models can only take us so far.

Seeing and Being Seen, Digitally

I first started thinking in terms of dark matter a few months ago after reading a post by Toph Tucker that argued that a significant amount of social activity on Facebook was starting to retreat to the privacy of secret (unlisted) groups. Tucker argued that activity in these groups accounted for the discrepancy between the continued robust traffic growth of Facebook on the one hand, and industry watchers’ opinions that Facebook had somehow peaked. As he put it:

There is a discrepancy in the apparent pull exhibited by Facebook. Among industry-watchers, 30-somethings, preteens, and social media cynics of all stripes, there’s a strong narrative that Facebook is already passé.

Secret groups on Facebook, Tucker argues, account for the discrepancy and should be thought of as Facebook’s “Dark Matter.” Such groups are rising in popularity (I am part of several) and for many Facebook users, activity within secret groups dwarfs activity that is more broadly visible.

I think this argument is essentially correct, though the specifics might change as digital communities evolve. I think Tucker’s notion of Facebook dark matter is worth generalizing to all social realities, whether physical, digital, or hybrid.

So that is what I am going to try and do here.

The Participatory Panopticon Chases its Own Tail

The dark matter idea complicates a related idea: Jamais Cascio’s notion of the participatory panopticon (basically a situation of “everybody watching everybody else, all the time”, where the authoritarian eye is at least partly the eye of the collective itself, operating alongside various algorithmic Big Data eyes).

The metaphor of dark matter suggests an interesting situation where the collective is trying to retreat from its own omniscient gaze as much as it is trying to retreat from the Big Advertising Eye.

Managing the trade-off between seeing and being seen is something we’re naturally wired to do. Every time you pick a table at a restaurant, you are relying on ancient instincts that have evolved around this critical everyday decision-making behavior. Some of my favorite social science research, by Rachel and Stephen Kaplan, concerns these instincts. Turns out humans choose the same sort of complex but legible environment when presented with sets of photographs of both natural and urban settings.

But it is not clear how these instincts play out when our sensory cues involve a scrolling stream of mixed media updates on a small screen, and our minds model the situation as some ambiguous mix of “community”, “graph” and “algorithmic Big Brother.”

The last element is, of course, the newest one, and the only reason we aren’t in more of a panic about it is the banality of its primary association with advertising (but then, authority in action is usually banal to watch).

We have some useful instincts around “community” that port to the digital realm. Thinking about matters of kinship and genealogy has prepared us at least a little for navigating globe-spanning social graphs. But an algorithmic Big Brother is a new element in the environment.

Unfamiliar Controls

We have far more control over seeing and being seen digitally than the rather alarmist notion of a participatory panopticon suggests, but the problem is that these modes of control are deeply unfamiliar to us: avatars, anonymity, degrees of separation on a “graph”, complex privacy settings, augmented reality glasses.

These are not modes that work well with highly evolved subconscious instincts that map seeing/being-seen decisions to choosing physical locations in highly visual environments. Trying to apply existing instincts is like suddenly trying to use existing language skills to operate in Braille.

The situation is similar to the early twentieth century when two powerful new technologies, the automobile and the airplane, became available, but most people did not immediately acquire driving or flying skills. For several decades, most people remained primarily passengers, at best exercising control by ringing a bell to request a stop while on a bus. Driving eventually became a commonplace skill in the developed world, but flying never did (though it might now, if everybody gets a drone).

For the Internet, browsing is like riding a bus and occasionally getting off one bus and getting onto another. Content creation is like driving. Few do it now, but it should become as ubiquitous in a few decades as driving today. I suspect that programming, like flying, will remain a minority skill for quite a while.

Browsing, content creation and programming represent different levels of autonomous control over seeing and being seen digitally. But acquiring those skills is not the same thing as knowing what to do with them. To get there, we need to understand more about the enactment of authority.

The Enactment of Authority

That we behave differently depending on who we think is watching is a fairly trite observation. When a child is watching, we make funny faces. When somebody we find attractive is watching, we preen and posture. Voyeurs make sure they can see without being seen. Pick-up artists make sure they can both see and be seen.

What makes the gaze of authority special is that the watched voluntarily simplify and order their own behavior to prepare to act on the desires of the watcher.

This is the reason authority induces power. Just by looking, it turns what it sees into a ready and waiting instrument capable of enacting its intentions within a space of desires. Authoritarian seeing is like a magnetic field acting on a domain of free agency.

It is useful to understand authority as an enacted process rather than an attribute of a system. We can represent the authority process in general terms as this imaginary sequence of commands:

1. “Stop whatever you are doing” (abandon autonomous intentions that might compete with mine)
2. “Fall in” (enter a waiting state, prepare to do my bidding)
3. “Form a single file” (conform to a known, ordered state)
4. “Do X” (execute my intention)

The important thing about this sequence is that when authority is real, the first three steps often happen without any explicit instructions or cues beyond awareness of being in the gaze of the authoritarian eye.

In fact, the first two steps happen naturally and require no training or special conditioning. The eye only needs to be perceived as an authoritarian eye. The third step — getting into a known, ordered state — is context-dependent and might require more or less training.
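One way to make the process concrete is to treat it as a little state machine in which the gaze alone drives the first two transitions, training gates the third, and an explicit command gates the fourth. A toy sketch (the state names and gating logic are my own illustrative labels, not anything from the simulation model):

```python
# The four-step authority process as a toy state machine.
STEPS = ["autonomous", "stopped", "waiting", "ordered", "executing"]

# Which transitions happen spontaneously under a perceived authoritarian
# gaze, which require prior training, and which need an explicit "Do X".
SPONTANEOUS = {("autonomous", "stopped"), ("stopped", "waiting")}
TRAINED     = {("waiting", "ordered")}       # context-dependent discipline
COMMANDED   = {("ordered", "executing")}     # needs an explicit command

def advance(state: str, gaze: bool, trained: bool, command: bool) -> str:
    """One step of the process: the gaze alone gets an agent from
    autonomous activity to a waiting state; order needs training;
    execution needs an explicit command."""
    i = STEPS.index(state)
    nxt = STEPS[min(i + 1, len(STEPS) - 1)]
    edge = (state, nxt)
    if edge in SPONTANEOUS and gaze:
        return nxt
    if edge in TRAINED and gaze and trained:
        return nxt
    if edge in COMMANDED and command:
        return nxt
    return state
```

Under the gaze alone, an untrained agent advances from autonomous activity to a waiting state and then gets stuck; only training unlocks the ordered state, and only a command unlocks execution.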

This is where design enters the picture.

Authority and Design

The idea that an authoritarian gaze might naturally precipitate a simple ordering in what it sees suggests something interesting: design may be an optional extra in the enactment of authority.

James Scott’s association of authoritarian ordering of reality with a specific aesthetic — high modernism inspired by scientism — is not always necessary. The ordering effects of authority existed before Le Corbusier came along with his platonic visual ordering schemes that attempted to turn human communities into imperial apiaries. They existed before reformist authoritarian forces co-opted science and mathematics into the rhetoric of authoritarianism.

The authoritarian eye can cause the precipitation of order without recourse to environmental or instructional design. The threat of violence is sufficient. As in the programming of computers, simple and ordered states are better starting points for action with predictable consequences. But the first two steps of the authority process may suffice in many situations, to create enough order for authority to be exercised in highly leveraged and predictable ways.

You do not need to imagine, specify and impose something like a platonic design such as a military formation. Authority works in simpler ways. Alpha chimpanzees do not need to train the troop to wear uniforms and march in military order to exercise authority. A disorder-quelling look will do (and remember, “disorder” includes any autonomous activity that the authoritarian eye does not care to understand).

For a long time, this idea — that authority is more about seeing than doing — eluded me. Accounts of the authority process such as Scott’s emphasize the consequences of positive action on the part of the authoritarian force. Simpler versions of the idea, such as the parable of Chesterton’s Fence, also focus on what the authority does or does not do.

I think we’ve gotten it backwards.

The Primacy of Authoritarian Seeing

I am convinced authoritarian design is merely the cherry on top. At least in dealing with social realities, what is more important is what the authority sees or does not see.

Why? Because authority exercised through direct coercive action is inefficient to the point of being useless beyond a certain scale. But authority expressed and exercised through the authoritarian eye is nearly infinitely scalable. The source of this leverage is of course the fact that humans (and agents in general), unlike non-sentient matter, can recognize and respond to being seen.

So action can be restricted to the rare instance of “making an example” of an unfortunate victim. Such action actually reduces the effectiveness of the authoritarian eye beyond a point.

The condition of a social system that has submitted to authority is a sort of self-reinforcing, self-perpetuating collective learned helplessness. It has to be. Otherwise the authoritarian eye would not be worth acquiring, with or without imposing cathedrals and plazas for support.

We pay so much attention to positive authoritarian actions because their consequences are so disproportionately visible (high modernist city plazas, grand cathedrals, UI design decisions by Facebook employees, and so forth), and because we tend to attribute all the effects of authority to those actions.

The bulk of the dynamics of authority, however, lie in the voluntary actions precipitated by the authoritarian gaze. In particular, the self-ordering and self-simplification on the part of those caught in it.

If you stop whatever you are doing and pay attention to authority, much of the authority has already been exercised. You are already useful and usable in your waiting state.

For the authoritarian school teacher, who has yelled “Sit DOWN and SHUT UP!” a few times, like the bus driver on South Park, this ordered state is relatively simple to precipitate with just a look. For a military unit, this ordered state represents a far greater degree of discipline and potential for predictable action.

But in both cases, potential for predictable action — a waiting state — has been created by a look.

But what happens where the authoritarian eye is not looking?

The Emperor, Incognito

The authority process is the source of both the power and the weakness of authority.

The authoritarian eye creates the known, ordered state that turns the subject of authority into an instrument for the expression of authoritarian intentions. This is power. The leveraged potential created by the threat of force.

But in inducing this ordered state, the authoritarian eye also blinds itself. How does that happen?

Let’s go back to the example of the teacher walking into a classroom that is up to some mischief. Consider the four basic responses.

1. If you are front-and-center, freeze
2. If you’ve obviously been seen, scramble back to your desk
3. If you think you might not have been seen, hide behind a desk
4. If you’re sure you haven’t been seen, run

All four behaviors are responses to varying probabilities of having been seen in a state that might attract punishment. But instead of the wild behavioral response — fight-or-flight — we get the modified pair of behaviors characteristic of social species that operate under in-group authoritarian threat: submit-or-retreat (exit-or-voice is a more refined descendant).
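In the toy-model spirit of the movie, the mapping from perceived visibility to response is just a threshold function that collapses into the submit-or-retreat pair. A minimal sketch (the cutoffs are arbitrary choices of mine, not measured values):

```python
# Map an agent's perceived probability of having been seen to one of the
# four classroom responses. Thresholds are invented for illustration.

def respond(p_seen: float) -> str:
    """Return a response given the perceived probability of having been seen."""
    if p_seen >= 0.9:
        return "freeze"     # front-and-center
    if p_seen >= 0.6:
        return "scramble"   # obviously seen
    if p_seen >= 0.3:
        return "hide"       # possibly unseen
    return "run"            # surely unseen

SUBMIT = {"freeze", "scramble"}

def regime(p_seen: float) -> str:
    """Collapse the four responses into the submit-or-retreat pair."""
    return "submit" if respond(p_seen) in SUBMIT else "retreat"
```

The interesting structure is in the collapse: four behaviors, but only two regimes, split by a single visibility threshold.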

But in all four cases, the very act of authoritarian seeing removes useful information from what is being seen. There is a reason we have the trope of emperors sneaking out of the palace and walking among the people as commoners. If the authoritarian eye actually wants to know what is going on, it cannot be seen to be watching.

But when the emperor cannot wander his realm incognito, any agent that seeks to exercise authority over a social system faces a fundamental trade-off between power on the one hand and knowledge on the other.

The Power-Knowledge Trade-off

When a social system enters an ordered state under an authoritarian gaze, it loses information, like a hard-disk being wiped clean. If the authoritarian gaze persists long enough — a human generation, say — the information loss can be permanent.

This is the problem Scott identified as the loss of tacit knowledge, metis, under an authoritarian gaze.
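The hard-disk metaphor can be made concrete in information-theoretic terms: treat the classroom as a probability distribution over possible activities, and compare the entropy of free play with the entropy of the ordered state. A back-of-the-envelope sketch (the distributions are invented for illustration):

```python
from math import log2

def entropy(p):
    """Shannon entropy (bits) of a discrete behavior distribution."""
    return -sum(q * log2(q) for q in p if q > 0)

# Hypothetical numbers: four possible classroom activities
free_play = [0.25, 0.25, 0.25, 0.25]   # anything could be going on: 2 bits
ordered   = [0.97, 0.01, 0.01, 0.01]   # nearly everyone sits quietly: ~0.24 bits
```

The ordered state carries roughly a tenth of the information of free play; the gaze has wiped the rest.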

In our running example of the authoritarian teacher in a mischievous classroom, consider the state of free play that is disrupted and transformed into a state of ordered readiness for command.

The consequence that the teacher wants — stamping out of any brewing dissent and rebellion that might make the class ungovernable in the future — is achieved. But what is also stamped out in the process is anything the children might be learning or figuring out in that state of free play.

What makes the authoritarian gaze a net lossy gaze in an objective sense is that some of that information might actually have been valuable to the authority itself.

Submission, Civilization and Retreat

Humans (and other social species) have a category of responses specific to in-group threats of violence from authority: submission.

As a first approximation, we can say that submissive behaviors are also nascent civilized behaviors.

The first two of the four responses I listed represent submissive social behaviors under an authoritarian gaze. They can range from fearful to proactively compliant or even eager.

The last two represent retreat behaviors. Depending on how quickly and unpredictably the eye moves and how much it sees, retreat behaviors may represent more or less of the response relative to submission behaviors.

At one extreme, the eye may see so much, and move so fast and unpredictably, that it freezes everything that is going on and sees an accurate snapshot of social dark matter in motion.

At the other extreme, it may see so little, and move so slowly and predictably, that all agency (not just information) retreats away from the path of its gaze, leaving it staring at emptiness wherever it looks. This is authority weakened to uselessness (and the basis of the common trope in action movies involving heroes among ninjas or in dark forests).
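The two extreme cases are easy to reproduce in a toy model (a minimal sketch of my own, not the actual model behind the movie): agents on a ring freeze when lit and retreat when the eye is close, so what the eye catches live depends on how far it leaps each step. All parameters are made up.

```python
def run_eye(speed, steps=200, n=100, width=10, alarm=12):
    """An eye of `width` sites sweeps a ring of `n` sites, advancing
    `speed` sites per step. Whatever it lights up freezes into an ordered
    state, and agents up to `alarm` sites ahead retreat; both are dark
    (carry no live information) on the eye's next pass. Returns the
    fraction of illuminated sites where the eye caught live activity."""
    c, dark, seen_live = 0, set(), 0
    for _ in range(steps):
        lit = {(c + i) % n for i in range(width)}
        seen_live += len(lit - dark)
        # submission + retreat: the lit zone and the alarm zone just
        # ahead of it hold no live activity on the next step
        dark = {(c + i) % n for i in range(width + alarm)}
        c = (c + speed) % n
    return seen_live / (steps * width)
```

A crawling eye (speed 1) sees live activity only on its very first step, a fraction of a percent of what it illuminates, while an eye that leaps farther each step than agents can anticipate (speed greater than width + alarm) catches live activity everywhere it lands.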

So to summarize, the authoritarian gaze changes reality, simply by looking at it, in ways that both strengthen and weaken its position and potential. The part of reality that submits to the gaze represents power. The part that retreats from view represents weakness and potential challenges to authority.

Now here is the interesting thing: by forcing such a division between a space of retreat and a space of submission even when the natural behaviors are harmless or potentially useful to it, the authoritarian eye creates its own enemies and plants the seeds of its own destruction.

In the classroom example, if the teacher is known as a tyrant who expects perfect, orderly stillness when she enters the room, retreat is an act that criminalizes the potentially non-criminal.

Say ten children are animatedly discussing an exciting homework assignment when the teacher enters. One of them is too far from his desk to scramble back, so he tries to duck out and run.

The very act of running makes him a criminal.

The ordering effect of the authoritarian eye carries with it an implicit negative judgement of retreat: if you have nothing to hide, why did you run?

Questions to Ponder

I’ve been battling the flu and wrangling tax and book-keeping issues for the last week (scurrying about under the authoritarian death-and-taxes eye so to speak) so I didn’t have time to cover everything I wanted in this post. So this is probably going to continue in some form. I don’t like to commit to a post series, given my record of going off the rails whenever I try, but we’ll revisit this stuff. Probably.

For now, I’ll leave you with a dozen important and obvious questions that I did not address:

1. What is the difference between social dark matter in retreat, versus social dark matter that is merely unobserved?
2. What changes when we substitute a participatory panopticon for an external authoritarian eye? In the movie, I’ve modeled the simpler case of an external panopticon. Clearly the “spotlight” of the gaze of the authoritarian eye would get coupled with the state of the “population” in the participatory case. How does that work?
3. How does a moving technological frontier change things? Does social dark matter continually retreat into an open expanse, chased by an advancing authoritarian eye? Or does it eventually get confined to nooks and crannies too unprofitable for the eye to peer into?
4. How does the emperor wandering incognito differ from a “half the population is spying on the other half” police state (a state that is at least superficially very similar to the modern emerging structure of digital reality)?
5. Can the authority-knowledge trade-off be broken? Or at least managed in a different way? (I believe it can; you just have to give up predictability, but you can retain the authority itself).
6. Is the algorithmic Big Data eye a locus of authoritarian agency in its own right, or does it merely redistribute agency among the participants in the panopticon based on their skills and resources (which would make rich data scientists pretty powerful)?
7. How does all this interact with social interaction design — the warrens-and-plazas ideas from Xianhang Zhang that I explored a while back in relation to legibility?
8. How does active rebellion work? What if the dark matter, instead of merely submitting or retreating, acts to subvert the authoritarian eye? Is retreat necessary for rebellion?
9. Can the authoritarian eye be hacked by what it sees in plain sight? Search engine optimization is an obvious example. Does this generalize?
10. Conversely, can we design an Eye that causes useful information to flow into its gaze voluntarily, rather than retreat? (I believe so, but it comes at a price).
11. Do digital realities render pre-digital modes of dissent and retreat meaningless? Are things like Bitcoin really subversive in a traditional sense? In the movie, I put in some traditional anti-authoritarian rhetoric as a joke. It is not clear to me that digital politics will be the same as non-digital politics.
12. I’ve been tossing around a placeholder notion of centralized artificial digital agency (Big Data, Big Algorithm, etc.). People with dramatic imaginations usually go straight to Skynet and the Singularity, but I think what is actually emerging is a very different beast. What is the nature of this beast? What, if anything, does it want? Does it merely precipitate a useless, ordered waiting state wherever it looks, that does nothing for anyone? (that’s my null hypothesis)

We’ll see where we get with these questions if and when I get around to them. Don’t hold your breath though.