Back when this article first began coming together, a telling story appeared among the sensationalist reports of the British tabloid papers. An 89-year-old retired art teacher and former Royal Navy electrician, named only as Anne, travelled to the Dignitas clinic in Switzerland to end her life, as have others seeking less restrictive assisted-suicide laws than those of their country of origin. Nothing remarkable in itself. More notable were her comments about what had led her there; namely, that she could not keep up with technological-industrial society and found the world as it is today unnavigable and unbearable. “Why do so many people spend their lives sitting in front of a computer or television?” she asked in the feature. “People are becoming more and more remote. We are becoming robots. It is this lack of humanity.”

No-one on these islands could be confused as to what Anne might be speaking of in these statements. Whether you consider it an exciting advance or perhaps even a necessary evil, it is indisputable that in the “developed” world these days there are few places to find refuge from the many faces of the screen; and, more specifically, from the networks that now bind together these devices and more. And not just in the sphere of communications media as we have previously understood it, nor in the workplace or home – from airports, country trails, churches and places of organised leisure, the web of signals and interfaces has spread, rather like a virus, throughout almost all corners of the cultures it emerged from or has colonised subsequently.

These days it's rare to attend a concert where the front row is made up of attentive faces rather than those bathed in the glow behind the camera-phone lens, eagerly consuming the performance through a secondary medium or perhaps even absently recording it to peruse at a later date, with no remaining need to be “in the moment” to be able to exchange opinions with our friends about what was truly the highlight of the night. Indeed, often it feels as if the event itself (whatever it may be) is of secondary importance to the flurry of digital activity that crowds around it; from the social media promotion beforehand to the online reviews appearing simultaneously with the evening's running order taking its course. “The most obvious use of Twitter,” according to Eric Schmidt while CEO of Google, is in situations where “everybody is watching a play and are busy talking about the play while the play is underway.” Meanwhile, to text message your neighbours instead of dropping around unannounced has become entirely reasonable (finding acceptance even among age-groups who would previously have balked at the idea), more appropriate, more... neighbourly. Computer games, previously thought by some to occupy the lower reaches of detachment from the social realm, have now been ousted from that scale by new depths: watching other people play computer games has become a mass spectator sport.

The writer Daniel Goleman gives us a familiar anecdote. “The little girl's head only came up to her mother's waist as she hugged her mum, and held on fiercely as they rode a ferry to a holiday island. The mother, though, didn't respond to her, or even seem to notice: she was absorbed in her iPad all the while.

There was a reprise a few minutes later, as I was getting into a shared taxi van with nine female students who that night were journeying to a weekend getaway. Within a minute of taking their seats in the dark van, dim lights flicked on as every one of the women checked an iPhone or tablet. Desultory conversations sputtered along while they texted or scrolled through Facebook. But mostly there was silence.

The indifference of that mother, and the silence among the students, are symptoms of how technology captures our attention and disrupts our connections. In 2006, the word 'pizzled' entered our lexicon; a combination of puzzled and pissed, it captured the feeling people had when the person they were with whipped out their BlackBerry [mid-conversation] and started talking to someone else. Back then people felt hurt and indignant in such moments.

Today it's the norm.” Sociological literature has labelled an instance of such behaviour an 'away' – a gesture which tells another person “I'm not interested in what's going on here and now” – now epidemic in a saturated media environment of continuous partial attention, from the boardroom to the living room. The new digital era is becoming so normalised in the minds of its participants that people born directly into the tech-boom of the 1980s and '90s onward can barely imagine the world another way – and yet there are many who remember a life less cluttered by gadgets, and some of them still who have not submitted to their embrace. “They say adapt or die. At my age,” stated Anne, “I feel I can’t adapt, because the new age is not an age that I grew up to understand.” That it is so easy to write off the complaints of an aged woman and her generation speaks of the callousness that has become commonplace in industrial society towards its 'spent resources': age-old respect for the wisdom of elders (that is, those deemed to have earned the title) gives way to the scorn of the tech-literate towards the dismay of many of our predecessors at the dizzying pace of techno-acceleration, in a deskilled society guided less by attained and lived human wisdom than by externally-implemented machine updates. The assumption is that it is they, as well as their more familiar technologies, that are 'obsolete' – without a place, without a future.

Yet these observations could elicit the retort that what's at issue is simply mis- or over-use of the options the digital medium is aligned towards. The tool is what we make of it, we tell ourselves. Here we encounter a classic trap in analysing a technology: focusing on the content (i.e. what information, stories, arguments etc. are conveyed, or what task is performed) at the expense of examining the form (i.e. what the physical medium entails) to work out how it influences how we think, feel and act. How much control over the effects of the digital medium do we retain by choosing what we access through it? Or what, in itself, goes with the territory?

Each technology carries within it a reflection of the ideology in whose context it was crafted. What we are experiencing at the moment is a change comparable in scale and depth to that which heralded the industrial revolution; a paradigm shift in the way that we encounter the world, born from the productivist and capitalising mentality and yet perhaps distinct in many ways from the previous era in terms of how we are conditioned to operate by the tools we use. Some have called this the 'interface revolution'. At the centre of this, reaching even to a physiological level, is the internet. Before moving on to what this might mean for those of the anarchist space (or others) in search of a way out of the dominant culture, we would do well to examine these shifts. In much of the world the Net is no longer felt to be a distinct destination we access at a specific moment through a designated technology, but rather an environment we inhabit permanently; always on, always present, always transmitting and receiving – and despite the degree to which we almost accept it as a part of ourselves, to recall facts or retain social ties, one which simultaneously seems to fade into the background of many people's awareness.

The Message & The Medium

“I can feel it too. Over the last few years I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping my neural circuitry, reprogramming the memory. My mind isn't going – so far as I can tell – but it's changing. I'm not thinking the way I used to think. I feel it most strongly when I'm reading. I used to find it easy to immerse myself in a book or a lengthy article. My mind would get caught up in the twists of the narrative or the turns of the argument, and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration starts to drift after a page or two. I get fidgety, lose the thread, begin looking for something else to do. […] Whether I'm online or not, my mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a jet ski. [M]y brain, I realized, wasn't just drifting. It was hungry. It was demanding to be fed the way the Net fed it – and the more it was fed, the hungrier it became. Even when I was away from the computer, I yearned to check e-mail, click links, do some Googling. I wanted to be connected.” – Nicholas Carr

For centuries until relatively recently, the dominant Western culture operated under a prevailing model of linearity, as can be seen in the development of literacy for example: reading meant pursuing a single body of text, with a priority on contemplation, solitude (in at least a mental sense) and attentiveness. The form which the internet takes, with the simple leaf of a book replaced by the scramble of toolbars, links, hypertext, advertising, automatically-streaming video and so on, is cultivating a shift into a non-linear realm. Today, those of us immersed in the online world no longer necessarily read left to right or top to bottom, but skim around the page trying to pick out titbits of 'key' information rather than try to absorb the piece as a whole. It's no secret that by and large the media industries consider that “print is dead”, and the cultural direction is towards any and all publication eventually becoming virtual. Some researchers have claimed that their studies in topics such as subject, composition and narrative flow show creative writing to have steadily become less imaginative and diverse over recent decades, whereas graphic art for instance has shown the opposite trend as culture becomes ever more spectacular and symbol-manipulating.

Do you remember how you feel when you come away from any prolonged time on the internet? How it feels like a struggle to 'readjust' to the elements of our daily life which remain non-digitalised? Is there even much space between these moments for you anymore, fluttering between phone screen, tablet and desktop? We could consider the scientific narrative which has come to the fore among neurologists (those who study the brain) about “neuro-plasticity” as one potential story among others in theorising our situation (obviously with an eye to the limitations, framings and biases inherent in its scientific tradition). Nicholas Carr quotes such a scientist, Michael Merzenich, who “ruminated on the Internet's power to cause not just modest alterations, but fundamental changes in our mental makeup. Noting that “our brain is modified on a substantial scale, physically and functionally, each time we learn a new skill or develop a new ability,” he described the Net as the latest in a series of “modern cultural specializations” that “contemporary humans can spend millions of 'practice' events at [and that] the average human a thousand years ago had absolutely no exposure to.” He concluded that “our brains are massively remodeled by this exposure.” He returned to this theme in a post on his blog in 2008, resorting to capital letters to emphasize his points. “When culture drives changes in the ways that we engage our brains, it creates DIFFERENT brains,” he wrote, noting that our minds “strengthen specific heavily-exercised processes.” While acknowledging that it's now hard to imagine living without the Internet and online tools like the Google search engine, he stressed that “THEIR HEAVY USE HAS NEUROLOGICAL CONSEQUENCES.”

“What we're not doing when we're online also has neurological consequences. Just as neurons that fire together wire together, neurons that don't fire together don't wire together. As the time we spend scanning Web pages crowds out the time we spend reading books, as the time we spend exchanging bite-sized text messages crowds out the time we spend composing sentences and paragraphs, as the time we spend hopping across links crowds out the time we devote to quiet reflection and contemplation, the circuits that support those old intellectual functions and pursuits weaken and begin to break apart. The brain recycles the disused neurons and synapses for other, more pressing work. We gain new skills and perspectives but lose old ones. […] Calm, focused, undistracted, the linear mind is being pushed aside by a new kind of mind that wants and needs to take in and dole out information in short, disjointed, often overlapping bursts – the faster, the better. John Battelle, a onetime magazine editor and journalism professor who now runs an online advertising syndicate, has described the intellectual frisson he experiences when skittering across Web pages: “When I am performing bricolage in real time over the course of hours, I am 'feeling' my brain light up, I [am] 'feeling' like I'm getting smarter.” Most of us have experienced similar sensations while online. The feelings are intoxicating – so much so that they can distract us from the Net's deeper cognitive consequences.”

Again, the temptation might be to blame the sheer volume of data which is available to us (the message) for all this – and indeed there's more to be said on this point – yet, again, we can't help but feel that there is something in the form itself (the medium) which pushes in this direction. Would this not be the roboticness, the remoteness from living social contact 'off-screen', which had so distressed Anne? Though in no way terminally ill, she feared ending up in the hospital or the nursing home. Perhaps what left her seeing no way out but a dignified end to a long (and, by her account, proud) life was seeing the world around her slip into delirium faster than she was.

Digital Dementia

“While dementia is a disease that typically plagues the elderly, a new type of cognitive condition is affecting younger individuals in their early 20s and teens – a disorder known as “digital dementia.” Digital dementia is characterized as the deterioration of brain function as a result of the overuse of digital technology, such as computers, smart phones and Internet use in general, Medical Daily reported. This excess use of technology leads to unbalanced brain development, as heavy users are more likely to overdevelop their left brains, leaving their right brains underdeveloped. The left side of the brain is generally associated with rational thought, numerical computation and fact finding, while the right side of the brain is responsible for more creative skills and emotional thoughts. If the right brain remains under developed in the long term, it can lead to the early onset of dementia. "Ten to 15 percent of those with the mild cognitive disorders develop dementia," said psychiatrist Park Ki-Jeong. Common symptoms of digital dementia include memory problems, shortened attention spans and emotional flattening.” – New 'Digital Dementia' Plaguing Young Tech Users

Obviously, it's not as easy as reductionist science [ed. – see ‘A Profound Dis-ease’] would have it to separate one aspect of relative unhealth from another, the “emotional” from the “physical” and so on. But clearly all is not at ease with human well-being in the civilised world, and the symptoms commonly described as “neurological” are increasingly prevalent. One study across the Western world, “focusing on the changing pattern of neurological deaths from 1979 up to 1997, found that dementias were starting 10 years earlier – affecting more people in their 40s and 50s – and that there was a noticeable increase in neurological deaths in people up to the age of 74. [T]he speed and size of the increases in just 20 years points to mainly environmental influences.” Here in the U.K., new charities have appeared specifically for young sufferers of dementia and Parkinson's Disease, joining those already responding to surging cancer rates.

Incredibly, it wasn't until 2013 that the authors of the DSM, the official psychiatrists' diagnostic manual, considered 'Internet-Use Disorder' enough of a worldly phenomenon to warrant locking it up into a discrete, individualising diagnosis for that year's edition (complete with the usual standardising 'solutions'). By around that time, others were estimating 5-10% of internet users to be addicted; as in, “unable to control their use”. In South Korea, home to the world's largest population of internet users, addiction has been recognised across age groups as far back as the '90s. It was there that the term 'digital dementia' was coined, designating a deterioration in cognitive abilities of a kind more commonly seen in people who have suffered a head injury or psychiatric illness. South Korean doctors have since reported a surge among young people who have become so reliant on electronic devices that they can no longer remember everyday details like their phone numbers. By the time the DSM had published their diagnosis, the proportion of people aged 10-19 who use their smartphones for more than seven hours every day was close to 20%, with children more likely than adults to suffer “emotional underdevelopment” because their brains are still growing.

In Korea, as in other Asian countries such as Taiwan, addiction among the young to gaming, social media and virtual realities is recognised as a national health crisis. But from where we are, you needn't travel that far to see the withdrawal symptoms of nervousness, anguish and irritability when kids (and not only) are separated from their devices. As the age-range of “digital natives” grows, their maladies become more recognisable and widespread.

Generation App

“[Howard Gardner and Katie Davis explore] how young people view themselves and their relationships when smart devices are nearly ubiquitous, social rites happen via text message and the currency of popularity is traded in likes and comments on social-sharing apps. […] Gardner and Davis ask whether modern social networks are larger yet shallower than those of their parents and grandparents[...] The app mindset, they say, motivates youth to seek direct, quick, easy solutions – the kinds of answers an app would provide – and to shy away from questions, whether large or small, when there’s no “app for that.” […] But the external polish often hides deep-seated anxiety, outwardly expressed as a need for approval. In their conversations with camp counselors and teachers, Gardner and Davis were repeatedly told that youth today are risk-averse; the app generation, said one focus group participant, is “scared to death.”” – Is There an App for That?

In Londonderry, Northern Ireland, one primary school has turned to speech and language therapy to try to 'rehabilitate' children of three or four years old who have become dependent on tablets and smartphones. “We find that they are less communicative. They prefer their own company,” reported a teacher. “When we give them blocks to play with you find them using them as pretend iPads or phones.” The therapist herself recounted it as “a general trend throughout the schools I go to. […] Attention, listening and turn-taking are necessary skills and they just don't have them.”

Meanwhile, a sizeable chunk of those who have reached youth or adolescence casually report themselves to be pretty much always online through one device or another (or even several simultaneously). However, a good few also report their disenchantment with this “new normal”. Goleman cites one student who “observes the loneliness and isolation that goes along with living in a virtual world of tweets, status updates and “posting pictures of my dinner”. He notes that his classmates are losing their ability for conversation, let alone the soul-searching discussions that can enrich the college years. And, he says, “no birthday, concert, hang-out session, or party can be enjoyed without taking the time to distance yourself from what you are doing” to make sure that those in your digital world know instantly how much fun you are having.” Many who have interacted with those raised in digital immersion comment on the devastating impact it has had on adventurousness and imagination; how many of today's teens have never been lost (literally or metaphorically), nor seen the point in random walks or other ways of building resilience and independence. By short-cutting the exploratory path to knowledge via discovery, a host of apps and search algorithms diminish engagement with the world and lead to standardised possibilities.

The costs of all this digital engagement surpass the obvious deficit in face-to-face interaction which leaves Generation App unable to pick up on the nuances of non-verbal communication. To return for a moment to the Far East: in some countries there, as many as 90% of children are deemed short-sighted (myopic), up from under 20% just a couple of decades before – a significant increase in time spent indoors (and, more than likely, plugged in) is the suspected cause. In the West, around one person in three is now myopic. A recent survey of children in the U.K. found that a fifth of them didn't play outside at all on an average day, while one in nine hadn't ventured into environments such as parks, forests or beaches for over a year. It was noted that, based on the same study, three-quarters of children in Britain spent less time outside each day than the one-hour guideline which the United Nations advises for prisoners [ed. – though, it must be said, this can regularly be denied to inmates in reality]. It's probably unnecessary for us to use up space here detailing all the profound spiritual and psycho-social intelligences left undeveloped or unengaged as a consequence [ed. – see ‘The Stories Which Civilisation Holds as Sacred’], besides the more limited “health” ones as commonly recognised.

We could continue at length about the results of this increase in sedentism: diabetes turning from a rare disease into a pandemic in the industrialised world; the links between WiFi signal exposure and cancer, reduced fertility, decreased ability to concentrate, and disturbed sleep; or the specific deleterious effects of computer-time in general. But for the purposes of this essay we'll now turn to a modern sickness of another kind.

Information Pollution

““The pace of life feels morally dangerous to me,” Richard Ford, the novelist, wrote six years ago. It has only gotten worse since then, complains David M. Levy, a victim of information overload who is also a computer scientist at the University of Washington’s Information School. Levy is all but helpless, he says, when new e-mail arrives. He feels obliged to open it. He is similarly hooked on the news, images and nonsense that spill out of the Internet. He is also a receiver and sometimes a transmitter of “surfer’s voice,” the blanched prattling of someone on the phone while diddling around on the Web. “We are living lives of Web fragments,” he said. “We don’t remember that it is part of our birthright as human beings to have space and silence for our thoughts.” [He admits this affects not just him but,] in his view, most of the developed world.” – Information Sickness

It was in 1981, long before the internet and the rise of the virtual, never-off, always-connected world, that the novelist Ted Mooney coined the phrase 'information sickness', and today many of us are not only receivers but often to some degree transmitters of this white noise of data overload. Indeed it has almost become a social expectation in the fast-moving blur of this stage of modernity that we be present in a media environment that more and more becomes 'the environment', that we participate in the never-ending conversation about nothing, and respond. The weight of blocks of information hurtling towards us like a Tetris game leaves us too little time simply to reflect on what they really mean, while the constancy of paths these interruptions can now take to reach us (sitting in most Western consumers' back pockets at all times) scatters our thoughts, weakens our memory, and makes us tense and anxious.

To bring us back to the question of the message and its medium: Jerry Mander referred in decades past to his early stance against the television, continuing his attempt to understand “what was happening to the way that we think and understand information in the television age; our minds were being channeled and simplified to match the channeled and simplified physical environment – suburbs, malls, freeways, high-rise buildings – that also characterized that period (and continues to do so today). This effect would take place, I argued, even if the violence and sex shows and the superficial comedies and the game shows were all removed from the medium, because the process of moving edited images rapidly through a passive human brain was so different from active information gathering, whether from books or newspapers or walks in nature. As a result people would become more passive, less able to deal with nuance and complexity, less able to read or create. People would get “dumber,” and have less understanding of world events even within an exploding information environment.

[…] In our society, speed is celebrated as if it were a virtue in itself. And yet as far as most human beings are concerned, the acceleration of the information cycle has only inundated us with an unprecedented amount of data, most of which is unusable in any practical sense. The true result has been an increase in human anxiety, as we try to keep up with the growing stream of information. Our nervous systems experience the acceleration more than our intellects do. […] As information is moved through different channels its character and its content change; political relationships, concepts, and styles change as well. Even the human spirit and human body change. Because of the way television signals are processed in the brain, thought patterns are altered and a unique, new relationship to information is developed: cerebral, out-of-context, passive.”

Our very faculties of memory are now shifting significantly to accommodate the online medium. David Brooks commented on it thus: “I had thought that the magic of the information age was that it allowed us to know more, but then I realized the magic of the information age is that it allows us to know less. It provides us with external cognitive servants – silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.” What he here celebrates as a liberation strikes us more as an evacuation, an emptying-out of our imaginative capabilities and an increased dependence on depersonalised machine inputs. “We are becoming symbiotic with our computer tools,” one research group at Harvard concluded, “growing into interconnected systems that remember less by knowing information than by knowing where the information can be found.” Some, such as Peter Suderman, identify how the Net “teaches us to think like it does,” arguing that “it's no longer terribly efficient to use our brains to store information.” For those of us who consider encounters with the unknown – and all the tangents, encounters and experiences that follow – to be a vital part of any process of knowledge-constitution, the 'Googlisation' of increasingly precise search results can only speak of another narrowing, another dumbing-down.

The 'human resource' managers and technocrats are often aware of the destabilising effects of this information-overload on the smooth functioning of capitalist labour; hence studies' recommendations for office workers to take time between tasks away from computer work or diddling around the 'social networks' to walk in the park, or even just to retire to a quiet room and look at photographs of 'natural settings', giving the restorative powers the researchers wish to instrumentalise time to work their efficiency-boosting magic. However, it's far from clear that there are many stable mechanisms as yet to dissuade employees in the gigantic factory this society has become from repeatedly losing themselves in the endless, mesmerising buzz of the Net; especially when they are conditioned (if not outright expected) to pursue this dependency outside of the traditional workplace.

An aforementioned article uses Levy's perspective to assert that “[i]nformation-polluted people need to organize and protect psychic space and quiet time, Levy believes, much as environmentalists organized in the 1960s to protect wetlands and old-growth forests.” The implication of this statement seems explicit: that the defeat of these 'previous' struggles must be not only acknowledged (which, thus far, of course it must) but also accepted, and the survivors must retreat one trench deeper into anthropocentrism [ed. – see Return Fire vol.2 pg11] to defend something identified as a separable, essential human quality. Yet, outside of this reductionist framework, what is the psychic space formed between a digitally-intoxicated breed of humanity and its relations, not with sun-dappled glades, the flash of the deer or our reflection in the brook, but with the myriad screens it has raised between itself and its world?

Techno-Industrial Enclosure

“Now and in the future, everything must be in its place. Wonder would break a frantically desired monotony, a sorry excuse for life, where the daily humdrum is broken by the ceaseless melodies [ringtones] that resound everywhere (from delirious concerts in non-places like the subway, to the solitary symphonies in the most unexpected places like at night at the top of Stromboli [ed. – a volcanic island in the Tyrrhenian Sea near Sicily]). The desire is to know everything – place, time, activities – in order to cry: I am here, I am there, no problem, no worry, nothing unknown; the buried desire for the unknown is utterly dead, replaced by security. Because waiting is no longer part of this life, capital urgently needs space and time to be occupied; and no squandering is allowed, no elaboration of fantasy is tolerated except that of accumulating more; no misunderstanding, no anticipation lived with passion, determined by desire, sought after in itself for the satisfaction it brings.” – Mobile Prosthesis

Surely, one of the most ruinous elements of the information-age onslaught has been the hobbling of imagination, on a scale dwarfing the process already begun by the loss of our story-telling to TV. What we can increasingly expect the psychic space occupied by many people to look like resonated with an experiment relayed to us via Bellamy Fitzpatrick on The Brilliant podcast. “[The researchers] felt that today's youth, specifically the teenagers in the case of this study, are so used to being stimulated all the time, are so used to being on telecommunications, are not used to sitting with their own thoughts (as crazy as that sounds) – and I would definitely say this applies to a lot of people who are older than this as well – and they wondered whether 'kids today', as the saying goes, could sit and entertain themselves with their own imagination. And it was exciting to me because actually they used that specific word. And so there was a study on 68 teenagers between 12 and 18 who voluntarily spent 8 hours alone without access to any telecommunications (so no internet, no phones, no computer, no TV, no radio) and instead what they were allowed to do during this time were other activities like writing, reading, playing musical instruments, painting, needlework, singing, walking and so on. Out of the 68 only 3 were actually able to go the full 8 hours[...] 3 of the participants described themselves as having suicidal thoughts. 5 had panic attacks. 27 experienced symptoms like nausea, sweating, dizziness, hot flushes and abdominal pain; and everyone described themselves as feeling fear and anxiety. Almost all of them bailed by the second or third hour, and only 10 people were able to go 3 hours before experiencing anxiety.
And so I think they didn't quite go there in the article that I read, but it seems pretty obvious to me the symptoms that they're describing are those of physical withdrawal, those that we are used to hearing associated with substances like cocaine or heroin...” Indeed, growing numbers of teens are apparently feigning symptoms of so-called Attention Deficit Disorder in order to get prescriptions for attention-heightening stimulants to offset the scatterbrain characteristics of their generation, while their parents seek these drugs and those for narcolepsy as routine 'performance-enhancers' to keep up with their jobs.

As we have said, the system's engineers are attentive to these problems, and don't hesitate to encourage their 'resources' to grant themselves the occasional 'digital detox': “[i]nitiatives are blossoming that encourage people to disconnect occasionally (one day per week, for a weekend, a month) in order to take note of their dependence on technological objects and re-experience an “authentic” contact with reality. The attempt proves to be futile of course. The pleasant weekend at the seashore with one’s family and without the smartphones is lived primarily as an experience of disconnection; that is, as something immediately thrown forward to the moment of reconnection, when it will be shared on the Internet” (Google Dégage). At the more lucrative end, users of computer technology are invited to retire to designated 'camps' where, as the arrivals to one such place in California were assured, “the most important status we'll be updating will be our happiness”. Rather than any attempted break with the social paradigm that pushes these technologies as necessary, such efforts generally serve to perpetuate their use by making it more 'sustainable'. It is the 'detox' that is cast as the exceptional time, not the grave effects of intensive digital interfacing; and in the last case the retreat destination sees no need to dispense with the relentless Net-jargon such as the “human-powered search engine” of the camp notice board, or the ominous camp slogan: “Disconnect to Reconnect”: take your break, then back to work.

Many, many more will never have even considered such a 'disconnection', as perturbing as it is for many people now in the post-industrial heartlands to even have a short trip suggested without their devices in tow. This shift first became so noticeable within our generation's living memory with the advent of the modern leash, the mobile phone. At the time, the authors of 'Mobile Prosthesis' analysed how “[t]his great invention isn’t necessary to support a part of the body, but, if anything, a part of the mind. The mobile or cellular phone (this ill-omened name hits the mark so well), this indispensable tool linked to individuals in such a blatantly unhealthy manner, is not just electromagnetic toxicity, nor just a revolution in interpersonal relationships, nor even just a stupid consumerist gadget that fattens the usual pocketbooks as always.

Above all, it is the replacement of that bit of the unknown that this world still reserves for us, the very small wonders of a sought after solitude, of a journey with oneself, of a time away from known and unknown human beings. The terrifying unknown, inconceivable and unimaginable for those who are afraid of their own life, for those who don’t want to cut themselves off from the cord that links them to the other puppets of this little sham theater even for a moment, for those who want to know and inform others about their life, or more accurately about their own and other people’s physical presence.” Not so many years later, the children of today in many cases have exemplified an acceleration of this trajectory (see the last study mentioned above), and the social trend shows no sign of decreasing. Undoubtedly one of the aggravating factors is the prominence which social networking via online platforms has assumed for even those supposedly on the margins of techno-industrial society. 2005-2008 saw Facebook's users increase from 5.5 million to 100 million. By the end of 2015, Kevin Tucker recounted that “23% of the entire global population uses Facebook monthly, that’s up from 20.5% at the end of the first quarter of 2015. Short of fire, this is the most widespread and rapidly acquired social change in the history of the human species. That’s fucking insane.” This is far from a uniquely 'First World problem': the Algerian city of Constantine was only one of the more recent additions to the growing list of places around the world to open a clinic specifically to counter Facebook addiction, in a country whose user base is growing around 10% year-on-year. “In the past,” reflected one writer in 'Points For Further Discussion in the Digital Era', “the idea of abstaining from Friendster or a particular digital social network seemed plausible, to do so simply meant not going on the computer and/or limiting computer use.
Computer use largely took place at a specific site, something that we could essentially choose to interact with. In many cases, that is no longer possible. Over the past few years, the Internet has essentially become all pervasive. Through smart phones, the Internet is everywhere. While there are exceptions outside of so-called “industrialized” countries and among those who cannot afford smart phones, for the most part the discussion is more a question of when people will get the capabilities, not if (see for example, all the efforts to get computers to everyone across the world and to enclose the entire world in the web).

This has all had a real impact on how we relate to each other. Seemingly everything is mediated or interrupted by computer-based communication. There are relatively few private moments left, as shown by the numerous studies that track the phenomena known as “sleep texting” or the numbers of people who admit to checking their phones during sex [ed. – cited in one study as 20% of young adults]. The particular studies matter relatively little, what is important is the way in which this activity has more or less been normalized.”

Connecting to our earlier theme, it would be a mistake to think of platforms as merely facilitating networking activities; instead, the construction of platforms and social practices is mutually constitutive. After going through the social changes wrought by the shift in Western literacy from the habit of reading aloud, and often communally, to the habit of reading silently, Carr turned to the direction he was already seeing in online culture. “Now that the context of reading is again shifting, from the private page to the communal screen, authors will adapt once more. They will increasingly tailor their work to a milieu that the essayist Caleb Crain describes as “groupiness,” where people read mainly “for the sake of feeling of belonging” rather than for personal enlightenment or amusement. As social concerns override literary ones, writers seem fated to eschew virtuosity and experimentation in favor of a bland but immediately accessible style. Writing will become a means for recording chatter.

[…] A striking example of this process is already on display in Japan. In 2001, young Japanese women began composing stories on their mobile phones, as strings of text messages, and uploading them to a Web site, Maho no i-rando, where other people read and commented on them. The stories expanded into serialized “cell phone novels,” and their popularity grew. Some of the novels found millions of readers online. Publishers took notice, and began to bring out the novels as printed books. By the end of the decade, cell phone novels had come to dominate the country's best-seller lists. The three top-selling Japanese novels in 2007 were all originally written on mobile phones.

The form of the novels reflects their origins. They are, according to the reporter Norimitsu Onishi, “mostly love stories written in the short sentences characteristic of text messaging but containing little of the plotting or character development found in traditional novels.” One of the most popular cell phone novelists, a twenty-one-year-old who goes by the name of Rin, explained to Onishi why young readers are abandoning traditional novels: “They don't read works by professional writers because their sentences are too difficult to understand, their expressions are intentionally wordy, and the stories are not familiar to them.” The popularity of cell phone novels may never extend beyond Japan, a country given to peculiar fads, but the novels nevertheless demonstrate how changes in reading inevitably spur changes in writing.”

Similarly, the so-called 'social' behaviour conditioned and reproduced on the online networks could be said to be at least in part produced by these means themselves. In this whole internet-social world, where the interactions between humans which have generally been so consequential in the past are relegated to shadow-presences that can be summoned up or banished with a flick of the wrist and a click of the finger, the broadcast becomes the key point, not necessarily the quality or relevance of the content itself. Yet simultaneously, the image created by the user of a social media profile is often intensively combed, with presentation of an identity (or, as we shall see later, a brand) at least as important as ostensible communication needs. The identity models generally conform to pre-existing roles even if from a widening pool of potential uniforms to wear. “The potential employee deletes last night’s drunken party photos to present a serious tone, while the frat boy eagerly shares photos of the previous night’s debauchery. Moreover, depending on the particular social network, the presentations differ. While “compartmentalization” is something we all have done in civilized social contexts for quite some time, the speed and frequency at which it happens is different. The constant maintenance of how we present ourselves results in a compulsive “need” to “check” everything, seeing what is “happening” on “social media” at all times. There is always something better “happening” elsewhere, whether that be the cool event that we didn’t know about or something “happening” entirely in the digital realm. Consequently, the real “event” may not be the one that we are physically at, but the “conversation” that happens online. “Reality” is increasingly redefined as that which is documentable online, and “conversation” is the “discussion” which happens through social media. 
Something is always happening elsewhere and we are never really present anywhere (while at the same time, we are stuck in a seemingly ahistorical constant present)” (Points for Further...).

Documentation replaces experience. The self becomes the selfie. Moreover, the celebrated 'connectivity' of the information age seems as often to distance us from one another in real terms as well. Already when acquaintances 'connect' in the virtual world, typed exchanges may even feel more intimate than face-to-face conversations, and thus lead people to disclose things they would not dare to in actual presence. But the content itself can never be the same, being dis-embodied thus; losing the give-and-take, richness and depth, of real communication. Jason Rodgers perceived as much in the arrival of texting. “Due to the addition of text messaging the cellular communication is trapped between orality and literacy. It has neither the improvisation and open ended nature of spoken language, nor the complexity and depth of written language. This contributes to a poverty of language. The exchange is constant, yet nearly meaningless. This poverty of language contributes to a poverty of thought.” The rise of Twitter et al. has only compounded this. Proliferating cameraphones add a visual dimension, and the ascendancy of even the most banal pictures trading currency on Instagram etc. merely spectacularises the fact that every selection and representation is indeed an amputation, the context and specificity shorn. An image can tell a thousand lies, the main one being its own objectivity: it is always a viewpoint from a particular place. The feast for the eye on offer speaks of a dissociation from the depth depicted and the present moment slipping away by the second; yet a dissociation that can pull on our heartstrings in a myriad of predictable, robotic ways.

“The media era is also the era of loneliness,” recognised Jacques Ellul even decades before the ever-present Net fully wove its way into our most intimate 'private' spaces and moments. More than half a century since he wrote on the alienating character of society traversing this technological trajectory, social fragmentation and a concomitant rise in the experience of isolation have travelled hand-in-hand with the arrival of TV, mobile phones, the internet. In 2014, Natalie Gil described loneliness in the U.K. as “a silent plague that is hurting young people most”, in response to studies suggesting that 18 to 34-year-olds surveyed were more likely to feel lonely often, to worry about feeling alone and to feel depressed because of loneliness than the over-55s (who at least have services provided on the assumption that they will be lonely in modern Western society).

On the other side of the 'groupiness'-as-euphoria, without the deeper emotional investment and vulnerability of more complicated, in-person relationships, the increased distance and decreased depth that foster mediocrity and narcissism also facilitate racist, (hetero-)sexist and classist attacks which probably would no longer be attempted so often in person in certain societies. (Perhaps this is significant in allowing a pressure-valve of sorts in the interior of a democratic pluralism which frowns on such statements when in company but is in fact built on a foundation of racial ideologies, gender hierarchies and social stratification, which it must adapt and reproduce in order itself to exist.) The self-aggrandising cruelty of this commentary is constitutive of shifting and often anonymous strands of domination, parallel with what was highlighted in one of Alex Gorrion's essays. “The new apparatuses of social networking also begin to quantify informal power (the very informal power that has always held primary importance, even and especially in the institutions of formal power, which could not work without it) in “likes”, “friends”, and “followers”. But this version of informal power is not the kind created by protagonists, it is the kind produced by a mill wheel set spinning by a hundred chained bodies each chasing after their own loneliness[...]

[These are the lost creatures] who fumble around in smug devices looking for love or distraction. They are children who have never learned to read maps or ask for directions, children whose intimate haunts that they never needed to impose on paper in order to navigate have now been thoroughly mapped by the devices they carry with them. The impoverished oral culture that remains has been forced through this new apparatus.” (We could note that these same children will have been conditioned by what the YoungMinds charity in the U.K. describe as an “unprecedented toxic climate children and young people face in a 24/7 online culture where they can never switch off,” citing cases such as the 2012 suicide of 15-year-old Tallulah Wilson.)

Compelled to Communicate

“The cerebral flattening to the preordained schemas of intelligent machines, the homogenization of the cultures of peoples to the new languages of communications and production are the aim of the new imperialist colonialism. Cybernetic universalism, or multimedia communication, is a tool of the systematic and quantitative reorganisation of the new world order, in the sectors of the market, of capital, of the institutional order and of the territorial infrastructure...” – Pippo Stasi & Karechin Cricorian

While the apparatuses of power we described in the previous section could by now be considered to some degree self-regulating and self-replicating, the more explicit institutions of the capitalist order and the nation-state certainly hold a stake in the new technological phase industrial society has entered. We will come in short order to the tech-industry giants themselves; but what we are speaking of here runs deeper: taking for granted the involvement of such multinational corporations in an ongoing change of such proportions and far-reaching implications for the future, it penetrates into a tangled complex of statecraft, scientific research and ideology, and perhaps even technological determinism itself.

While it can barely be done justice here, the term 'cybernetics' must be introduced in order to frame the topics which follow. “Cybernetics,” defined Lutz Dammbeck on the conceptual level, “is concerned with how the transfer of information functions in machines and living beings. The basis of cybernetics is the assumption that the human nervous system does not reproduce reality, but calculates it. Man [sic] now appears to be no more than an information-processing system... thought is data processing, and the brain is a machine made of flesh. The brain is no longer the place where “ego” and “identity” are mysteriously created through memory and consciousness. It is a machine consisting of switching and controlling circuits, feedback loops, and communication nodes.” In terms of potential ways to understand how this plays out today (and to trace its background), bear with us through a lengthy quote, where the authors of 'Google Dégage' speculate that “at the same time that the new communication technologies were put into place that would not only weave their web over the Earth but form the very texture of the world in which we live, a certain way of thinking and of governing was in the process of winning. Now, the basic principles of this new science of government were framed by the same ones, engineers and scientists, who invented the technical means of its application [and] laid the basis of that “science” that [the mathematician Norbert Wiener] called “cybernetics.” A term that Ampère [ed. 
– one of the founders of the science of classical electromagnetism], a century before, had had the good idea of defining as the “science of government.” So we’re talking about an art of governing whose formative moments are almost forgotten but whose concepts branched their way underground, feeding into information technology as much as biology, artificial intelligence, management, or the cognitive sciences, at the same time as the cables were strung one after the other over the whole surface of the globe.

We’re not undergoing, since 2008, an abrupt and unexpected “economic crisis,” we’re only witnessing the slow collapse of political economy as an art of governing. Economics has never been a reality or a science; from its inception in the 17th century, it’s never been anything but an art of governing populations. Scarcity had to be avoided if riots were to be avoided – hence the importance of “grains” – and wealth was to be produced to increase the power of the sovereign. “The surest way for all government is to rely on the interests of men [sic],” said Hamilton [ed. – one of the U.S. 'founding fathers', he established the nation's financial system as well as The New York Post newspaper]. Once the “natural” laws of economy were elucidated, governing meant letting its harmonious mechanism operate freely and moving men by manipulating their interests. Harmony, the predictability of behaviors, a radiant future, an assumed rationality of the actors: all this implied a certain trust, the ability to “give credit.” Now, it’s precisely these tenets of the old governmental practice which management through permanent crisis is pulverizing. We’re not experiencing a “crisis of trust” but the end of trust, which has become superfluous to government. Where control and transparency reign, where the subjects’ behavior is anticipated in real time through the algorithmic processing of a mass of available data about them, there’s no more need to trust them or for them to trust. It’s sufficient that they be sufficiently monitored. As Lenin said, “Trust is good, control is better.”

The West’s crisis of trust in itself, in its knowledge, in its language, in its reason, in its liberalism, in its subject and the world, actually dates back to the end of the 19th century; it breaks forth in every domain with and around the First World War. Cybernetics developed on that open wound of modernity. It asserted itself as a remedy for the existential and thus governmental crisis of the West. As Norbert Wiener saw it, “We are shipwrecked passengers on a doomed planet. Yet even in a shipwreck, human decencies and human values do not necessarily vanish, and we must make the most of them. We shall go down, but let it be in a manner to which we may look forward as worthy of our dignity”. Cybernetic government is inherently apocalyptic. Its purpose is to locally impede the spontaneously entropic, chaotic movement of the world and to ensure “enclaves of order,” of stability, and – who knows? – the perpetual self-regulation of systems, through the unrestrained, transparent, and controllable circulation of information. “Communication is the cement of society and those whose work consists in keeping the channels of communication open are the ones on whom the continuance or downfall of our civilization largely depends,” declared Wiener, believing he knew.

[...] Officially, we continue to be governed by the old dualistic Western paradigm where there is the subject and the world, the individual and society, men and machines, the mind and the body, the living and the nonliving. These are distinctions that are still generally taken to be valid. In reality, cybernetized capitalism does practice an ontology, and hence an anthropology, whose key elements are reserved for its initiates. The rational Western subject, aspiring to master the world and governable thereby, gives way to the cybernetic conception of a being without an interiority, of a selfless self, an emergent, climatic being, constituted by its exteriority, by its relations. A being which, armed with its Apple Watch, comes to understand itself entirely on the basis of external data, the statistics that each of its behaviors generates. A Quantified Self that is willing to monitor, measure, and desperately optimize every one of its gestures and each of its affects. For the most advanced cybernetics, there’s already no longer man and his [sic] environment, but a system-being which is itself part of an ensemble of complex information systems, hubs of autonomic processes – a being that can be better explained by starting from the middle way of Indian Buddhism than from Descartes [ed. – see ’A Profound Dis-ease’]. “For man, being alive means the same thing as participating in a broad global system of communication”, asserted Wiener in 1948.

Just as political economy produced a 'homo economicus' manageable in the framework of industrial States, cybernetics is producing its own humanity. A transparent humanity, emptied out by the very flows that traverse it, electrified by information, attached to the world by an ever-growing quantity of apparatuses. A humanity that’s inseparable from its technological environment because it is constituted, and thus driven, by that. Such is the object of government now: no longer man or his interests, but his “social environment”. An environment whose model is the smart city [ed. – see Return Fire vol.3 pg31]. Smart because by means of its sensors it produces information whose processing in real time makes self-management possible. And smart because it produces and is produced by smart inhabitants. Political economy reigned over beings by leaving them free to pursue their interest; cybernetics controls them by leaving them free to communicate.”

In this light, what would our enmeshment in the circuits of the world of the web (and not only) tell us about our propensity to become governable; even (or especially) as we take this access to be evidence of our freedoms, our connections, our selves?

These are not popular questions to ask in today's climate in the West, let alone hazard answers to. Yet some qualms, if undeveloped as yet, can be perceived even in popular culture, such as the thoughts of novelist Benjamin Kunkel. “The internet, as its proponents rightly remind us, makes for variety and convenience; it does not force anything on you. Only it turns out it doesn't feel like that at all. We don't feel as if we had freely chosen our online practices. We feel instead that they are habits we have helplessly picked up or that history has enforced, that we are not distributing our attention as we intend or even like to.” More dominant, though, is an enduring belief that these vaunted new technologies not only can be understood as separate from the institutions and ideologies from which they emerged; but that they are in some way inherently 'progressive', liberatory even. Among the ranks of these techno-utopians (or at least among those who consider technologies to be inherently value-free and neutral) can be found not a few staunch critics of capitalist social relations, and maybe even of the State-form itself. Now would seem as appropriate a time as ever to turn our weapons on these arguments.

Updated Illusions

“The truth is that technology magnifies power in general, but the rates of adoption are different. The unorganized, the distributed, the marginal, the dissidents, the powerless, the criminal: they can make use of new technologies faster. And when those groups discovered the Internet, suddenly they had power. But when the already powerful big institutions finally figured out how to harness the Internet for their needs, they had more power to magnify. That’s the difference: the distributed were more nimble and were quicker to make use of their new power, while the institutional were slower but were able to use their power more effectively. So while the Syrian dissidents used Facebook to organize, the Syrian government used Facebook to identify dissidents.” – Power in the Age of the Feudal Internet

Never before has such a hoard of data existed on so widely-accessible platforms concerning the aspects of the world today we might consider to be horrors. Rapes, climate-induced flooding [ed. – see Return Fire vol.2 pg15], hostage beheadings, industrial 'disasters' [ed. – see Return Fire vol.1 pg28] and police violence come tumbling out of our news-feeds and video-tubes, circumventing censorship and State borders. And yet never has so little been done relative to the immensity of the dangers we face. On the one hand, some positively see the potential for this visibility to spark revolts against whatever atrocity in question, rebellions of the type that have not been lacking throughout pre-digital history [ed. – see Return Fire vol.3 pg87], if yet to be decisive. On the other hand, others see the mere existence of this 'democratisation of information' as a counter-balance to the excesses of our rulers. Both seem to rest on an assumption which we ourselves do not find to be true: namely, that there is a simple causal relationship between information and action. However, another angle to take would be that uprisings continue to exist despite the prevalence of digital media (including their protagonists' own use of it), not because of it; and that the feast of information famishes our appetite to weaponise and make use of it, to make it our own.

For example, the online patterns of media consumption seem geared in the opposite direction to reflective engagement. A study some years ago reported that most web pages are viewed for ten seconds or less. Fewer than one in ten page views extended beyond two minutes, and a significant portion of those seemed to involve unattended browser windows left open. And as mentioned above, when the floodgates of information overload are running full-steam, if you don't have time, or make time, to live with that information, to reflect on it, it can simply have a numbing effect, or tend towards imparting pre-packaged options rather than critical thinking. How often do we come across some ostensibly exciting or horrifying case, or convincing or intriguing argument, online; only to promptly forget all about it until we are reminded again while back online? Obviously this isn't the case in every instance, but its regularity should tell us something about how little this 'information' is finding ways to sit in our daily lives, when it is so hard to find time and space to make use of it – and specifically to make use of it with any depth of reflection. Combined with a 'social' life increasingly consisting of remotely exchanging banalities, the result is often individuals sitting alone staring into screens, 'Liking' topics that momentarily engage them or events they may or may not attend, then going to bed. Even when we do meet face to face, it sometimes feels harder to practice our being-together, to develop a tangible sense of encounter and openness not defined by the exigencies of our mediated communications (texts, tweets, comments, etc.).

The results are visible in many of the modern so-called 'social movements', which often feature highly tech-savvy elements perceived by some to be important or even pivotal aspects of whatever struggle. This affects many on-the-ground activities: banners and placards made more for the camera than for street-level communication, the reduction of dialogue between participants and bystanders to the promotion of a specific hashtag, and the further 'dumbing-down' of ideas in order to produce text for leaflets that can easily be 'scanned'. Whatever creativity and spontaneity remains in moments of contestation is domesticated on the spot via the reduction of whatever intervention into representational data to be broadcast via the media, however self-published. Again, the platforms themselves alter the way struggles are conceived and received, regardless of the content, and the more dependent movements become on them the less likely they seem to be to criticise them. Kevin Tucker looked back on what he saw as the beginnings of this shift (in North America at least). “Through the anti-globalization movement and street riots that take root in the late 90s through the 2000s, you saw this element of involvement form into spectator roles. There was a change in focus on taking part in resistance to documenting everything. Suddenly Indymedia [ed. – independent self-publishing platform formed originally to facilitate and communicate action against the World Trade Organisation summit in Seattle, U.S.A., 1999] was the focus. There were certainly pros to it, but at the time it felt like it stole the spotlight a bit. In hindsight, it absolutely did.

And it made sense in a way, as repression raised[,] the need to document it was important. But in some ways we made the documenting the story, not the means. The spread of the internet was really the necessary piece of the puzzle to make that happen. I’m not sure if you can say it’s coincidental or not, but there’s a mirroring of shifts within the milieu and the culture at large towards a more internet savvy approach to radicalism.”

What kind of movements are created through such a shift? How are they different from what came before? These were the questions asked by Zeynep Tufekci, after she identified their lack of attention-maintenance and staying power. “The boom and bust cycle of consciousness-raising and resignation may only be a phase in the life of networked social movements. Or, it may be their distinct feature. […] Digital infrastructure may be said to follow a trajectory common to other disruptive technologies. Governments’ initial waves of ignorance and misunderstanding quickly gave way to learning about the medium’s strengths and weaknesses, as well as the development of new methods to counter dissent. However, changes to a movement’s capabilities that broaden its ability to coordinate actions or to publicize its cause are real as well. [...] Social media have greatly empowered protesters in three key areas: public attention, evading censorship, and coordination or logistics. Old forms of gate-keeping, which depended on choke point access control to few broadcast outlets, neither work as effectively nor in the same way as they did in the past. Digital technologies provide a means by which many people can reach information that governments would rather deny them. Street protests can be coordinated on the fly. However, this does not mean that social media have exclusively empowered protesters; they have also aided governments and other factions of society by providing them with tools they can also use to their advantage. […] By allowing protesters to scale up quickly, without years of preparation, digital infrastructure acts as a scaffold to movements that mask other weaknesses, especially collective capacities in organizing, decision-making, and general work dynamics that only come through sustained periods of working together.

[…] Hence, digital technologies certainly add to protester capabilities in many dimensions, but this comes with an unexpected trade-off: Digital infrastructure helps undertake functions that would have otherwise [required] long-term organizing which, almost as a side effect, help build organizational capacity to respond to long-term movement requirements. Working together to take care of the logistics of a movement, however tedious, also builds trust and an ability to collaborate effectively. Consequently, many recent movements enter into the most contentious phase, the potential confrontation with authorities, without any prior history of working together or managing pivotal moments under stress.” After looking to the insurgencies of Turkey, 2013 [ed. – see Return Fire vol.2 pg48], and in the so-called Maghreb, 2011 onwards [ed. – see Return Fire vol.2 pg87], she used the analogy of the 1963 March on Washington during the U.S. civil rights movement. “Once the march happened, it was no longer just a march of thousands of people, but rather, it signaled to those in power that an organizational capacity could threaten their interests[...] In contrast, the massive Occupy marches that took place globally in over 900 cities on 15 October 2011 dwarfed most historical precedents in terms of size, yet were organized with approximately two weeks’ notice [but] without similar organizational capacity. While this appears a shortcut for protests, it also engenders weaknesses, as these protests do not signal the same level of capacity as previous protests, and do not necessarily pose the same threat to governments and power.”

Moreover, for those of us less interested in being boxed in and defined by whatever social movements our actions are unavoidably in the context of, it is harder to avoid exactly such an enclosure. Relatedly, the text 'Fighting in the New Terrain' touches on the way that “the internet has transformed anonymity from the province of criminals and anarchists into a feature of everyday communication. Yet unexpectedly, it also fixes political identities and positions in place according to a new logic. The landscape of political discourse is mapped in advance by URLs; it's difficult to produce a mythology of collective power and transformation when every statement is already located in a known constellation. A poster on a wall could have been put up by anyone; it seems to indicate a general sentiment, even if it only represents one person's ideas. A statement on a website, on the other hand, appears in a world permanently segregated into ideological ghettos.” Once more, this finds resonance in 'Point for Further Discussion...': “The rather laughable digital utopianism has proven to be untrue – we haven’t arrived at an equal society as a result of equal access. Even in the best cases of open source tools, their challenge is a drop in the bucket and they can often be just as easily mobilized towards non-liberatory ends. Moreover, the Internet and computer technologies have contributed to a situation of information overload and the fragmentation into a seemingly unlimited number of different identities, making it harder than ever to be seen on the digital networks, arguably the ultimate goal. Added to this, the increasing fragmentation and personalization – enabled through sophisticated forms of behavior and browser tracking – assure that there is no universally accessible network that one can simply have access to, but rather a series of largely closed and overlapping networks.
These technologies extend the logic of computers into all realms: success is the documentable and quantifiable number of “friends” or “connections” we have on various sites, future activity, preferences, and “personalization” are predicted by algorithms informed by massive amounts of stored personal data, and everything is ranked and rated.”

To address those who feel that the mere existence of information in circulation constitutes an effective check on those in power: information is weightless without the will and ability to make something out of it, contrary to the narrative of truth-as-power promoted by, say, the Wikileaks case [ed. – see Return Fire vol.3 pg48]. Video footage taken of the police, as another example, can help them refine their public image by deterring them from doing things that look bad in the representational game of liberal democracy. But that's different from actually enabling people to take action that would change the power differential, and has in some cases been used to strengthen their case for the increasingly-present bodycams they wear, leading to a further intensification of surveillance at points of potential confrontation. These days we are endangered additionally while confronting our enemies by the plethora of mobile filming devices wielded by members of the crowd, most of whom will not be as obliging as those the Mi'kmaq warriors and their allies requested to turn off all such equipment before torching the police cars enforcing further extraction prospecting on their territories [ed. – see Return Fire vol.2 pg61].

Another argument used in favour of utilising digital platforms during social movements, often to the detriment of more embodied communication and encounter, is that those who don't engage in that way will be 'left behind' the (real or imagined) 'masses' who are attentive to whatever issue is in question. That may be so (though such thinking clearly prioritises quantitative aims, i.e. the number of people 'reached', over qualitative factors such as the depth of the communication and the solidity of any affinities discovered), yet it would seem one danger of 'catching up' via uncritical engagement is that it also advances the evolution of digital media out of our hands. The ubiquitous and mostly either banal or highly-toxic comments sections many websites now host started out as an innovation of the Indymedia network, while the SMS text messaging program developed by the Institute for Applied Autonomy for protests at the Democratic and Republican National Conventions served as a model for Twitter.

Ironically, given all the talk about the diversity offered by the internet, many anarchists and (other) radicals – even many who reject digital optimism – seem compelled to opt for the convenience of the all-encompassing Facebook et al. in the 'informational mainstream' above autonomous channels. This largely seems to facilitate the continuing ghettoisation of radical critiques into just another identity niche online, another status in your profile, and to accelerate the further fracturing even within these critiques into a series of silos in which one can be confident of hearing only voices similar to one's own.

Rather than bask in the escape from the artificially-narrow debates which have characterised mass media paradigms in the years gone by (in many ways having been the glue that held the democracies of latter modernity together) – which social media indeed moves away from – we would do well to think about how the production of opinions still takes place in this new democratic terrain. As we've seen in past weeks, a candidate can win the U.S. Presidency despite the hostility of almost all mass media nationally, suggesting that social media platforms now command greater influence than these institutions. But of course, rather than signifying any kind of horizontalism or levelling of power, enormous disparities in influence, presence and resources continue to characterise the social network terrain, making it perhaps more accurate to describe it as a polycentralisation of these spheres rather than decentralisation. More to the point, the ideology of democratic pluralism which these technological platforms sit comfortably within declares any opinion (liberal, conservative, anarchist, feminist, capitalist) to be equally valid – so long as it remains just that, opinion. Hence the departure from a central stage of social discourse and 'fact production' actually in this case speaks of a further atomisation – these various online niches never need cross one another, people are used to any opinion having a homepage and set framework, and thus actual debate and contestation of ideas (i.e. tools, toys or weapons we might take in our hands and actually use) becomes more difficult or ephemeral. Rather than (for the most part) censor online activity, today's and tomorrow's democracy assesses which demographics hold what influence, bring which votes, generate how much advertising revenue and occupy which consumer niche.
Alienation has actually deepened in this context: from experience it would seem that the more fertile spaces for building subversive relationships with an inclination to actually act on our conditions in fact come from disputing different ideas about the world and how we might inhabit it. By annulling space that could give rise to such conflicts and hence potential deepening of analysis and affinity, the web leaves us weaker.

“What I hate about the Internet, of course,” identifies Aragorn!, “is that it has quickly moved from a decentralized cacophony of voices, perspectives, and mediums for transmitting different ideas, into a channeled, mediated, controlled, and censored medium replicating most of the media flaws that led to the popularization of the Internet in the first place. In the context of the anarchist internet this means that the first wave of anarchist controlled internet [sites] have almost entirely disappeared. Anarchist Internet discussion has almost entirely moved to Facebook and/or the ephemeral snapchat, instagram, and twitter contexts.” Sure enough, despite commendable online initiatives (some by him, as well as others) attempting to buck this trend, the atmosphere that accompanies most 'radical' conversational spaces online is one of cynicism, self-policing or total thoughtlessness, with 'winning the argument' by whatever means seemingly taking precedence over all else. “Within a few short years, the internet comment forum transformed into a repressive apparatus,” observed the text 'Robots of Repression', “albeit democratic par excellence. With nearly everyone taking part, internet comment forums created and used within anarchist struggles have become acceptable spaces for the intensification of sectarian divisions based on barely a shadow of critical difference, the proliferation of superficial or aesthetic affinities, snitch-jacketing, rape-jacketing, the publishing of legally endangering information, the compromising of anonymities, the erosion of solidarity and its replacement with flippancy and instant gratification, and a deepening of the culture of TLDR [Too Long; Didn't Read].”

Even if social network sites and comment boards fail to ensnare us, it's just as easy to allow oneself to become intoxicated by the update stream of the specifically-anarchist online media. Our contemplative and creative ways, which have at times distinguished anti-authoritarian interventions in aspects of social life, succumb to the constant hum of the information exchange (often hyping formulaic and under-contextualised events/actions), and we become much like many other surfers experiencing momentary thrills on their topic of choice. This is perhaps an under-evaluated part of the conceptions of 'anarchisms of action' (often with many exciting qualities, to be sure) which have come to the fore in recent years. Aside from the perfectly evident strength which often comes from recognising hearts in some more-or-less distant part of the world beating to a similar rhythm to our own, it's useful to question what effects the cultural 'groupiness' these mediums inculcate in us can have on our struggles. Maybe never before have we 'performed' on a stage where the 'audience' is so many (and often probably so exclusively) other anarchists, even if none exist locally, rather than primarily inhabitants of whatever social environment we frequent.

While we recognise that complex factors both cause and result from our actions – as well as accepting the socialised or perhaps even just all-too-human subliminal drive for recognition – and thus feel no need to ascertain 'pure' motives to act, we should be conscious of the potential for such actions to be taken mostly for the sake of being able to participate in a virtual arena by claiming them. Or at least, when this is to the exclusion or detriment of attempts to affect our more daily surroundings and conditions.

At what point does it become less about spreading signals of solidarity to bolster an actual projectuality, or descriptions of methods used – which all strengthen us in real-world struggle – and more a question of self-gratifying web-games? Clearly this must be evaluated on a case-by-case basis, without generalisations, but we think that Antonio Antonacci [ed. – see Return Fire vol.3 pg71] might have meant something of the kind when he said that “[p]ersonally I have several concerns on projectual aims and spectacular propaganda. Even if I recognize that these can have some potential, I also think that they belong to the society of appearance, based on nothing and immersed in a time of hyper-information where the centralization of the will to communicate, or an excess of communication, risks creating confusion and degenerating into exaltation as an end in itself.” This new terrain feels seductive, and doubtless holds some potentials; and anyway, like it or not, it is the wider sea many of us now swim in. In part of their written contribution to a 2013 gathering at the Nadir anarchist space in Thessaloniki, Greece, on the topic of anarchist 'counter-information' structures to disseminate action claims, news and analysis, the administrators of 325.nostate.net argued that “we believe that the information war is a defining operational environment for the anarchist new urban guerilla as much as the metropolis or the border between the urban and rural areas was for revolutionaries of the past.

[…] We want to make it very easy for those who hear of the direct actions via the mainstream media to easily find the communiques and context for the attacks, and for the informal counter-information groups to be able to grow and steadily produce the environment for widespread subversion. The access to information must be turned into a weapon against the system, which relies on its dominance of the media.” Yet later in the same paragraph they admit that “[n]ot only is the new media environment increasingly self-published, it's able to take in and assimilate all points of view, even realities of attack.” In which ways does this interlace with the aforementioned tendency towards democratic assimilation and ghettoisation? How can we maintain a presence to provide context for actions and such in the digital realm, while minimising the degree to which it is merely assimilated as another 'edgy' aesthetic for a distinct class of viewers, and robbed of its proper repercussions? It would indeed be a wasted opportunity if, when conditions hint at chances to push any uncontrollable situations into a direction amenable to the experimental forms-of-life we want to realise but perhaps also generalise [ed. – see Return Fire vol.2 pg19], the dialogue we were most familiar with was publishing self-promoting texts to each other via the Net.

Yet increasingly this would seem to be many people's entry-point for what it is that certain types of anarchists do, as well as the bar for participation. This was a point highlighted in one issue of the Aversión paper: “Internet forces you into constant updating and everything is done at a speed well beyond human capabilities. What’s the point in knowing what happens all over the planet in real time? Our ability of intervention within our nearest reality is very limited in itself. Up to which point does this produce the same anxiety deriving from the speed with which, for example, technology and fashion change, thus losing their previous value and meaning? […] Many of us became anarchist by participating in talks, writing letters to prisoners, reading pamphlets, visiting anarchist libraries, subscribing to periodicals from the other side of the planet, discussing with old saboteurs and fighters, etc… But at the moment formation occurs mainly through blogs and social networks. […] It seems that today the internet includes many aspects of our existence and profoundly affects human relations, thus contributing to isolation, atomization and alienation.” In other words, as many people now 'learn' their anarchism from Wikipedia, forming their ideas from representations at a degree or few of removal from the actual lived complexities of attempts to live inside them, they are radicalised on a terrain only marginally within our actual influence; the form in some ways contradicts the content. Our question must be: in which ways does the Net open up space and in which does it enclose us? In which does it aid self-creation and inspiration, and which entail mere enlistment, or an online space to mouth off discontent to our own demographic?

Upon announcing their resignation from maintenance of the online source anarchistnews.org, 'Worker' observed that “[i]t used to be that anarchism (the set of people who use the term) was filled with a bunch of people who did things. Since the rise of the Internet this has become increasingly NOT the case. My greatest disappointment in running anarchistnews.org is that it has witnessed this degradation of interesting activity of anarchists. The Internet does not inform interesting activity, it kills it stillborn. Most new anarchists fear the attention of the broader anarchist community because it almost never comes off as supportive (and when it does it tends to be in the style of NGO shit sandwich [compliment-insult-compliment] rhetorical kindness). The Internet is now at the center of how we communicate with each other and it means our communication is worse than ever.

While I was not particularly naive about what I should hope for when I started anarchistnews.org I did not realize how powerful the medium of the Internet would become in terms of shaping everything that happened here. It is nearly impossible to start a new DIY website in 2015 and have it noticed beyond your social scene. The big players absolutely dominate what is talked about and I am not motivated to play that part of the modern media game. I find Facebook, Twitter, etc to be absolutely repulsive and, while I use them, I can't support their use and see them as utterly opposed to our project here.” Currently, exactly these corporate platforms are entrusted by a large proportion of general dissidents with the kind of personal information which even the less paranoid among them would never entrust so readily to a national authority. Now we move to the consequences that no radical should be able to treat as a non-issue when internet technologies define so much of our reality: the landslide policing advances they offer.

Inviting Big Brother In

“Computer systems are not, at their core, technologies of emancipation. They are technologies of control. They were designed as tools for monitoring and influencing human behavior, for controlling what people do and how they do it. As we spend more time online, filling databases with details of our lives and desires, software programs will grow ever more capable of discovering and exploiting subtle patterns in our behavior.” – Nicholas Carr

As if it needed saying, our enemies are also active in the digital field in many forms. Tellingly, one of the first people to actually be targeted in Spain by the new (and much-protested) 'Public Safety Act', known colloquially as the 'gag law', was a salesman on Tenerife who chastised the police on the mayor's Facebook wall for being “slackers”. Within six hours of hitting 'send', police were knocking on his door, despite his protests that he wasn't a “perroflauta” (hippy/tramp) like those in the social movements the law was presumably drafted against. More direct interventions against the organisational capacity associated with the new technologies include shutting down service to iPhones and the like within a 'protest area' (similarly to when phone signal for a particularly conflictual part of Berlin was cut during the annual May 1st mobilisation of 2010), but often it seems more in the authorities' interest to monitor such situations than impose a disruption – hence the appearance in the U.S. of white single-engine planes circling flash-points such as Ferguson [ed. – see Return Fire vol.3 pg76], Baltimore [ed. – see Authorities Finally Confirm Stingray (IMSI) Use in Prison Island – in Scottish Prisons] and most recently Olympia during a brief railway blockade to hinder fracking components reaching North Dakota’s Bakken oil fields in solidarity with the Standing Rock camp [ed. – see Special Hydraulic Fracture]. These are thought to be used by the FBI to suck up all cellular communications within their range, presumably for real-time sorting and analysis. The military are naturally attentive to the implications for warfare in the information age and the increasingly asymmetric conflicts of the present day. In a very tangible sense, this already takes forms such as the three U.S. 
guided munitions which destroyed an alleged ISIS headquarters less than 24 hours after the division tasked with combing social media picked up someone's bragging selfie within the base and triangulated from there. But, as General Nick Carter proclaims as part of the drive to make the British Army he heads 'smarter', contemporary military formations recognise that “the actions of others in a modern battlefield can be affected in ways that are not necessarily violent and [new strategy] draws heavily on important lessons from our commitments to operations in Afghanistan amongst others.” Indeed, 'digital warfare' is described as central to British Army operations during this period, with 1,900 extra security and intelligence staff recruited. Two “innovative brigades” consist of regular and reserve troops with expertise in offensive and defensive digital warfare, warriors who don't just carry weapons, but who are also skilled in using social media, and the dark arts of 'psyops' – psychological operations. In this we see the trend towards a blurring of military and policing functions in their 'classical' senses, as part of a trajectory of generalised counter-insurgency [ed. – see Return Fire vol.3 pg12].

Clearly any use of digital tools becomes at the very least a double-edged sword; as people flee from the aftermath of those lauded 'Facebook revolutions' in the Arab world and beyond, in 2015 the European transnational police force Europol started a fresh partnership with the major social media sites to scan for any suspected agents facilitating this flight, under the supervision of none other than the European Counter-Terrorist Centre. To state the obvious, such platforms are in certain terms a godsend to intelligence agencies compared with the work they would have had to do in days gone by to infiltrate target groups. (Narrowing down which individuals to actually target out of the millions is another matter, but it can't be said that the authorities have had no success in this regard, perhaps as the science of network analysis combines with older intelligence efforts.) It's rare these days for governments to attempt the kind of autocratic internet shutdowns (such as the one that saw the last days of the Mubarak regime in Egypt) during social upheavals – though not unknown, as was the case in the capital of the Democratic Republic of the Congo during 2015 anti-regime clashes – when this so clearly furthers the experience of rupture with daily normality and harms economic activity. Perhaps some tweaking is in order, like the trolling footnoted above or the almost complete absence of news about the Ferguson uprising Tufekci reported on her Facebook feed algorithmically-edited for 'personal relevance' (while there was apparently no other subject on Twitter), but the fact of the matter is that these tools are as apt for re-stabilisation as de-stabilisation. See for example the Twitter mobilisation that brought out the volunteers armed with their brooms to sweep away the aftermath of the 2011 riots in London [ed. 
– see Return Fire vol.1 pg61], coordinated by CrisisCommons, a “global network of volunteers working together to build and use technology tools to help respond to disasters and improve resiliency and response before a crisis”. The 'self-organisation' facilitated by these technologies is in no way inherently liberatory.

Ruling parties, corporations and institutions must themselves be adept at playing the social media field, and playing it to their advantage. After those 2011 uprisings across England, the director of its Police Foundation published a piece on the blog of British Telecom (BT). “Moving from a more traditional and stable society to a much faster, consumer-oriented world creates many challenges for the Police. People become disconnected from the communities in which they live and, ultimately, from each other.

This sense of disconnection leaves people feeling insecure which in turn contributes to fear of crime and anxiety about incivility in public spaces. In a world where the rule of law, equality before the law and respect for rights and freedoms provide the glue for a fragmented society, they become ever more essential in sustaining the principle of policing by consent. If the public trust the police as legitimate authority figures, they are more likely to comply with the law and to engage with their community, coming forward to report concerns and wrongdoing.

These challenges formed the opening session of the second annual Police Foundation Conference, ‘Police Effectiveness in a Changing World’, which took place at the BT Centre last Wednesday. It was opened by Stuart Hill, Vice President of Central Government and Home Affairs for BT and included a stellar line up of speakers, including Professor Sir Anthony Bottoms [influential criminologist], Shami Chakrabarti [politician and member of the House of Lords], Sara Thornton [then Chief Constable of Thames Valley Police], Nick Herbert [then Minister of State for Police and Criminal Justice] and Nick Gargan [then Chief Constable of Avon and Somerset Constabulary].

Seldom, if ever, have the police been under such scrutiny – both in a social and a political sense – and it’s widely accepted that they need to protect their operational independence, resisting any political pressure to solve social problems.

They need to use the power of communications and social media to their advantage, working with these innovations rather than against them. The recent riots highlighted how protesters could use social media to move more freely and speedily than police units so a logical response is for forces to establish a Twitter presence and use the medium to gain the trust and confidence of followers.” After that spell of disorder itself, not a few suspected rioters saw prison as a result of their social media activity (as did even some who merely glorified and advocated for the like online, receiving sentences of 2-4 years for Facebook posts).

While the keyboard brazenness of some British insurgents or their admirers from those days perhaps could be partly put down to inexperience and naivety about police monitoring, it is mystifying why many with a greater exposure to criticism of the surveillance State are not more averse to such exposed platforms. In 2012, the Nadir tech-collective noted the same thing: “having worked for years – and sometimes [earning] a living – with the net and with computers, system administration, programming, cryptography and lots more, Facebook comes as something like a natural enemy. [...] We just hadn't realised that, after all the stress out on the streets and all those lengthy group discussions, many activists seem to have this desire to prattle at length on Facebook about everything and with everyone. We hadn't realised that [the activist] along with everyone else enjoys following the subtle flow of exploitation where it doesn't seem to hurt and, for once, not having to resist. Many people suffer from a bad conscience. While this may lead them to anticipate the fatal consequences of Facebook, it does not seem to translate into action. Is it really ignorance? Just to give a short outline of the problem; by using Facebook, activists do not just make their own communication, their opinion, their 'likes', etc. transparent and available for processing. Instead – and we consider this far more important – they expose structures and individuals who themselves have little or nothing to do with Facebook. Facebook's capability to search the net for relationships, similarities etc. is difficult to comprehend for lay people. The chatter on Facebook reproduces political structures for the authorities and for companies. These can be searched, sorted and aggregated not just in order to obtain precise statements regarding social relations, key people, etc., but also in order to make predictions, from which regularities can be deduced.
Next to mobile phones, Facebook is the most subtle, cheapest and best surveillance technology available.

[...] That is why we see Facebook users as a real danger for our struggles. In particular, activists who publish important information on Facebook (often without knowing what they are doing), which is increasingly used by law enforcement agencies. We could almost go as far as accusing those activists of collaborating. But we're not quite there yet. We still have hope that people will realise that Facebook is a political enemy and that those who use Facebook make it more and more powerful. Activist Facebook users feed the machine and thereby reveal our structures – without any need, without any court orders, without any pressure.” The same year they wrote these words, police based their round-up of Bolivian anarchists, syndicalists and feminists largely on information from Facebook profiles [ed. – see Return Fire vol.2 pg68], and five anarchists were jailed in Spain for 'membership of a terrorist group' based on their involvement on Facebook groups. Continuing from their contribution to the gathering in Thessaloniki, 325.nostate.net underline the “urgent and serious need for the insurrectional groups and individuals to stop using regular corporate services (i.e. Yahoo, FaceBook, Gmail, Hotmail, Wordpress, Blogspot, etc.) and learn about basic computer security. This task is urgent for anarchists in all countries but especially those with significantly repressive regimes. These companies will immediately co-operate with the authorities at the slightest excuse/pressure. This must be replaced as much as possible with movement services and encryption. From as early as 2003, at an anti-prisons gathering in Barcelona, it was confirmed by a lawyer of the movement that the European police and security services were using the internet corporations to identify, spy, track and monitor anarchists using their services. This has enabled Europol and the various state police services access to vast amounts of analysis data concerning location, content, who-talks-to-who etc. 
Anarchists are being systematically targeted by the security services through the software they rely on for communication/publicity and we should aim to prevent, as much as we possibly can, their ability to disrupt us. The authorities aim to turn our use of the internet into a weapon against us, through IP [ed. – Internet Protocol address, identifying the location, technical details and service provider of an internet connection] tracking and dataveillance, leading to our prosecution – or attempted neutralisation.”
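The relationship-mapping that the Nadir collective and 325.nostate.net warn of requires no exotic capability on the part of the authorities. Given nothing but metadata about who contacts whom – no message content at all – a few lines of standard code already surface the best-connected 'key people' in a structure. A minimal sketch in Python (the contact data here is entirely hypothetical, and real systems would use far richer centrality measures than this simple degree count):

```python
# Minimal illustration of metadata-only network analysis: each pair
# below is one observed contact between two pseudonymous accounts.
# No content is inspected -- only the pattern of connections.
from collections import Counter

contacts = [
    ("A", "B"), ("A", "C"), ("A", "D"),
    ("B", "C"), ("D", "E"), ("A", "E"),
]

# Count how many contacts each account appears in (its "degree").
degree = Counter()
for x, y in contacts:
    degree[x] += 1
    degree[y] += 1

# The hub of the network emerges purely from who-talks-to-who data.
ranked = degree.most_common()
print(ranked[0])  # ('A', 4) -- "A" is the best-connected account
```

This is only the crudest form of the technique; the "predictions" and "regularities" the Nadir text mentions come from layering timing, location and content analysis on top of the same connection graph, but the graph itself is already revealed by metadata alone.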

Already in France, opening 'terrorist internet pages' can get you two years in prison, while in 2013 the administrators of the anarchist web portal non-fides.fr were accused of “public defamation of public officials” and “incitement to the commission of an attack against a person without effect” for spreading a text denouncing the Parisian 'night correspondents'. (Both comrades refused to cooperate or voluntarily appear for hearings, or give fingerprints, DNA and biometric photographs, stating “we know that this affair is only a pretense for the pigs and the courts to further harass us, after having thrown us in prison for some months in 2011 for another affair, and after about three years of various almost-uninterrupted legal monitoring, during which we theoretically could not see each other, nor leave the country, and were required to check in with the police every week and pay a ransom of €4,000 to the state. All these measures (that affect us as they have impacted other comrades before us and tens of thousands of people everywhere) aim to break us, by isolating each of us from the other and isolating us both from a movement, but also by breaking dynamics of struggle.”)

As cited in the anonymous 2011 text 'Desert', “[a]ccording to a UK military mid-term future projection: “By the end of the period [2036] it is likely that the majority of the global population will find it difficult to ‘turn the outside world off.’ ICT [information and communication technology] is likely to be so pervasive that people are permanently connected to a network or two-way data stream with inherent challenges to civil liberties; being disconnected could be considered suspicious.” We are moving to such a future fast. When the French anti-terrorist police invaded the land community in Tarnac in 2008 [ed. – see Return Fire vol.3 pg58] one of the public justifications they gave for suspecting that a terrorist cell was forming was that few on the land had mobiles!

The agreed convention is that the first step for those who, having planned the future, now wish to bring it about is to make oneself known, make one’s voice heard – speak truth to power. Yet “the listener imposes the terms, not the talker” ['Silence & Beyond']. Much of the low-level contestation that characterises activism, and the limited social spaces that make up counter-cultures, actively mark out areas, and people, in need of potential policing. That’s not to say that all resistance is futile [nor