In the early 1990s, without a real state-based threat, ideas about “net-centric” warfare became popular, with the claim that information effects would be the next Revolution in Military Affairs. To capture the advantage of growing knowledge about the security environment, the Pentagon considered several pre-doctrinal planning constructs: Effects-Based Operations (EBO),[4] System-of-Systems Analysis (SOSA), Operational Net Assessment (ONA), and Systemic Operational Design (SOD). Each posited an effects-based focus, with distinctions but few real differences. The panacea all offered was that the engineering problem of automated data collection via sensors would more effectively translate the “meaning” of military activity into effects on the battlefield. The promise, however, bore little fruit. Nor is the challenge new: attempts to “loop” tactical tasks into forward-leaning strategic impact have a long history.[5] The confusion led then-General James Mattis to eliminate effects-based language while commander of the now-decommissioned Joint Forces Command.[6]

Fast forward beyond the immediate post-Cold War defense waffling to the emergence of non-state actors exploiting weak-state vacuums. The hardware-dominance argument remains, and the information wave prophesied by McLuhan and engineered by Shannon is here. Today, the US military continues to insist that “real” preparation for war means an imaginary near-peer who can never be invaded and whose territory can never be held. Yet multi-domain battle, the Army’s stumbling modernization program, and countless other corpses cast against the acquisition and budget process all have one thing in common: a primary focus on 20th-century territorial war to the exclusion of 21st-century connectedness. There is a reason for this.

War is a Racket

“The Napoleonic era defined the last great leap of warfare for the West in conceptual terms: largely rationalising it through a financial prism, dominated by the doctrine and tactics of Carl von Clausewitz and Antoine-Henri Jomini, and considering the time and space of a conflict as limited by time and space (the more limited the better).”[7]

Let’s reduce war to what it ultimately is: killing and breaking things. Do information effects even matter? Two classical answers: information by way of signaling can prevent a war, and signaling during warfare may reduce casualties through surrenders, peace negotiations, and misdirection during maneuver. In the context of maneuver, intelligence about the enemy and deception are often opposite sides of the same coin; deception uses information to increase or decrease the perceived risk of a course of action. To that end, what utility does the military gain from information effects? The non-kinetic application of information by the US military, when not involved in areas of declared hostility, has a checkered past. And still, war is fundamentally preceded, ended, and carried out by flows of information. What is information?

The history of information since the Second World War is, essentially, the history of Shannon information. Claude Shannon’s reduction of communication and information to an engineering problem was a first, and the ensuing discussion collapsed into two parts: communication versus data.[8] Shannon information is about the uncertainty of the next message in a flow of communication and how machines most efficiently process that uncertainty. Communication, classically, is concerned with “meaning,” and is conflated with data when Shannon writes: “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.” James Gleick describes how confusing the change was once it was introduced, as it seemed to redefine communication itself:

Von Foerster, like Margaret Mead and others, felt uncomfortable with the notion of information without meaning. “I wanted to call the whole of what they called information theory signal theory,” he said later, “because information was not yet there. There were ‘beep beeps’ but that was all, no information. The moment one transforms that set of signals into other signals our brain can make an understanding of, then information is born— it’s not in the beeps.”[9]
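Shannon’s notion of information as uncertainty, stripped of meaning, can be made concrete with a small sketch (an illustration for this essay, not drawn from Shannon’s text): the entropy of a message source measures, in bits, how unpredictable the next symbol is, and it is entirely indifferent to what the symbols mean, which is precisely von Foerster’s complaint.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: the average uncertainty about the
    next symbol from a source with the given symbol probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: one full bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A heavily biased source is more predictable, so each symbol
# carries less Shannon information -- even though the symbols
# themselves are the same meaningless 'beeps'.
print(entropy([0.9, 0.1]))   # ~0.469

# A source that always sends the same symbol carries none at all.
print(entropy([1.0]))        # 0.0
```

The measure rises and falls with predictability alone; whether a brain can “make an understanding” of the signals never enters the calculation.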

Now imagine the confusion as the classic definition of war - the continuation of politics by other means - collides with the sheer volume of both data (information) and communication (the challenge of agreeing on ‘meaning’) across the globe. Contrary to the assumptions of those planning constructs of the 1990s and early 2000s (EBO, ONA, SOD), the present volume of information (signaling) itself can make communication difficult and rarely reproduces meaning exactly as intended at another point. So it is much easier to send a message with hardware - an age-old, tried-and-tested method that has prompted “might makes right” arguments throughout history. Protests against legitimacy by force, viewed live wherever they occur, demonstrate the challenge states face in maintaining a monopoly over the use of force. When the definition of force extends to the use of information to force changes to a social contract, vis-à-vis the brute force of arms, then information effects matter for nation-states.

Measures Short of War

“World War III is a guerrilla information war with no division between military and civilian participation.”[10]

Today’s nation-state faces a resource-constrained prioritization problem of immeasurable order, where information and communication fatigue strains consistent prioritization. US military and political leaders can make a decision only once, under severe uncertainty. Kinetic arguments make results easier to calculate, and because kinetic operations are more easily quantified, they are more easily justified. This is the rationale behind the overall preference for kinetic solutions and the pursuit of reduced uncertainty via bigger data.[11] It has not been enough.