How did we get from Google’s user search of just 20 years ago to an unprecedented form of capitalism that not only predicts but shapes users’ actions in the real world, for the profit of business customers rather than users?

Got it: Pokémon GO mobile game in Berlin · Sophia Kembowski · AFP · Getty

July 2016, a gruelling afternoon. David had directed hours of insurance testimony in a New Jersey courtroom where a power surge had knocked out the air conditioning system. Finally home, the cool air hit him and for the first time all day he took a deep breath, then made himself a drink and headed for a long shower. The doorbell rang just as the water hit his aching back. He ran downstairs, opening the front door to a couple of teenagers waving their phones in his face: ‘Hey, you’ve got a Pokémon in your backyard. It’s ours! Okay if we go back there and catch it?’

‘A what?’ He had no idea what they were talking about. David’s doorbell rang four more times that evening, all strangers eager for access to his yard and disgruntled when he asked them to leave. They held up their phones, pointing and shouting as they scanned his house and garden for the augmented-reality creatures. They could see their Pokémon prey on their screens but only at the expense of everything else. The game seized the house and the world around it, reinterpreting it in a vast equivalency of GPS coordinates. Here was a new kind of commercial assertion: a for-profit declaration of eminent domain with ‘reality’ recast as unbounded blank space to be sweated for others’ enrichment. David wondered: when would this end? What gave them the right? Whom should he call to make it stop?

Neither David nor the game players suspected that surveillance capitalism — an unprecedented, audacious economic logic — had brought them together.

In 1999, despite Google’s new world of searchable web pages, its growing computer science capabilities, and its glamorous venture backers, there was no reliable way to turn investors’ money into revenue. Users provided the raw material of behavioural data, harvested to improve the speed, accuracy and relevance of search and to help build ancillary products such as translation. This balance of power made it financially risky and possibly counterproductive to charge users a fee for search services. Selling search results would have set a dangerous precedent for the firm, assigning a price to indexed information that Google’s web crawler had already taken from others without payment. Without a device like Apple’s iPod, there were no margins, no surplus, nothing to sell and turn into revenue.

Google had relegated advertising to steerage class: its AdWords team consisted of seven people, most of whom shared the founders’ general antipathy toward ads. This changed abruptly in April 2000, when the dot-com economy began its plunge into recession, and Silicon Valley unexpectedly became the epicentre of a financial earthquake. Google’s response to the emergency triggered the crucial mutation that turned AdWords, Google, the Internet and the nature of information capitalism toward a lucrative surveillance project.

User profile information

Emblematic of this, and of the emerging logic of accumulation that would define Google’s success, is a patent, ‘Generating User Information for Use in Targeted Advertising’, submitted in 2003 by three of its top computer scientists: ‘The present invention may involve novel methods, apparatus, message formats and/or data structures for determining user profile information and using such determined user profile information for ad serving’ (1). Google would no longer mine behavioural data strictly to improve user service but to read users’ minds for the purposes of matching ads to their interests, as deduced from collateral traces of online behaviours. New data sets were compiled based on user profile information (UPI) that would dramatically enhance the accuracy of these predictions.

The genius of Pokémon GO was to transform the actual game into a higher-order game of surveillance capitalism, a game about a game

Where would UPI come from? UPI, the inventors wrote, ‘may be inferred.’ Their new tools could create UPI by integrating and analysing a user’s search patterns, document inquiries and other signals of online behaviours, even when users did not directly provide that personal information. The inventors cautioned that UPI ‘can be determined (or updated or extended) even when no explicit information is given to the system.’
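The patent’s claim that UPI ‘may be inferred’ is easy to illustrate. In this deliberately crude sketch (the keyword-to-interest mapping and function names are my invention, not the patent’s actual method), a profile is derived from search terms alone, with no explicit information given to the system:

```python
# Minimal sketch (assumptions mine, not the patent's method) of inferring
# profile information from behaviour alone: search queries are mapped to
# interest categories without the user stating anything about themselves.

from collections import Counter

# Hypothetical keyword -> interest mapping
INTERESTS = {
    "mortgage": "home_buyer", "refinance": "home_buyer",
    "crib": "new_parent", "stroller": "new_parent",
}

def infer_profile(queries):
    """Derive a ranked interest profile from raw search terms alone."""
    hits = Counter(
        INTERESTS[word]
        for q in queries
        for word in q.lower().split()
        if word in INTERESTS
    )
    return hits.most_common()

queries = ["best crib mattress", "stroller reviews", "mortgage rates today"]
print(infer_profile(queries))  # [('new_parent', 2), ('home_buyer', 1)]
```

The point of the sketch is the asymmetry: nothing in the input is a declaration; everything in the output is a deduction.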

The scientists made it clear they were willing — and their inventions were able — to overcome the friction in users’ decision rights. Behavioural data, whose value had previously been consumed in improving the quality of users’ search, now became the pivotal raw material — exclusive to Google — for the construction of a dynamic online advertising marketplace. These behavioural data, available for uses beyond service improvement, constituted a surplus, and it was on the strength of this surplus that the company would find its way to the ‘sustained and exponential profits’ necessary for survival.

Google’s invention revealed new capabilities to infer and deduce the thoughts, feelings, intentions and interests of individuals and groups through an automated extraction architecture that operates as a one-way mirror, irrespective of a person’s awareness, knowledge and consent, enabling privileged secret access to behavioural data. This extraction imperative drove the economies of scale in behavioural data capture that would constitute a world-historic competitive advantage in a new kind of prediction market, where low-risk bets about the behaviour of individuals and groups are valued, bought and sold. The one-way mirror embodies the social relations of surveillance, enforced by the vast asymmetries of knowledge and power that it produces.

AdWords quickly became so successful that it inspired significant expansion of the commercial surveillance logic. Advertisers demanded more clicks. The answer was to extend the model beyond Google’s search pages and convert the entire Internet into a canvas for Google’s targeted ads. Hal Varian, Google’s chief economist, said this meant turning Google’s new skills of data extraction and analysis towards the content of any web page or user action, employing Google’s expanding semantic analysis and artificial intelligence capabilities to squeeze meaning from them. Only then could Google accurately assess the content of a page and how users interacted with it. This content-targeted advertising based on Google’s patented methods was named AdSense. By 2004, AdSense had achieved a run rate of $1m a day and, by 2010, annual revenues of more than $10bn.

A convergence of behavioural surplus, data science, material infrastructure, computational power, algorithmic systems and automated platforms produced unprecedented relevance and billions of auctions. Click-through rates skyrocketed. Work on AdWords and AdSense became just as important as work on Search. With click-through rates as the measure of relevance, behavioural surplus was institutionalised into a new kind of commerce that depended upon online surveillance at scale.
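The role of click-through rates as the measure of relevance can be made concrete with a toy auction: ads ranked by bid multiplied by predicted click-through rate, so the quality of the behavioural prediction, not the bid alone, decides the winner. This is a deliberately simplified sketch, not Google’s production auction, and every figure is invented:

```python
# Illustrative sketch (not Google's actual auction): rank ads by expected
# value per impression = bid per click x predicted click-through rate.
# A better behavioural prediction beats a bigger bid.

def rank_ads(ads):
    """ads: list of (name, bid_per_click, predicted_ctr). Rank by expected value."""
    return sorted(ads, key=lambda a: a[1] * a[2], reverse=True)

ads = [
    ("A", 2.00, 0.010),  # expected value per impression: 0.020
    ("B", 0.50, 0.050),  # 0.025 -- lower bid, better-predicted ad wins
    ("C", 1.00, 0.015),  # 0.015
]
print([name for name, _, _ in rank_ads(ads)])  # ['B', 'A', 'C']
```

In a scheme like this, every improvement in predicting clicks translates directly into auction revenue, which is why behavioural surplus became the decisive input.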

During Google’s 2004 IPO, the world first learned of the financial success of this new market form. Google executive Sheryl Sandberg later became the Typhoid Mary of surveillance capitalism as she led Facebook’s transformation from a social networking site to an advertising behemoth. With Google in the lead, surveillance capitalism became the default model of information capitalism on the web, drawing competitors from every economic sector.

The behavioural surplus upon which Google’s fortune rests can be considered as surveillance assets, critical raw materials in the pursuit of surveillance revenues and their translation into surveillance capital. The logic of this capital accumulation is most accurately understood as surveillance capitalism, the foundational framework for a surveillance-based economic order: a surveillance economy. The big pattern is subordination and hierarchy, in which earlier reciprocities between the firm and its users are subordinated to the derivative project of users’ behavioural surplus captured for others’ aims. Users are no longer the subjects of value realisation. Nor are they, as some have insisted, the product of Google’s sales. Instead, they are the objects from which raw materials are extracted and expropriated to fabricate predictions in Google’s machine-learning factories, predictions that are then sold to its actual customers — the businesses that pay to play in new behavioural futures markets.

Three-day event: 15,000 fans attend Pokémon GO Safari Zone at Tottori Sand Dunes, Japan · Asahi Shimbun · Getty

‘Your whole life will be searchable’

Douglas Edwards, Google’s first brand manager, recounts a 2001 session with the founders that probed answers to ‘What is Google?’ Larry Page ruminated: ‘If we did have a category, it would be personal information... The places you’ve seen. Communications... Sensors are really cheap... Storage is cheap. Cameras are cheap. People will generate enormous amounts of data... Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable’ (2).

Page’s vision perfectly reflects the history of capitalism, taking things that live outside the market sphere and declaring their new life as commodities. In his 1944 narrative of ‘the great transformation’ to a self-regulating market economy, the historian Karl Polanyi described the origins of this translation process in three crucial mental inventions he called commodity fictions: that human life could be subordinated to market dynamics and reborn as labour to be bought and sold; that nature could be translated into the market and reborn as land or real estate; and that exchange could be reborn as money.

Today’s owners of surveillance capital have declared a fourth fictional commodity expropriated from the experiential realities of human beings whose bodies, thoughts and feelings are as virginal as nature’s once-plentiful meadows and forests before they fell to the market dynamic. Human experience is subjugated to surveillance capitalism’s market mechanisms and reborn as behaviour, to be rendered into data, ready for fabrication into predictions that are bought and sold. The new market form declares that serving the genuine needs of people is less lucrative, so less important, than selling predictions of their behaviour.

This changed everything.

The first wave of prediction products depended upon surplus extracted at scale from the Internet to produce relevant online ads. The next wave was defined not just by quantity but by the quality of prediction products. In the race for higher degrees of certainty, it became clear that the best predictions would have to approximate observation. A second economic imperative, the prediction imperative, is the expression of these competitive forces, enlarging the complexity of surplus supply operations as economies of scale were joined by economies of scope and action.

From virtual to actual world

The shift toward economies of scope announces fresh aims: behavioural surplus must be vast, but also varied. These variations are developed in two dimensions. The first is the extension of extraction operations from the virtual world into the real world where we live our actual lives. Surveillance capitalists understood that their future wealth would depend upon new surplus supply routes from our bloodstreams to our beds, our commutes, to our refrigerators, parking spaces, living rooms... nothing was to be exempt.

Economies of scope proceed in a second even more private dimension: depth. The idea is that more predictive, and therefore more lucrative, behavioural surplus will be extracted from the most intimate patterns of the self. These operations are aimed at human personality, moods and emotions, lies and vulnerabilities. Every level of intimacy will have to be captured and flattened into data points for the conveyor belts that proceed toward manufactured certainty. Much of this is accomplished under the banner of personalisation, hiding the aggressive extraction of the private depths of everyday life.

Examples of products to render, monitor, record and communicate behavioural data proliferate, from smart vodka bottles to Internet-enabled rectal thermometers and everything in between. Consider the Sleep Number bed, with its ‘smart bed technology and sleep tracking’. The company also collects ‘biometric and sleep-related data about how You, a Child, and any person that uses the Bed slept, such as that person’s movement, positions, respiration, and heart rate while sleeping’, along with audio signals in the bedroom.

Our homes are targeted by surveillance capitalism, as competitors chase a $14.7bn market for smart-home devices in 2017, up from $6.8bn in 2016 and expected to reach more than $101bn by 2021. You may have encountered early absurdities: smart toothbrushes, light bulbs, coffee mugs, ovens, juicers, and utensils said to improve digestion. Others are grimmer: a home security camera with facial recognition; an alarm system that monitors unusual vibrations before a break‑in; indoor GPS locators; sensors that attach to any object to analyse movement and temperature and other variables; cyborg cockroaches to detect sound. Even the nursery is targeted as a source of fresh behavioural surplus.

The music to make them dance

As competition for revenues intensifies, surveillance capitalists have learned that economies of scope are not enough. Behavioural surplus must be vast and varied, but the surest way to predict behaviour is to interrupt and shape it at the source. Machine processes are configured to achieve these economies of action. Now the global digital architecture of connection and communication is commandeered for this new purpose. Interventions are designed to enhance certainty by nudging, tuning, herding, manipulating and modifying behaviour toward profitable outcomes. Methods may be as subtle as inserting a specific phrase into your Facebook newsfeed, timing the appearance of a BUY button on your phone, or directing you to GPS coordinates in search of Pokémon. They may be as violent as shutting down your car engine when your insurance payment is late. As one software developer told me: ‘We are learning how to write the music, and then we let the music make them dance. We can engineer the context around a particular behaviour and force change that way... We can tell the fridge “Hey, lock up because he shouldn’t be eating” or we tell the TV to shut off and make you get some sleep.’

As the prediction imperative pulls supply operations into the real world, surveillance revenues enthral product and service providers in established sectors. For example, automobile insurers are eager to implement telematics. They had long known that risk is highly correlated with driver behaviour and personality, but there was little they could do about it. A report by Deloitte’s Center for Financial Services counsels risk minimisation — a euphemism for guaranteed outcomes — through monitoring and enforcing policyholder behaviour in real time. Known as ‘behavioural underwriting’, it means that ‘Insurers can monitor policyholder behaviour directly ... by recording the times, locations and road conditions when they drive, whether they rapidly accelerate or drive at high or even excessive speeds, how hard they brake, as well as how rapidly they make turns and whether they use their turn signals’ (3). As certainty replaces uncertainty, premiums that once reflected the necessary unknowns of everyday life can now rise and fall by the millisecond, informed by precise knowledge of how fast you drive to work after a hectic morning caring for a sick child, or if you do wheelies in the parking lot behind the supermarket.

Telematics are not intended merely to know but also to do. They are muscular, they enforce. Behavioural underwriting promises to reduce risk through processes designed to modify behaviour towards maximum profitability. The analysis of behavioural surplus triggers punishments such as real-time rate hikes, financial penalties, curfews and engine lockdowns, or rewards such as rate discounts, coupons and gold stars to redeem for future benefits. The consultancy firm AT Kearney anticipates the Internet of Things ‘can enrich relationships’ to connect ‘more holistically’ with customers ‘to influence their behaviours’ (4).

Motorists ranked and monitored

Spireon describes itself as ‘the largest aftermarket vehicle telematics company’ and specialises in tracking and monitoring vehicles and drivers for lenders, insurers and fleet owners. Its Loan-Plus Collateral Management System pushes alerts to drivers when they have fallen behind in payments, remotely disables the vehicle when delinquency exceeds a predetermined period, and locates the vehicle for the repo man to recover.

Telematics announce new behavioural controls. The insurance company can set parameters for driving behaviour, from fastening the seat belt to rate of speed, idling times, braking and cornering, aggressive acceleration, harsh braking and excessive hours on the road. These parameters are translated into algorithms that continuously monitor, evaluate and rank the driver, calculations that translate into real-time rate adjustments. Surplus is also translated into prediction products for sale to advertisers, as the system calculates behavioural traits for advertisers to target by sending ads to the driver’s phone.

According to a patent held by Spireon’s top strategist, insurers can eliminate uncertainty by shaping behaviour. The idea is to continuously optimise the insurance rate based on monitoring the driver’s adherence to behavioural parameters defined by the insurer. The system translates its behavioural knowledge into power, assigning credits or imposing punishments. A second patent is even more explicit about triggers for punitive measures with algorithms that activate consequences when parameters are breached: ‘a violation algorithm’, ‘a curfew algorithm’, ‘a monitoring algorithm’, ‘an adherence algorithm’, ‘a credit algorithm’.
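The named algorithms suggest a simple control loop: compare telemetry against insurer-set parameters, log breaches, and translate them into rate changes. A minimal sketch of that logic follows; every threshold, surcharge and function name here is hypothetical, drawn from nothing in Spireon’s actual code:

```python
# Hypothetical sketch (not Spireon's actual system) of behavioural
# underwriting: insurer-set parameters are checked against telemetry
# samples, and breaches trigger real-time rate adjustments.

from dataclasses import dataclass

@dataclass
class Parameters:
    max_speed_kmh: float = 110.0
    max_braking_g: float = 0.45   # harsh-braking threshold
    curfew_hour: int = 23         # no driving from this hour

def violations(sample: dict, p: Parameters) -> list:
    """Return the behavioural parameters breached by one telemetry sample."""
    found = []
    if sample["speed_kmh"] > p.max_speed_kmh:
        found.append("speeding")
    if sample["braking_g"] > p.max_braking_g:
        found.append("harsh_braking")
    if sample["hour"] >= p.curfew_hour:
        found.append("curfew")
    return found

def adjusted_rate(base_rate: float, samples: list, p: Parameters) -> float:
    """Continuously re-price the policy: a flat surcharge per breach (illustrative)."""
    breaches = sum(len(violations(s, p)) for s in samples)
    return base_rate + 5.0 * breaches

trip = [
    {"speed_kmh": 95.0,  "braking_g": 0.2, "hour": 14},
    {"speed_kmh": 128.0, "braking_g": 0.5, "hour": 23},  # speeding, harsh braking, curfew
]
print(adjusted_rate(100.0, trip, Parameters()))  # 115.0
```

The same loop could as easily assign credits as punishments, or trigger a lockdown instead of a surcharge; the architecture is indifferent to which.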

David opened his front door that evening unaware that he and the Pokémon hunters were participants in an experiment in economies of action. If they were the rats, then John Hanke was the man in the white lab coat. Hanke wanted to own the world by mapping it. He had founded Keyhole, a satellite mapping startup funded by the CIA that was later acquired by Google and rechristened Google Earth. Hanke became Google Maps’ product vice-president and its Street View boss. By 2010 Hanke had set up his own shop, Niantic Labs, within Google. His new aim was to develop parallel reality games that would track and herd people through the territories Street View had claimed for its maps. Niantic was the company behind Pokémon GO.

Pokémon GO is based on augmented reality and structured like a treasure hunt: the app is downloaded from Niantic, and GPS and smartphone cameras enable players to hunt virtual creatures. The figures appear on the screen as if located in real-life surroundings: an unsuspecting man’s backyard, a street, a park, a drugstore. The idea was that players should ‘go outside’ for ‘adventures on foot’ in the open spaces of cities, towns and suburbs.

Highest-grossing app in the US

Released in the US, Australia and New Zealand on 6 July 2016, Pokémon GO became the most downloaded and highest-grossing app in the US within a week, soon achieving as many active Android users as Twitter. Just six days after release, BuzzFeed reporter Joseph Bernstein advised Pokémon users to check the amount of data the app was collecting from their phones. The news site TechCrunch questioned ‘the long list of permissions the app requires’.

We can tell the fridge ‘Hey, lock up, he shouldn’t be eating’ or we tell the TV to shut off and make you get some sleep · Software developer

By 13 July the rationale for the data capture came into focus. Hanke admitted to the Financial Times that in addition to ‘in‑app payments’ for game kit, ‘there is a second component to our business model at Niantic ... the concept of sponsored locations.’ This revenue stream had always been in the plan, as companies pay ‘to be locations within the virtual game board — the premise being that it is an inducement that drives foot traffic.’ These sponsors would be charged on a cost per visit basis, similar to the cost per click in Google’s search advertising.
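The billing analogy can be made concrete. In this sketch under my own assumptions (the radius, price and geometry are invented, not Niantic’s), a sponsor is charged whenever a player’s GPS fix falls inside the sponsored location, just as an advertiser is charged per click:

```python
# Hypothetical illustration (not Niantic's actual system) of 'cost per
# visit' billing: each GPS fix landing within a sponsored location's
# radius is a billable visit, analogous to a click in search advertising.

from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def bill_sponsor(store, player_fixes, cost_per_visit=0.50, radius_m=30.0):
    """Count fixes within the store's radius and charge the sponsor per visit."""
    visits = sum(
        1 for lat, lon in player_fixes
        if haversine_m(store[0], store[1], lat, lon) <= radius_m
    )
    return visits * cost_per_visit

store = (52.5200, 13.4050)                    # sponsored location (Berlin)
fixes = [(52.5201, 13.4051), (52.60, 13.40)]  # one inside, one far away
print(bill_sponsor(store, fixes))             # 0.5
```

The revenue logic is the same as search advertising’s, with bodies in place of clicks: the more reliably the game can steer a player inside the radius, the more the location is worth.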

‘Sponsored locations’ is a euphemism for Niantic’s behavioural futures markets; real-world revenues will increase in proportion to the company’s ability to match persons with locations, just as Google used surplus to target online ads to specific individuals. The elements and dynamics of the game, combined with its novel augmented-reality technology, prod and herd people across real-world terrains to spend their real-world money in the real-world commercial establishments of Niantic’s behavioural futures markets.

A living laboratory

At its zenith in the summer of 2016, Pokémon GO was a surveillance capitalist’s dream come true: a living laboratory for behaviour modification that fused scale, scope and action. Its genius was to transform the actual game into a higher-order game of surveillance capitalism, a game about a game. The players who took a city as their game board were themselves an unwitting board for this second and more consequential game, whose players were not among the enthusiasts waving their phones at David. They were Niantic’s actual customers: the entities that pay to play in the real world, lured by the promise of guaranteed outcomes; they vie for proximity to the cash that follows each smiling member of the herd. The Financial Times exulted that ‘speculation has surged over the game’s future power as a cash cow to retailers and other cravers of footfall.’

There can be no guarantee of outcomes without the power to make it so. This is the dark heart of surveillance capitalism that reimagines human beings through the lens of its own distinctive power, mediated by a global digital architecture repurposed as a vast and intricate means of behavioural modification. Surveillance capitalism thus announces a regressive age of autonomous capital and heteronomous individuals, when the possibilities of democratic flourishing and human fulfilment depend upon the reverse. What is this new power, and how does it remake human nature for the sake of its own lucrative certainties?