Quantification and control share a long history. As the Harvard Business Review put it in 2012, “You manage what you measure.” And in our age of Big Data, measurement is rife. Whether it is the GPS trackers recording the micro-movements of Amazon warehouse workers or the standardised tests measuring the performance of six-year-old school children and their teachers, quantification is ubiquitous.

This is a source of great frustration for progressives today. But as Quinn Slobodian’s Globalists demonstrates, it was also a source of deep concern to the Geneva School neoliberals of the 1930s. Back then, Slobodian writes, Polish economist Michael Heilperin warned about the dangers of “pseudo-quantitative concepts” and criticised the use of “statistical constructions” to understand “the heterogeneous reality they are supposed to represent.”

By the 1940s, neoliberals had come to associate numbers with those who believed in state action. Leftists like Otto Neurath and Harold Laski, who imagined using statistical information to stabilise prices, were accused by the neoliberal Walter Lippmann of seeking a “planetary super-state” through “economic world-planning.” For neoliberals, such a nightmare scenario of quantified planning clearly contradicted Mises’ central idea that only the unhindered price mechanism could solve the economic ‘problem of order.’ Even more pointedly, Hayek insisted that the ‘knowledge’ a society possessed was irredeemably dispersed amongst the populace and could not be ‘possessed’ by any individual or state authority. Reality, he argued, thus defies any act of quantification and is ‘known’ only via the emergent behaviour of individuals acting in a free market. Put simply, the idea that social and economic life could be adequately captured by statistics was antithetical to the Hayekian promise of the market as the ultimate information processor and ordering device.

Given this ambivalence towards numbers, it is striking that many critical political economists see the quantification of the social realm as the epitome of neoliberal reason. Today’s management practices, built on quantification, fit awkwardly with neoliberal theory’s antipathy to numbers.

The Rise of Managerial Planning

In response to this problem, we argue that we should look elsewhere to understand the rise of quantification. In May 1946, one year before Hayek called together economists, lawyers, journalists and business people for what became the first meeting of the Mont Pèlerin Society (MPS), a different think tank released its first report.

Project RAND’s Preliminary Design of an Experimental World-Circling Spaceship was very different to the kind of anti-planning and libertarian tracts that were to come out of the MPS. The report explored, from an engineering perspective, the feasibility of launching a space satellite for the US Navy. For this report, RAND built on Operations Research techniques to develop what would become a mathematical approach to planning. Driven by revolutions in digital computing, the RAND projects explicitly aimed at modelling human decision making and social systems. It was precisely the utopian planning that haunted the neoliberal theorists identified by Slobodian.

In recent publications, we have made the case for looking more closely at the innovations made by researchers at RAND. As we show, these researchers radically changed how governance was thought about, in ways that have proven deeply influential in the era of neoliberalism. It is this approach to governance which snowballed into the contemporary avalanche of numbers.

Managerial governance initially took the form of systems analysis – a new way to frame how strategic policy decisions are taken. Concretely, it meant turning policy making into an economic problem. Here, management decisions were to be based on making the best use of the resources at hand with the understanding that every option comes with opportunity costs.

Strategic decision making was made synonymous with optimisation itself. The basic premise of this approach was to use quantification techniques to determine what should be done. Where the scientific management of Taylor and Ford had been focused on finding the most efficient means to achieve a predetermined objective, the shift undertaken by RAND was to ambitiously use optimisation to determine these objectives in the first place.

The range of innovations that stems from this new approach is quite staggering and speaks to the profound influence of this new paradigm of governance. RAND researchers, along with the research networks to which they were connected, established many of the key technologies that have come to be associated with managerial governance:

Decision theory based on data about performance

Mathematical formatting of decisions

Strategic framing of governance (partly associated with the turn to rational choice and game theory)

Emphasis placed on processing information and computing

Budgeting as a tool of decision making and cost-benefit analysis

The importance placed on strategic optimisation fuelled a systematic process of social reporting as a means to empower managerial decision making. Social priorities would now be based on the idea that governance is a matter of making the most effective use of public resources; a task which neoliberals had emphasised should be left to the market.

Quantification was an integral aspect of managerial governance. Anything that was to be taken into consideration had to be assessed in quantitative terms so as to make it amenable to a practice of optimisation. Whereas Hayek saw social life as inherently unknowable through numbers, managerial governance mandated that life would only be actionable if quantified.

The Diffusion of the New Practices of Planning

From a military practice initially developed by a lone think tank, systems analysis came to redefine a wide range of governance activities. In 1958, the RAND economist Roland N. McKean estimated that opportunities for quantitative analysis could be identified for over three quarters of all federal spending. Three years later, McKean and Charles Hitch implemented the Planning, Programming, and Budgeting System (PPBS) in order to restructure the administration of the Department of Defense along RAND lines. In 1965, PPBS was extended beyond the military to social policy – driving governmental interventions in healthcare, education, housing, and urban development.

While the PPBS was itself short-lived, the technologies of governance it established were not. John Lindsay, mayor of New York from 1966 to 1973, called in RAND researchers to radically overhaul city administration, policing, transport, and fire services. Internationally, PPBS was taken up under the banner of Rationalisation des choix budgétaires in France in 1968, Programme Analysis and Review (PAR) in Edward Heath’s Britain in 1971, and likewise informed the 1960s social reforms of Willy Brandt’s West Germany. In a similar spirit, British cybernetician Stafford Beer travelled to Allende’s Chile to develop Cybersyn – a radically ambitious project for a data-driven socialist planning system.

The 1980s neoliberal revolution was supposed to sweep this away. Yet a cursory look at contemporary governance reveals that, far from retreating, the quantified world continues to define us. The ubiquity of risk and audit, the proliferation of foresight and scenario planning, and the wide reliance on cost-benefit analysis in decision-making show how the managerial hand of planning endures.

How was it that practices of planning – which were the main target of the neoliberal critique – ended up proliferating under neoliberalism? In our second post, we will examine how managerial planning was given a second lease of life in the 1980s to become a defining feature of governance.

Sahil Jai Dutta, Samuel Knafo, Richard Lane, Ian Lovering and Steffan Wyn-Jones

The featured image shows RAND researcher Nancy Nimitz, September 1, 1958. Photo by Leonard McCombe/The LIFE Picture Collection/Getty Images