Late in August 1609, the Italian astronomer Galileo Galilei wrote excitedly to his brother-in-law, relating the fast-moving events of that summer. A few weeks earlier, Galileo had heard rumours that a spyglass had been invented in Flanders (now part of Belgium). He quickly produced an improved version, setting off a new wave of rumours. Soon, the Venetian senate called on him to demonstrate his device. Galileo boasted to his family about the “numerous gentlemen and senators” who had “scaled the stairs in the highest campaniles in Venice to observe at sea sails and vessels so far away that … two hours or more were required before they could be seen without my spyglass.” The senate voted immediately to grant Galileo an appointment for life at the University of Padua in Italy, with an annual salary of 1,000 florins — back when 1,000 florins really meant something1.

Galileo was only getting started. Turning his new telescope towards the heavens, he discovered (among other things) four moons orbiting Jupiter. Craftily, he named them the Medicean Stars in honour of Cosimo II de’ Medici, the grand duke of Tuscany. The gambit worked: within a year of that letter about his Venetian success, Galileo had landed an even larger salary (and shed his teaching duties) as the official natural philosopher of the Medici court in Florence2.

Galileo had a knack for convincing government officials and courtly patrons to support his research. Tracing his exploits, as he darted from one benefactor to the next, we might recognize glimmers of today’s enterprising scientists. A good 250 years after Galileo’s time, however, a rather different relationship between government and science began to take hold.

Just as astronomer Norman Lockyer was founding Nature in 1869, major shifts in the government–science nexus were unfolding across many parts of the world.

Empire building

During the middle decades of the nineteenth century, the British Empire swelled to include about one-quarter of Earth’s land and to hold dominion over nearly one-quarter of its population. At this time, several prominent British politicians — including former and future prime ministers — sought to boost the fortunes of science and technology. In the 1840s, Robert Peel, Benjamin Disraeli, William Gladstone and others donated funds from their own coffers to help found the Royal College of Chemistry, convinced that focused research in this field would benefit the nation and its imperial ambitions. By the 1860s, many researchers were hard at work formalizing such arrangements. Construction began on a spate of laboratories at universities throughout the United Kingdom, each built on the promise that precision measurements of physical quantities could advance fundamental scientific understanding and spur industrial development.

Electrification, telegraphy, the expansion of railways and large-scale production of steel were the signature developments of an era often called the second industrial revolution, which began around 1870. Each demanded standard units and measures. New synergies emerged as leading researchers, including James Clerk Maxwell and William Thomson (later Lord Kelvin), plied their understanding of electromagnetism and thermodynamics as members of high-level government commissions, aiming to tackle the challenges of transatlantic communications, electrical standards, ocean navigation and steam power3.

In some ways, the British were playing catch-up. Since the mid-nineteenth century, local universities throughout the German-speaking states had been recruiting academic talent in contests for prestige — latter-day Galileos were snatched up by government-funded institutions. The pattern escalated rapidly after the Prussian defeat of France and the establishment of a unified Germany early in 1871. Under a centralized education ministry, and with even grander ambitions for rapid industrialization, the German government invested heavily in academic research across the natural sciences4.

Even amid such support, however, leading industrialists such as Werner von Siemens feared that Germany was losing its edge. Concerted lobbying led to the establishment of a new government-funded institution in 1887: the Physikalisch-Technische Reichsanstalt in Berlin. Headed by the physicist Hermann von Helmholtz, its mandate was to accelerate work at the intersection of basic science, applied research and industrial development. Within a few years, pioneering efforts there to evaluate competing proposals for large-scale street lighting — which required careful measurements of the radiation output from various devices — yielded such precise recordings of the spectrum of blackbody radiation that prevailing physical theories could no longer accommodate the data. Inspired, physicist Max Planck reluctantly broke with Maxwell’s electromagnetic theory and took his first, tentative steps towards quantum theory5.

Meanwhile, a different war with Prussia triggered significant changes in government and science to the east, when the Austro-Hungarian empire formed in 1867. Very quickly, the imperial authorities launched epic efforts in meteorology and climatology. The aim was to create extended institutional networks that might foster a new, common sense of purpose across the hotchpotch of local legal, religious and linguistic traditions. Universities, museums and other government-supported institutions began to collect and standardize weather recordings, with a goal of understanding how local patterns related to larger-scale phenomena. The imperative to unify the far-flung empire catalysed cutting-edge research on such modern-sounding concepts as regional interactions and interdependencies across scales from microclimates to continents6.

By that time, Tsar Alexander II in Russia was busy pursuing a modernization project of his own. Beginning in 1861, he issued a series of proclamations that came to be known as the Great Reforms. The emancipation of the serfs was followed quickly by an overhaul of the state-run universities, as well as changes to regional governments and the judicial system. The vast bureaucracy that was created meant new opportunities for ambitious intellectuals, including the chemist Dmitrii Mendeleev. After two years of study in Heidelberg, Germany, Mendeleev returned to his native St Petersburg in 1861 to teach chemistry at the local university. He published his now-famous version of the periodic table of the elements in 1869, the same year that Nature was launched.

The next steps in Mendeleev’s remarkable career are emblematic of the expanded roles of science and technology in the era. Before long, he was consulting for the Ministry of Finance and the Russian Navy, ultimately serving as director of the country’s Chief Bureau of Weights and Measures, in which capacity he helped to introduce the metric system in Russia. Much like Otto von Bismarck and other nation-builders in Germany, Tsar Alexander II was eager to bolster industrial development throughout his country. Central to those efforts was investing heavily in precision metrology; the tsar found eager and skilful natural scientists such as Mendeleev to help7.

In the same decade, Japan underwent enormous changes, too. The Meiji Restoration of 1868 marked a period of opening up for the formerly isolated country. The emperor’s Charter Oath proclaimed that: “Knowledge shall be sought all over the world, and thereby the foundations of imperial rule shall be strengthened.” The government began investing in manufacturing and other industrial reforms. It instituted new public schools and funded fellowships to send students abroad to study advances in science. The central government brought senior scientists from other countries — such as Britain and the United States — to Japan to build up training in state-funded facilities. Here, too, leaders began to prioritize government-sponsored research institutions as part of the modern state-building effort8.

Enter the United States

The United States remained a stubborn outlier. The timing was far from promising for new investment. The bloodiest conflict in US history sputtered to an end in 1865, punctuated by the assassination of President Abraham Lincoln. (More US soldiers died during the civil war of 1861–65 than during the First and Second World Wars and the wars in Korea, Vietnam, Afghanistan and Iraq combined.) Support for scientific research and institutions at the federal level remained scarce until the end of the nineteenth century. Indeed, several leading policymakers were scandalized by the nation’s comparative lack of scientific and technical preparation during the First World War.

Efforts by reformers in the United States to shore up government support for research were stymied by the long-standing US tradition that education should remain the province of state and local authorities, rather than the federal government. Across the United States, individual colleges and universities gradually placed greater emphasis on original research and built up infrastructure for laboratories. But the impact remained uneven at best. As late as 1927, when the young physicist Isidor Rabi travelled to Germany to study quantum theory, he found that university libraries tended to order a full year’s worth of the US journal the Physical Review at a time. There seemed to be no reason to receive copies with any greater frequency, given their undistinguished contents9. Science remained largely ignored even in the grips of the Great Depression of the 1930s, when the federal government centralized so many other things under President Franklin D. Roosevelt’s New Deal.

US students protest in 1969 over links between university scientists and the military. Credit: Joyce Dopkeen/The Boston Globe/Getty

Only in the early 1940s, amid emergency wartime mobilization, did the US federal government undertake large-scale support for research and development. Radar, nuclear weapons, the proximity fuse and dozens of other military projects required billions of dollars and close coordination between abstract studies and practical development.

The effectiveness of the wartime arrangements impressed politicians, military planners and university administrators alike. When peace came, they scrambled to build a new infrastructure that could maintain the war-forged relationships. Budgets across the physical sciences and engineering in the United States continued to rise thereafter, sourced almost entirely from the federal government. In 1949, 96% of all funding in the United States for basic research in the physical sciences came from defence-related federal agencies. By 1954 — four years after the founding of the civilian US National Science Foundation — that proportion had risen to 98%10.

Thereafter, policymakers in the United States found new reasons to support research: it helped to meet domestic goals for industrial development and military defence, and was a key element in international relations. Federal investment in scientific institutions across war-ravaged Europe — so the thinking went — might fend off scientists’ flirtations with communism in countries such as France, Italy and Greece. Major reforms of the Japanese university system under US occupation after the Second World War likewise helped to spread the US model. Spending on science became an investment in hearts and minds11,12.

From blackboards to bombs

In the United States, the steady federal investment drove an unprecedented growth in scientific research and infrastructure. More young people were trained in the natural sciences during the 25 years after the end of the Second World War than had been trained in total throughout all of previous human history. The US government developed a national laboratory system and supported a broad spectrum of research at universities, most of it with little direct connection to military projects. The expenditures were often justified in terms of broader ‘preparedness’: creating a large pool of trained personnel who would be available to work on focused military projects should the cold war ever turn hot13.

In the meantime, enterprising scientists made use of opportunities that came from close ties to military sponsors. US Navy concerns about submarine warfare drove intense exploration of the ocean floor. Geoscientists, capitalizing on new data and instruments, found compelling evidence for plate tectonics14. Similarly, physicists consulting on classified missile-defence projects spurred the development of new areas of study, such as non-linear optics15.

Diversified portfolios

That ‘new normal’ held for about a quarter of a century. Just as Nature marked its 100th anniversary in 1969, military auditors in the United States released a lengthy analysis, dubbed Project Hindsight. It argued that the federal defence agencies had received a poor return on their investment in open-ended science. That year, Democratic Senator Michael Mansfield (Montana) — who would soon become the longest-serving majority leader of the Senate in US history — introduced a last-minute amendment to the federal Military Authorization Act of 1970. It stipulated that no funds from the Department of Defense could be used “to carry out any research project or study” that did not have “a direct and apparent relationship to a specific military function”.

On university campuses across the country, debate over the government’s role in supporting scientific research became even more raucous. Amid the escalation of the Vietnam War, scientists and students grappled with the proper place of defence spending in higher education. At Columbia University in New York City and the University of Wisconsin–Madison, radicals targeted military-funded research laboratories with explosives. On many other campuses, police resorted to tear gas and billy clubs to disperse angry protesters16.

During the 1970s and 1980s, scientists forged partnerships with private industries as well as philanthropies. These relationships were accelerated by steep cuts in federal spending on defence and education in the United States and in many other parts of the world. Biotechnology and nanotechnology emerged in those years, buoyed by systems of support that were different from the government spending that had underwritten research in nuclear physics after the Second World War17.

Recent, hybrid patterns of support still depend heavily on central-government funding — just consider how closely scientists follow each year’s appropriation cycle in the US Congress and elsewhere. But support for research today is rarely sustained by the kind of saturation model that had seemed so natural early in the nuclear age. Fewer than 20 countries currently invest more than 2% of their gross domestic product in research and development, according to data from the Organisation for Economic Co-operation and Development and the World Bank. In several of those countries, meanwhile, the nature of government support has shifted, often prioritizing projects with short-term goals and practical applications over longer-term inquiries.

When Lockyer was sending the first issue of Nature off to press, many elements of the modern scientific enterprise were being forged across Britain, the European continent and parts of Asia. But to fully grasp the range of monetary relationships that scientists now navigate — scouring today’s equivalents of the Venetian senate for funds, while courting private donors in Kavli Institutes and Simons Foundation centres that are no less sparkling than a Medici palace — we would do well to keep Galileo in mind.