The porter of English brewers, first made in 1722, was a dark, strong beer related to modern stouts. It cost less than ordinary beer, kept longer, was more robust and needed less care in handling. London porter breweries became industrialists of a type never seen before anywhere. In the 1820s, English brewers created beers designed to survive the long sea voyage to Britain's colonies in Asia, especially India, the “jewel in the crown” of the British Empire. These highly hopped ales were paler in color and more bitter than porter and became known as India Pale Ales.

The village of Hoegaarden in Flanders was a small enclave able to evade supervision by any authority, and it grew into a significant exporter of beer. Brussels producers took advantage of the growing urban population. Such successes, however, were anomalies; the general picture in eighteenth-century Europe was one of retreat for both the brewing and the consumption of beer. This changed in the nineteenth century, when an entirely new industry was born, using steam engines and mechanized processes. Britain was the undisputed technological leader during the first stages of the Industrial Revolution, in brewing as in much else. As author Richard W. Unger says:

“Scientific advances in eighteenth-century England (such as the use of the thermometer) were the start of a long series of developments which came to fruition in Bavaria, Austria, Denmark and later in Holland, Brabant, and England in the late nineteenth century. Mechanization and the use of steam engines to help with the heavy work was followed by the introduction of refrigeration, which made possible control of the environment in breweries. The developments came at the same time as research on yeast which, combined with other advances, made it possible to produce a consistent and reliable pilsner beer of high quality and competitive price. With an improved product which brewers could distribute along ever improving transportation networks and the invasion of the brewery by chemists who put the process of making beer more than ever on a scientific basis there was to be another age of prosperity with beer production and consumption spreading throughout the entire year and throughout the entire world.”

Developments in the making of wine and beer were intimately related to other scientific and industrial advances of the era. In the 1750s the Scottish physician Joseph Black studied a gas he called “fixed air,” which we know as carbon dioxide (CO2), one of the first gases to be described as a substance distinct from “air.” The Englishman Joseph Priestley, the discoverer of oxygen, began his own experiments involving “airs” in Leeds, where he lived close to a brewery. The air immediately above the surface of the beer fermenting in the vats had recently been identified as Black’s fixed air. John Gribbin explains in his book The Scientists:

“Priestley saw that he had a ready-made laboratory in which he could experiment with large quantities of this gas. He found that it formed a layer roughly nine inches to a foot (23-30 centimetres) deep above the fermenting liquor, and that although a burning candle placed in this layer was extinguished, the smoke stayed there. By adding smoke to the carbon dioxide layer, Priestley made it visible, so that waves on its surface (the boundary between the carbon dioxide and ordinary air) could be observed, and it could be seen flowing over the side of the vessel and falling to the floor. Priestley experimented with dissolving fixed air from the vats in water, and found that by sloshing water backwards and forwards from one vessel to another in the fixed air for a few minutes, he could produce a pleasant sparkling drink. In the early 1770s, partly as a result of an (unsuccessful) attempt to find a convenient preventative for scurvy, Priestley refined this technique by obtaining carbon dioxide from chalk using sulphuric acid, and then dissolving the gas in water under pressure. This led to a craze for ‘soda water’, which spread across Europe.”

Joseph Priestley had invented artificially carbonated water in 1767 at a brewery in Leeds, England. In 1771 the Swedish chemist Torbern Bergman, too, developed a process to make carbonated water, but carbonated soft drinks, known as soda or soda pop, were created in the nineteenth century. The Hungarian inventor Ányos Jedlik (1800-1895) promoted consumable soda-water, and John S. Pemberton (1831-1888) invented Coca-Cola in the USA in 1886.

The great English natural philosopher Michael Faraday is perhaps best known for his groundbreaking investigations of electricity and electromagnetism, but he was also Humphry Davy’s scientific assistant and did important chemical work on his own. Faraday managed to liquefy many of the gases known in the 1820s, among them ammonia in 1823, but some gases such as oxygen, nitrogen and hydrogen still proved too difficult at that time. These gases were eventually liquefied by other Europeans in the generations that followed.

The field of low-temperature physics, or cryogenics, emerged during the nineteenth century as an extension of the European chemical and electrochemical revolution. The proliferation of academic research laboratories drew scientists into fields that required relatively expensive apparatus. “Cryogenics demonstrates the interpenetration of industry and science in the second industrial revolution – in particular, the budding refrigeration industry, which had emerged as a rival to natural ice in the late nineteenth century, especially for brewing lager beer and shipping meat to Europe from Argentina and New Zealand.” Academic physicists instigated their own low-temperature programs, which in turn led to the discovery of such physical phenomena as superconductivity and superfluidity in the twentieth century.

In the late 1700s, leading European scholars learned that when a gas is cooled, its volume is reduced by a predictable amount. Pressurizing a gas, forcibly squeezing its molecules closer together, likewise reduces its volume. The first person to liquefy a substance that normally (i.e., at temperatures we experience daily) exists as a gas was Gaspard Monge, a French scholar primarily remembered for his achievements in mathematics, especially descriptive geometry, but who did work in physics and chemistry as well. Monge produced liquid sulfur dioxide in 1784, but most gases were not liquefied until the mid-1800s. In the 1840s the Irish physical chemist Thomas Andrews (1813-1885) suggested that every gas has a critical temperature above which it cannot be liquefied, no matter how great the pressure applied. His concept of critical temperature led to a breakthrough in the liquefaction of many so-called permanent gases.
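The predictable shrinkage of a cooled gas can be illustrated with what later became known as Charles's law: at constant pressure, volume is proportional to absolute temperature. The snippet below is a modern sketch of that relationship, not something the eighteenth-century scholars themselves would have written; the figures are illustrative.

```python
# Charles's law sketch: at constant pressure, a gas's volume is
# proportional to its absolute (kelvin) temperature.
def volume_after_cooling(v_initial, t_initial_c, t_final_c):
    """Return gas volume after cooling at constant pressure (ideal-gas approximation)."""
    t_i = t_initial_c + 273.15  # convert Celsius to kelvin
    t_f = t_final_c + 273.15
    return v_initial * t_f / t_i

# Cooling one litre of gas from 100 C to 0 C shrinks it by roughly 27%.
print(round(volume_after_cooling(1.0, 100.0, 0.0), 3))  # ~0.732 litres
```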

The English natural philosopher James Prescott Joule was a scientific brewer whose father had made good money from making beer. At an early age, James Joule was given a laboratory adjacent to the brewery premises. Here is a quote from his biography in the excellent book The Oxford Guide to the History of Physics and Astronomy, edited by John L. Heilbron:

“Joule was educated at home and by the natural philosopher John Dalton. As the son of the wealthiest brewing family in Manchester, England, he had the opportunity to choose his profession freely….The brewery and Manchester industry in general made an ideal environment for studying the most current problems in science and technology. The new forces of electricity and magnetism then enjoyed the attention of most people interested in natural philosophy…. Expressing phenomena numerically had become a habit of Joule’s when he worked in the brewing world….The experiments drew on the thermometric skills he had acquired in the brewery, and invoked a close collaboration with the local instrument maker and natural philosopher John Benjamin Dancer. Joule and Dancer produced the most precise working mercury thermometer available at the time. Their contemporaries still used air thermometers….Joule was characterized as a ‘gentleman specialist’ for having established the mechanical equivalent of heat through exact measurement, but his related reflections on the dynamical nature of heat and its significance for thermodynamics carried little weight before he began his collaboration with William Thomson. Thomson made ‘Joule’s constant’ (the ratio of mechanical work to heat) the building block of the science of energy.”

Few physicists took Joule's claims seriously at first, since he was a brewer by profession and an amateur scientist. Fortunately, the young William Thomson, later ennobled as Lord Kelvin, realized the importance of his work and collaborated with Joule for years on studies of the relationship between work and heat. Thomson also proposed the thermodynamic (absolute) temperature scale in 1848, now called the Kelvin scale, on which absolute zero (0 K) represents the absence of all thermal energy. The magnitude of the degree Celsius, from the centigrade scale named after the Swedish astronomer Anders Celsius and normally used in everyday life, is exactly equal to that of the kelvin, but 0 K is −273.15 °C.
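Because the two scales differ only in their zero point, converting between them is simple arithmetic, as this small sketch shows:

```python
# Conversion between the Celsius and Kelvin scales: the degree sizes are
# identical; only the zero point differs (0 K = -273.15 C).
ABSOLUTE_ZERO_C = -273.15

def celsius_to_kelvin(t_c):
    return t_c - ABSOLUTE_ZERO_C

def kelvin_to_celsius(t_k):
    return t_k + ABSOLUTE_ZERO_C

print(celsius_to_kelvin(0.0))    # 273.15 (freezing point of water)
print(celsius_to_kelvin(100.0))  # 373.15 (boiling point at one atmosphere)
print(kelvin_to_celsius(0.0))    # -273.15 (absolute zero)
```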

The Frenchman Sadi Carnot published a theoretical account of steam engines in 1824 whose importance was not fully grasped until some years later. The First Law of Thermodynamics, which states that energy can neither be created nor destroyed, was enunciated in 1842 by the German physician and physicist Julius Robert Mayer, who related mechanical energy to thermal energy. Mayer's original observations were made while he was employed as a ship's doctor in the Dutch East Indies in 1840. He noticed that venous blood drawn in tropical conditions was brighter red than blood drawn in colder climates. Local physicians informed him that this was typical of the tropics, where less oxygen needs to be consumed to maintain body temperature than in colder countries. Mayer understood that by burning the same amount of food, the body could produce different proportions of heat and work, but that the sum of the two had to be constant.

Although their starting points were very different, Joule and Mayer are generally regarded as the co-discoverers of the principle of energy conservation, which constitutes a fundamental part of all branches of physics and physical chemistry. The German scholar Hermann von Helmholtz placed the principle on a better mathematical basis in 1847, when he clearly stated the “conservation of energy” as a principle applicable to all natural phenomena. The German physicist Rudolf Clausius, building on Carnot's work, introduced the concept of entropy and formulated the Second Law of Thermodynamics, which stipulates that the total entropy of an isolated thermodynamic system never decreases over time. The German physical chemist Walther Nernst in the early 1900s formulated the Third Law of Thermodynamics, which implies that absolute zero of temperature can never actually be reached.
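The equivalence of work and heat that Joule measured can be put into numbers. The sketch below, a modern reconstruction rather than Joule's own calculation, uses the present-day value of “Joule's constant” (about 4.186 joules per calorie) to show why his famous paddle-wheel experiment demanded such precise thermometry:

```python
# Joule's paddle-wheel experiment in numbers: the mechanical work of a
# falling weight reappears as heat in the stirred water.
G = 9.81               # gravitational acceleration, m/s^2
JOULES_PER_CAL = 4.186 # modern value of the mechanical equivalent of heat

def water_temp_rise(weight_kg, drop_m, water_kg):
    """Temperature rise (deg C) of water heated by a falling weight's work."""
    work_joules = weight_kg * G * drop_m
    calories = work_joules / JOULES_PER_CAL
    return calories / (water_kg * 1000.0)  # 1000 cal warms 1 kg of water by 1 deg C

# A 10 kg weight falling 2 m warms 1 kg of water by only ~0.047 deg C,
# which is why Joule needed exceptionally sensitive thermometers.
print(round(water_temp_rise(10.0, 2.0, 1.0), 4))
```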

The principle of the vacuum refrigerator is based on the fact that water in a sealed container can be made to boil if the pressure is reduced; the familiar “boiling point” of 100 degrees Celsius applies only when the external pressure equals one atmosphere, which is why water boils at lower temperatures on a mountain top. The heat necessary for evaporation is taken from the water itself, so continued boiling under reduced pressure lowers the temperature until the freezing point is reached and ice forms. The Scottish scholar and chemist William Cullen (1710-1790) gave one of the first documented public demonstrations of artificial refrigeration, and the American inventor Oliver Evans (1755-1819) designed, but never built, a vapor-compression refrigeration machine in 1805. I. Hornsey writes in his history of beer and brewing:

“The earliest machine of this type was constructed in 1755, by Dr William Cullen, who produced the vacuum necessary purely by means of a pump. Then, in 1810, Sir John Leslie combined a vessel containing a strong sulphuric acid solution along with the air pump, the acid acting as an absorbent for water vapour in the air. This principle was taken up and elaborated upon by E.C. Carré, who in 1860 invented a machine that used ammonia as the volatile liquid instead of water….The first compression machine was manufactured by John Hague in 1834, from designs by the inventor, Jacob Perkins, who took out the original patents, and recommended that ether was used as the volatile agent. Although Hague’s machine can be regarded as the archetype for all ‘modern’ refrigerators, it never really got past the development stage, and it was left to the Australian, James Harrison, of Geelong, Victoria, to finalise the practicalities and produce a working version, which he did in 1856. By 1859, Harrison’s equipment was being manufactured commercially in New South Wales, and the first of them (which used ether as the refrigerating agent) came to Britain in 1861.”
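The pressure dependence of water's boiling point, which underlies these vacuum machines, can be sketched numerically with the Antoine equation, an empirical vapor-pressure correlation. The constants below are commonly tabulated values for water (pressure in mmHg, temperature in degrees Celsius, roughly valid between 1 and 100 °C); treat the exact figures as illustrative.

```python
import math

# Boiling point of water as a function of pressure, via the Antoine equation:
# log10(P) = A - B / (C + T), rearranged to give T from P.
# Constants are commonly tabulated values for water (P in mmHg, T in deg C).
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(pressure_mmhg):
    """Temperature at which water boils under the given pressure."""
    return B / (A - math.log10(pressure_mmhg)) - C

print(round(boiling_point_c(760), 1))  # ~100.0 C at one atmosphere
print(round(boiling_point_c(100), 1))  # ~51.6 C at about 0.13 atm
print(round(boiling_point_c(20), 1))   # ~22.2 C: water boils at room temperature
```

At a few percent of atmospheric pressure water boils at room temperature, drawing its heat of evaporation from the remaining liquid, which is exactly the effect Cullen's pump exploited.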

Once the reliability of compression machines had been established, British breweries took up the idea of acquiring ice-machines and refrigerators. Until the final decades of the nineteenth century, natural ice had been the only way to reduce the temperature of a liquid. The new methods of refrigeration meant that brewing could be carried out during the summer months, too. This was a small revolution, and commercial refrigeration was primarily directed at breweries in the 1870s, although the early refrigeration systems used very large volumes of cooling water. In 1879 a transatlantic liner brought the first load of mechanically chilled American beef to Britain, whilst the first batch of frozen meat was imported into England from Australia.

In 1853, Joule and Thomson showed that when compressed air, and certain other gases, held at temperatures between 0 and 100 °C are allowed to expand through a porous plug or valve, their temperature falls. This cooling upon expansion is known as the Joule-Thomson effect. The German Carl von Linde (1842-1934), an engineering professor at the Technische Hochschule in Munich whose students included Rudolf Diesel, had developed a practical refrigerator in 1876; in 1895 he patented an efficient apparatus in which the Joule-Thomson effect was applied to the liquefaction of gases. As the scholar Kostas Gavroglu says:

“The development of thermodynamics, especially James Prescott Joule’s and William Thomson’s proofs that the temperature of a gas dropped when it expanded very quickly, provided the necessary background for the investigation and the understanding of the properties of matter in the very cold. Thomas Andrews’s experiments determining the critical point – the temperature at which a gas whose pressure is increased at constant volume liquefies – and Johannes Diderik van der Waals’s discussion of the continuity of the gaseous and liquid states brought further insights into the characteristics of very cold fluids. The nineteenth century saw remarkable developments in the large-scale production of cold, especially through the development of the vapor compression process that led to different types of refrigeration machines and refrigeration processes. The plentiful availability of artificial cold transformed the preservation, circulation, and consumption of food. By the end of the nineteenth century the Linde Company had sold about 2600 gas liquefiers: 1406 were used in breweries, 403 for cooling land stores for meat and provisions, 204 for cooling ships’ holds for transportation of meat and food, 220 for ice making, 73 in dairies for butter making, 64 in chemical factories, 15 in sugar refining [and some] for other purposes.”
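The size of the Joule-Thomson cooling can be estimated to first order as dT = mu_JT * dP, where mu_JT is the Joule-Thomson coefficient. The figure used below for air near room temperature (roughly 0.25 K per atmosphere) is an approximate textbook value, used here only for illustration:

```python
# First-order estimate of Joule-Thomson cooling: dT = mu_JT * dP.
# mu_JT for air near room temperature is roughly 0.25 K per atmosphere
# (an approximate, illustrative textbook figure).
MU_JT_AIR = 0.25  # kelvin per atmosphere

def temperature_drop(pressure_drop_atm, mu=MU_JT_AIR):
    """Estimated cooling when a gas expands through a valve or porous plug."""
    return mu * pressure_drop_atm

# A single expansion from 200 atm down to 1 atm cools air by only ~50 K.
print(temperature_drop(199))  # 49.75
```

A single expansion is nowhere near enough to liquefy air, which is why Linde's apparatus fed the chilled, expanded gas back through a counterflow heat exchanger to pre-cool the incoming compressed air, compounding the effect until liquefaction occurred.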

By the late nineteenth century, breweries were the largest users of commercial refrigeration units, though some still relied on harvested ice. The early twentieth century brought standardized home refrigerators, trains and ships built with large refrigerated holds, special refrigerators in slaughterhouses, and new hotels designed with air-cooling systems. This revolution in refrigeration happened in parallel with another revolution in transportation, and the combination of the two was to have global consequences.

Before 1830, George and Robert Stephenson in England built a sequence of locomotives which established the basis for a new generation of engines with tube boilers and horizontal cylinders. After Stephenson’s Rocket had shown the tremendous potential of the new invention, railroads grew meteorically throughout the Western world and beyond. Railroad construction in turn stimulated the growth of ironworks and engineering workshops. Transcontinental railroads played a major role in opening up the American West for settlement and agriculture, and the characteristic pioneer railroads on the prairies usually had a line of telegraph poles alongside the track. Railroads were important not only in Canada and the USA but also in South America and Asia. In Argentina they were used to settle parts of the country with agricultural potential and gave ranchers access to ports and distant markets.

The paddle wheel had been known in China for centuries, but there was no engine there to drive it. The American engineer Robert Fulton (1765-1815), who had heard of James Watt's steam engine on a visit to England, introduced the first commercially successful steamboat in 1807, a paddle steamer operating on the Hudson River. In general, however, the first steamships were really sailing ships with auxiliary engines; it was not until the mid-nineteenth century that steamships relied primarily on steam power. The idea of the screw propeller had been proposed in 1753 by the mathematician Daniel Bernoulli, known for his theoretical work on fluid mechanics, and the Frenchman Frédéric Sauvage (1786-1857) demonstrated its capacities in the 1830s. Propellers were further improved by the Swedish-born inventor John Ericsson (1803-1889) and the Englishman Francis Smith (1808-1874) in the late 1830s.

In the second half of the nineteenth century, the construction of ships shifted from wood to iron hulls, and again to steel from the late 1870s on. Iron ships could be made of almost any size. The brilliant English engineer Isambard Kingdom Brunel (1806-1859) designed the pioneering transatlantic steamships the Great Western, a wooden paddle steamer, and the Great Britain, the first large iron-hulled, propeller-driven liner. His Great Eastern, launched in 1858, was the largest ship built during the nineteenth century. Ships grew larger and more powerful, which sharply reduced international transportation costs, and the steamship had great advantages in speed and flexibility. Around 1800, letter-writers in England could expect to wait up to two years for an answer to a letter sent to Calcutta, India; by the end of the century, the journey could be made in a few weeks. The technology for building wooden sailing ships also improved greatly during this period, but in the end such ships could not keep up with the competition from steam-powered vessels.

The new transportation and cooling techniques for the first time made it possible to export bulky goods such as grain and meat, or for that matter wine and beer, not just to other countries but to other continents. The United States and Canada, Australia and New Zealand, Argentina and Uruguay were soon to show that they could offer food at cheaper prices than Europe herself. Historian J. M. Roberts writes in The New Penguin History of the World:

“The American plains, the huge stretches of pasture in the South American pampas and the temperate regions of Australasia provided vast areas for the growing of grain and the raising of livestock. The second was a revolution in transport which made them exploitable for the first time. Steam-driven railways and ships came into service in increasing numbers from the 1860s. These quickly brought down transport costs and did so all the faster as lower prices bred growing demand. Thus further profits were generated to be put into more capital investment on the ranges and prairies of the New World. On a smaller scale the same phenomenon was at work inside Europe, too. From the 1870s the eastern European and German farmers began to see that they had a competitor in Russian grain, able to reach the growing cities much more cheaply once railways were built in Poland and western Russia and steamships could bring it from Black Sea ports. By 1900 the context in which European farmers worked, whether they knew it or not, was the whole world; the price of Chilean guano or New Zealand lamb could already settle what went on in their local markets.”

Better transportation also led to more migration from Europe. Before 1800, there was little European emigration except from the British Isles, but this now changed. Roberts again:

“After [1800], something like sixty million Europeans went overseas, and this tide began to flow strongly in the 1830s. In the nineteenth century most of it went to North America, and then to Latin America (especially Argentina and Brazil), to Australia and South Africa. At the same time a concealed European emigration was also occurring across land within the Russian empire, which occupied one-sixth of the world’s land surface and which had vast spaces to draw migrants in Siberia. The peak of European emigration overseas actually came on the eve of the First World War, in 1913, when over a million and a half left Europe; over a third of these were Italians, nearly 400,000 were British and 200,000 Spanish. Fifty years earlier, Italians figured only to a minor degree, Germans and Scandinavians loomed much larger. All the time, the British Isles contributed a steady flow; between 1880 and 1910 eight and a half million Britons went overseas (the Italian figure for this period was just over six million). The greatest number of British emigrants went to the United States (about 65 per cent of them between 1815 and 1900), but large numbers went also to the self-governing colonies.”

Unfortunately, faster and more extensive communications also increase the risk of spreading diseases from one region of the world to another. The grape phylloxera is a pest native to North America; for generations it had made it difficult to transplant European vines there, although the reason was not properly understood, since the local vines are naturally resistant to it. The louse did not survive the weeks at sea onboard sailing ships, but the speed of the new steamships brought phylloxera to Europe in the 1860s. It caused tremendous devastation among European vineyards for decades, though the problem was eventually overcome by grafting Old World vines onto resistant American rootstocks. One of the positive side effects of this disaster was the growing importance of science in winemaking.

While refrigeration for commercial purposes had been introduced in the nineteenth century, its potential was fully realized in the twentieth century. The Swedish engineering students Baltzar von Platen and Carl Munters invented the gas absorption refrigerator in 1922. The first refrigerator to see widespread use was the General Electric “Monitor-Top” refrigerator from 1927. Introduction of home freezer units occurred in the United States in the 1940s, and frozen foods began to make the transition from luxury to necessity in the Western world during the second half of the twentieth century.

In the USA in 1930, Charles Seabrook and his brothers began experimenting with the freezing of vegetables. Their partnership with Clarence Birdseye founded the frozen food industry. Birdseye had observed how Eskimos in the Arctic used the extreme cold to quickly freeze freshly caught fish straight through. When the frozen fish was thawed and eaten, there was relatively little difference in taste. He concluded that it was rapid freezing at extremely low temperatures that made the food retain some of its freshness, a procedure called flash freezing. Unlike the Eskimos, Westerners could also create artificial cold even in warm climates. Birdseye’s company began leasing refrigerated boxcars to transport frozen foods by rail in 1944.

The American chemical engineer Thomas Midgley Jr. (1889-1944), while working for General Motors, discovered that adding tetraethyllead (TEL) to gasoline prevented internal combustion engines from “knocking.” Unfortunately, the additive released huge amounts of toxic lead into the atmosphere, causing serious health problems. Early refrigeration units used nasty and dangerous chemicals such as sulfur dioxide and ammonia. In 1930 Midgley discovered dichlorodifluoromethane, a chlorofluorocarbon (CFC) which he dubbed Freon; it soon became widely used in refrigerators. The damaging effect of CFCs upon the ozone layer became widely known from the 1970s on, after which their use was phased out. One historian has stated that Midgley “had more impact on the atmosphere than any other single organism in Earth's history.”

It is difficult to say exactly when ice cream was “invented.” Various cultures have used natural ice in combination with fruits or berries for thousands of years, and cold desserts were made in pre-industrial Europe. Nevertheless, the widespread consumption of ice cream as we think of it in the twenty-first century had to await the development of artificial refrigeration. Italians, with their usual talent for combining great food with commercial skills, played an important part in both the development and the spread of ice cream. While this is certainly appreciated by lovers of ice cream, the refrigerators in our kitchens also keep fruits and vegetables edible longer and allow fish and meat to be stored safely for prolonged periods. Refrigeration has improved nutrition for millions of people around the world.