Clearly, the controlled use of fire to cook food was an extremely important element in the biological and social evolution of early humans, whether it began 400,000 or 2 million years ago. The lack of physical evidence suggests that early humans did little to change how they controlled and used fire for cooking for hundreds of thousands of years, which is quite surprising given that they developed fairly elaborate tools for hunting during this time and created some of the first examples of cave art about 64,000 years ago. Physical evidence shows that cooking food on hot stones may have been the only adaptation during the earliest phases of cooking.

Then, about 30,000 years ago, “earth ovens” were developed in central Europe. These were large pits dug in the ground and lined with stones. The pits were filled with hot coals and ashes to heat the stones; food, presumably wrapped in leaves, was placed on top of the ashes; everything was covered with earth; and the food was allowed to roast very slowly. The bones of many types of animals, including large mammoths, have been found in and around ancient earth ovens. This was clearly an improvement over rapidly roasting meat by fire, as slow cooking gives time for the collagen in tough connective tissue to break down to gelatin; this process takes at least several hours, and often much longer, depending on the age of the animal and where the meat comes from in the animal. The shoulders and hindquarters of animals are involved in more muscular action and thus contain more connective tissue than the tenderloin near the ribs. Breaking down tough connective tissue makes the meat easier to chew and digest. Like today’s barbecue methods, cooking meat slowly in earth ovens made it very tender and flavorful.

After dry roasting with fire and heating on hot stones, the next true advance in very early cooking technology appears to have been the development of wet cooking, in which food is boiled in water. Boiling food would certainly be an advantage when cooking starchy root tubers and rendering fat from meat. Many archeologists believe the smaller earth ovens lined with hot stones were used to boil water in the pit for cooking meat or root vegetables as early as 30,000 years ago (during the Upper Paleolithic period). Others believe it is likely that water was first boiled for cooking in perishable containers, either over the fire or directly on hot ashes or stones, well before this time.

Unfortunately, no direct archeological evidence has survived to support this conclusion. Yet we know that even a flammable container can be heated above an open flame as long as there is liquid in the container to remove the heat as the liquid evaporates. Thus containers made of bark or wood or animal hides could have been used for boiling food well before the Upper Paleolithic period. No physical evidence of sophisticated utensils for cooking food appears until about 20,000 years ago, when the first pieces of fired clay pottery appear. Using sensitive chemical methods, scientists have determined that shards of pottery found in Japan contain fatty acids from marine sources such as fish and shellfish. These heat-resistant pots may have been used to boil seafood.

The development of simple clay ovens did not occur until at least 10,000 years later. If cooking has had such a profound effect on the evolution of humans, why is there little evidence from earlier periods of the development of more sophisticated methods of cooking than simply roasting in a hot pit or boiling in water with hot stones?

Jacob Bronowski may have answered that question in his enlightening book The Ascent of Man. The life of early nomads, such as the hunter-gatherers who existed for several million years or more, was a constant search for food. They were always on the move, following the wild herds. “Every night is the end of a day like the last, and every morning will be the beginning of a journey like the day before,” he wrote. It was a matter of survival. There simply was no time for them to innovate and create new methods of cooking. Being constantly on the move, they couldn’t pack up and carry heavy cooking utensils every day, even if they had invented them. Then, about 10,000 years before the last ice age ended, creativity and innovation finally began to flourish in spite of the restrictions of nomadic life. Food was becoming more abundant as the weather warmed, so early humans could gather it more easily without needing to move constantly.

*

With the end of the last ice age and the beginning of the Neolithic period, about 12,000 years ago, everything changed. Everything! It was the dawn of the agricultural revolution, when wandering nomads began to settle and turn into villagers. What made this possible? The discovery that seeds from new varieties of wild grasses that emerged after the end of the ice age, such as emmer wheat and two-row barley, could be gathered, saved, planted, and harvested the following season. This occurred first in an area known as the Fertile Crescent (Jordan, Syria, Lebanon, Iraq, Israel, and part of Iran). Enough food could now be harvested in 3 weeks to last an entire year!

Being able to harvest large quantities of food at one time meant these early farmers could no longer move from place to place; they had to build immovable structures for storing and protecting all the food, and this resulted in the creation of permanent settlements. The agricultural revolution then spread to other parts of the world over several thousand years.

Thanks to the pioneering research of the Russian scientist Nikolai Vavilov in the 1930s and the American scientist Robert Braidwood in the 1940s, we now know that over several thousand years people living in seven independent regions of the world domesticated crops and animals indigenous to that region. Unfortunately, Vavilov’s studies were prematurely ended when he was imprisoned in 1940 by the Stalinist government for his revolutionary views on evolution.

As the ice age was coming to an end around 12,000 years ago, early humans were harvesting wild wheat and barley in quantity in the Fertile Crescent, but there was no evidence of domesticated plants and animals. By domesticated, I mean plants and animals deliberately raised for food by humans rather than wild plants and animals gathered in the forests and fields. Then within a period of roughly 300 years, between 10,000 and 9,700 years ago, the first evidence of domesticated plants and animals began to appear in the southern Jordan Valley around the ancient settlement of Jericho.

In this relatively brief time period, the seeds of plants like wheat and barley became larger while the bones of animals became smaller. That’s how archeologists in the field can tell the difference—and it makes sense. As early humans began to select seeds to plant, they chose the larger seeds, which were storing more of the nutrients required for faster growth. The resulting crops grew faster to outcompete the wild weeds and provided higher yields—and in turn produced still larger seeds.

These early humans also selected wheat plants with terminal clusters of seeds that retained the kernels during harvest instead of allowing them to scatter in the wind like the wild varieties. The rachis, the short stalk that holds the seed to the plant, became shorter and thicker with time. DNA analysis confirms that the physical differences observed between domesticated and wild seeds originate in the plant’s genome. All these changes occurred as a result of human selection of plants with more desirable traits. These were the first plants to be genetically modified through human intervention. Similarly, domesticated goats and sheep were selected to be more docile and adaptable to living in a confined pen and feeding off the scraps of food left by their keepers. Thus they became smaller. These physical changes in domesticated plants and animals began to take shape as humans started to produce their own food.

The development of new foods and methods of cooking in the few thousand years following the emergence of agriculture illustrates how important this period was for the advancement of humans. The change from a nomadic life to a sedentary life in more secure settlements was critical, as it allowed humans to make significant achievements in technology and other areas. Within a few thousand years, small farming villages grew into large permanent settlements and then small cities. Jericho is perhaps the oldest permanent settlement, providing an accurate record of agricultural development between 10,000 and 9,700 years ago. Hunter-gatherers first settled there around 11,000 years ago in order to be near a constant source of water, a spring-fed oasis. Archeological excavations of the oldest buried sections of Jericho, which cover an area of a little less than ¼ acre (0.1 hectares), did not reveal any signs of domesticated seeds or animal bones.

By 9,700 years ago, the first domesticated seeds of emmer wheat and barley began to appear in higher levels of soil, and the earliest farming settlement had grown to an area of about 6 acres (2.5 hectares) with perhaps 300 people living in mud brick houses. By 8,000 years ago, Jericho was home to a permanent agricultural settlement of approximately 3,000 people occupying an area of 8–10 acres (3.2–4 hectares). About this same time, emmer wheat hybridized with a wild grass to produce bread wheat, which contained higher levels of the gluten-forming proteins required for making leavened bread. Wheat had finally emerged in the form in which it is still grown and used today around much of the world.

__________________________________

Excerpted from Cook, Taste, Learn: How the Evolution of Science Transformed the Art of Cooking © 2019 Guy Crosby. Used by arrangement with Columbia University Press. All rights reserved.