Layers of charcoal residue buried beneath the northern Montana prairie show that pre-Columbian indigenous hunters on the Great Plains once burned patches of grassland to stimulate new growth, creating a tempting feast that lured bison herds in for the kill. And that, archaeologists say, means that even relatively small, mobile groups of hunter-gatherers can have a bigger environmental impact than they’ve been given credit for.

It’s not the fall, it’s the sudden stop at the end

For a group of hunters on foot, like the ancestors of today’s Blackfeet people, one of the most efficient ways to take down large prey like the American bison is to simply chase a group of them off a cliff and then harvest the remains below. Various hunter-gatherer societies around the world have used versions of this tactic over the last several thousand years, leaving piles of animal bones (many with evidence of butchering or cracking to get at marrow) at the bases of ancient bluffs. It takes planning and coordination among many hunters—and a decent amount of luck.

In the uplands of north-central Montana, on what is today the Blackfeet Reservation, pre-Columbian hunters built mile-long stretches of rock cairns called drivelines, which funneled buffalo herds from fertile grazing patches called gathering basins toward the edge of a steep bluff overlooking a tributary of the Two Medicine River. At two different driveline sites, archaeologists have radiocarbon dated bison bones to between 900 and 1650 CE, with the majority of kills happening in the final 250 years of that period. (One site sits on a tributary flowing into the Two Medicine River from the north, the other on a different tributary flowing in from the south.)

And at the same time, evidence suggests that those same hunters were burning patches of the prairie to spur the growth of fresh, tasty new grass in the gathering basins to lure in herds of hungry bison. Southern Methodist University archaeologist Christopher Roos and his colleagues, in a project designed by John Murray of the Blackfeet Tribal Historic Preservation Office, studied layers of sediment exposed in the riverbed walls of two tributaries of the Two Medicine River.

Each tributary would have had one of the drivelines and gathering basins in its drainage area, so those layers of sediment record what was happening in the gathering basin and along the driveline. At each site, the team, which included members of the Blackfeet Tribe, found between five and eight layers of charcoal residue, a sure sign of nearby prairie fires. These were radiocarbon dated to between 1100 and 1650 CE—the heyday of the bison jumps.

Burning grass, smoking gun

The prairie sometimes burns naturally, as lightning strikes can spark wildfires. This is common in years when the Great Plains have seen enough rainfall to make the grass grow well, creating ample fuel for burning. Wildfires scorched these lands long before people built drivelines to help them hunt bison, and wildfires are still a vital force on the modern prairie. But the layers of charcoal residue in the riverbed walls don’t look like the product of natural wildfire seasons.

Roos and his colleagues examined sediment layers going back at least 500 years before the first mass bison kills at either site, and they didn’t find any distinct layers of charcoal residue—just the usual traces of charcoal mixed in with normal layers of sediment. And in sediment deposited after 1650, the charcoal layers were also conspicuously absent. So there was a distinct class of fires that could lay down a substantial layer of charcoal, and these only happened during the years hunters were using the drivelines, which the archaeologists say points to a connection.

The most likely explanation is that hunters deliberately burned the gathering basins several months before a hunting season—this probably meant a spring burn to prepare for fall hunts, or perhaps a fall burn to prepare for spring hunts. Fire leads to new growth, something bison find hard to resist. On the modern prairie, ranchers practice controlled burns every year, or every few years, to promote healthy new grass for cattle grazing. It would have been a smart strategy for bison hunters; arranging a tempting grass buffet for the bison helped increase the odds that a herd would actually gather in the gathering basin, which made a successful hunt much more likely.

How should we define the Anthropocene?

The prairie fires, then, would have been precise, controlled burns in carefully chosen spots, not indiscriminate burning of a whole landscape. Even so, the fires had an impact on the prairie ecosystem.

“Burning of this kind in fescue prairies changes the composition of the grasses and forbs,” Roos told Ars. That doesn’t make the burned area more or less diverse than an unburned area, he explained. “[But] what it does do is create patchiness in the grassland with different compositions, thereby creating greater between-patch diversity. This can be especially important for small animals.”

That means that even a hunter-gatherer society with relatively low population density had a significant impact on the local ecosystem. (Early European estimates, from just after populations had mostly recovered from a 1781-1782 smallpox epidemic, put the average hunting band at about 150 people and a large winter camp at about 2,000 people.) That effect was significant enough, in fact, to find its way into the geological record—which is a key part of how scientists now define the geological epoch called the Anthropocene, in which human activity is a major force shaping Earth’s environment.

Several archaeological finds in recent years have suggested that the Anthropocene began earlier than we first suspected, dating back not to the Industrial Revolution but possibly to the earliest agricultural societies. This “Ancient Anthropocene” model suggests that when those early farmers cleared forests for fields, and when they used fires for cooking and warmth in much larger concentrations than before, they released enough carbon into the atmosphere to have an impact on early Holocene climates.

Of course, it’s not likely that hunters on the Great Plains, burning portions of the prairie to attract bison herds, would have released enough carbon to have a real impact on climate. But this is another sign that humans have been leaving a lasting mark on the environment for much longer than we realized. William Ruddiman, who first proposed the Ancient Anthropocene hypothesis, has also suggested that a more flexible definition of “anthropocene” may be more useful than pinning a firm start date onto an Anthropocene-with-a-capital-A geological epoch.

“My take on this is that our evidence of hunter-gatherer manipulation of their environment (including, potentially, carbon budgets and carbon cycling through fire use) provides greater weight to the ‘fuzzy’ and evolving definition of the anthropocene, rather than a fixed geologic unit of time,” Roos told Ars.

Lessons from the past

The telltale layers of charcoal also point to a complex, nuanced interplay between humans and climate in shaping how fires affected the Great Plains several hundred years ago. Scientists who study wildfires often see climate and human activity as opposing forces. For instance, people may burn regularly when climatic forces would lead to few fires, which in turn may help reduce how much fuel is available to burn when an extreme fire season rolls around.

But on the Great Plains, according to Roos and his colleagues, bison hunters’ fires actually amplified the effects of climate on the prairie’s natural fire cycle. Natural wildfires are more likely, and generally larger, in seasons when there’s more grass to burn. And since rain makes the grass grow, wetter decades are, ironically, linked to more burning.

Of the 13 charcoal deposits in the study, six were associated with periods when the climate on the Great Plains took a wetter turn (on a scale of decades). The other seven were associated with shorter wet periods or with much slighter increases in rainfall. In the context of the bison jumps, it looks like the hunters who built and used the drivelines took the opportunity to burn their gathering basins whenever there was enough grass to burn. Thanks to human activity, relatively slight increases in average rainfall had a much bigger impact on the frequency of fires in the area than they otherwise would have.

That understanding of how humans, even in relatively small groups, interacted with climate patterns to produce fire regimes that in turn shaped an ecosystem holds some lessons for managing grasslands today.

“I tend to think that the past offers general lessons about contemporary problems rather than specific lessons. One lesson from our work is that this landscape (fescue grasslands) can sustain pretty intensive fire use and that fire can be used to successfully manage grazing in this environment,” Roos told Ars. “This is consistent with some of the arguments coming out of Oklahoma State University, where they are arguing for patch burning to manage the grazing patterns of bison and cattle.”

Future studies might shed more light on how widespread the practice of burning gathering basins to lure bison herds actually was in North America.

“Late Holocene bison jumps are found more broadly than just the northwestern Plains. Was fire used as part of those hunting strategies, too?” said Roos. “What impact did this burning have on the fire regimes of the landscapes between clusters of driveline complexes? I mention food webs in the paper, and it would be really great to know what the consequences of this burning strategy were on other plant and animal species.”

PNAS, 2018. DOI: 10.1073/pnas.1805259115 (About DOIs).