In a good year, the management of water resources in the American West is contentious. When a drought hits, most everyone feels it, and this year is certainly no exception. The notion of sustainability in water-strapped places isn’t much more complicated than balancing a checking account. And the budget projections aren’t exactly encouraging.

The last thing this situation needs is a decrease on the supply side. Unfortunately, precipitation in the Southwestern US is projected to decline as a result of anthropogenic climate change. Double unfortunately, the last century isn’t even a very good baseline for the region’s climate without climate change. Records from things like tree rings show drier periods in the past. A recent study led by Cornell’s Toby Ault attempts to pull this all together to improve our understanding of future drought risk in the region.

The worst US droughts of the 20th century were the 1930s “Dust Bowl” drought in the central US and the 1950s drought in the Southwest. In the past, the Southwest has averaged one or two of these almost-decade-long droughts per century, but there have also been droughts lasting several decades, longer than anything in the historical record.

In the 1150s, for example, reconstructions tell us that the Southwest was in the midst of almost 25 years of below-average precipitation. For a solid decade, the Colorado River averaged only about 85 percent of its normal flow. Since Arizona is allocated about 15 percent of the Colorado’s water (which now rarely makes it to the Gulf of California before drying up), that 15 percent shortfall amounts to a decade without Arizona’s entire share.
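
As a rough back-of-the-envelope check, here’s that arithmetic in a few lines of Python. The flow and allocation figures are illustrative assumptions (roughly the Colorado River Compact’s apportionments), not numbers from the study.

```python
# Back-of-the-envelope check of the "decade without Arizona's share" claim.
# Assumed figures: roughly 16.5 million acre-feet (MAF) apportioned per
# year in total, with Arizona's allocation at about 2.8 MAF (~15-17%).
NORMAL_FLOW_MAF = 16.5    # assumed normal annual flow
ARIZONA_SHARE_MAF = 2.8   # assumed Arizona allocation
DROUGHT_FRACTION = 0.85   # a decade at about 85% of normal flow

annual_shortfall = NORMAL_FLOW_MAF * (1 - DROUGHT_FRACTION)  # ~2.5 MAF/yr
print(f"Annual shortfall: {annual_shortfall:.1f} MAF/yr, "
      f"vs. Arizona's share of {ARIZONA_SHARE_MAF} MAF/yr")
print(f"Over a decade: {10 * annual_shortfall:.0f} MAF missing")
```

The annual shortfall works out to roughly the size of Arizona’s allocation, which is the sense in which a decade at 85 percent of normal flow is “a decade without Arizona’s share.”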

Climate models simulate the year-to-year variability in precipitation pretty realistically, but compared to those reconstructions, they produce too few of the rarer multidecadal extremes. So while the models tell us that the Southwest should see a drying trend as the climate warms, they may underestimate the possibility of longer droughts.

For a better handle on the risk, the researchers applied the statistics of past variability, drawn from several reconstructions and the historical record, to the model-projected drying trends for the rest of this century. Using that information, they generated a thousand Monte Carlo simulations and counted how frequently droughts of a given length should occur.
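
To make that approach concrete, here’s a minimal sketch in Python, not the paper’s actual code: superimpose year-to-year noise (standing in for the statistics of past variability) on an assumed drying trend, then count how many simulated futures contain a dry run of a given length. The trend, variability, and threshold values are illustrative assumptions, and a real analysis would use noise with the long-term persistence seen in the reconstructions rather than simple white noise.

```python
import numpy as np

rng = np.random.default_rng(42)

N_SIMS = 1000   # Monte Carlo realizations, matching the study's count
YEARS = 85      # rest of the century, roughly 2015-2100
TREND = -0.15   # assumed fractional decline in mean precipitation by 2100
SIGMA = 0.10    # assumed year-to-year variability (standard deviation)

def longest_dry_run(precip, threshold=1.0):
    """Longest consecutive run of years below the 'normal' threshold."""
    longest = current = 0
    for p in precip:
        current = current + 1 if p < threshold else 0
        longest = max(longest, current)
    return longest

# Each realization: the projected downward trend plus random variability.
# (White noise is a simplification; tree-ring records imply more
# persistence, which would make long droughts likelier still.)
trend = 1.0 + TREND * np.arange(YEARS) / YEARS
sims = trend + rng.normal(0.0, SIGMA, size=(N_SIMS, YEARS))

runs = np.array([longest_dry_run(s) for s in sims])
print(f"P(dry run >= 10 yr): {np.mean(runs >= 10):.0%}")
print(f"P(dry run >= 35 yr): {np.mean(runs >= 35):.0%}")
```

Varying TREND and SIGMA shows how sensitive the long-drought odds are to the assumed trend and variability, which is exactly why the authors anchored those statistics in paleoclimate reconstructions rather than in the models alone.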

Climate model simulations alone give a less than 50 percent chance of a decade-long drought in the American Southwest this century. The Monte Carlo analysis, however, puts the odds in the neighborhood of 80 percent, exceeding 90 percent in some areas.

The probability of a 35-year “megadrought” was 10 to 50 percent, depending on which scenario of future emissions is used. That would be the worst drought the region has seen in at least 2,000 years. Even the odds of an unthinkable 50-year drought were 5 to 10 percent in the business-as-usual emissions scenario.

The researchers actually characterize these numbers as conservative because they only assessed precipitation, while rising temperatures will also boost evaporation and intensify droughts.

This study isn’t about predictions derived from complete knowledge. (It certainly won’t be the last study on the topic.) It’s about describing risk, which is fundamentally an intelligent expression of uncertainty. Water managers need to understand these risks in order to effectively plan for the uncertain future and ensure that there’s enough water for the always-thirsty West—as well as other regions around the world that will face similar challenges.

Journal of Climate, 2014. DOI: 10.1175/JCLI-D-12-00282.1