Here we use a hydrological model applied over the period 1916–2014 (all data are for water years unless specified otherwise) to evaluate the spatial and temporal signature of the Millennium Drought in the CRB. Along with a baseline simulation forced by gridded observations, we perform a T-detrend experiment, in which we remove the long-term temperature trend from the model forcings, to investigate the role of warming in streamflow declines, both over the long term and during the recent drought. We analyze runoff in each of 20 subbasins of the CRB, which allows us to study spatial variations in runoff generation and anomalies. We also analyze the historical 1953–1968 drought to shed light on how the hydrologic response to climate variations has changed in recent decades, and during the Millennium Drought in particular. Finally, we dissect the 2017 April–July streamflow forecast to understand the role of late winter and early spring precipitation and temperature in the substantial seasonal forecast reductions that occurred as water year 2017 progressed.

Several studies have investigated the effects of ongoing warming on the flow of the Colorado River. Barnett and Pierce (2009) concluded that anthropogenic climate change would reduce CRB runoff by 10%–30% by 2050. Reynolds et al. (2015) predicted that minimum streamflows will decline as warming of the basin continues. Woodhouse et al. (2016) reported an increase in the frequency of warm years with low streamflow since 1988. McCabe et al. (2017) found that increases in temperature since the late 1980s have decreased runoff generation efficiency, reducing streamflows by 7%. Udall and Overpeck (2017) similarly found temperature-induced streamflow decreases of approximately 6% during 2000–2014 and projected large midcentury temperature-induced declines of 20% or more should precipitation not change.

A pronounced warming trend across the Colorado River Basin (CRB) since the 1970s (Dawadi & Ahmad, 2012) has further contributed to the post-2000 imbalance between CRB runoff and water demand. Vano et al. (2012) evaluated the temperature sensitivity (annual average streamflow change per 1 °C temperature change) and found that the average sensitivity of annual runoff at Lees Ferry was around −5%/°C, suggesting that warming over the last ~50 years may account for a 5–10% reduction in annual streamflow over that period.
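The sensitivity calculation is first order: the −5%/°C sensitivity multiplied by the observed warming gives the implied streamflow change. A minimal sketch follows; the 1 and 2 °C warming values are illustrative assumptions chosen to reproduce the 5–10% range quoted above, not measurements from this study.

```python
# Back-of-envelope streamflow change implied by a linear temperature
# sensitivity, following the Vano et al. (2012) framing. The -5 %/degC
# sensitivity is from the text; the warming amounts are illustrative.

def flow_change_percent(sensitivity_pct_per_degC: float, warming_degC: float) -> float:
    """First-order streamflow change (%) implied by a linear T sensitivity."""
    return sensitivity_pct_per_degC * warming_degC

low = flow_change_percent(-5.0, 1.0)   # 1 degC of warming -> -5 %
high = flow_change_percent(-5.0, 2.0)  # 2 degC of warming -> -10 %
print(low, high)  # -5.0 -10.0
```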

The Colorado River is heavily regulated, mostly by Glen Canyon Dam (Lake Powell) and Hoover Dam (Lake Mead), which have a combined reservoir storage capacity of 67.5 km3 (54.7 maf). The importance of these reservoirs, which can store close to 4 times the natural annual flow at Lees Ferry, AZ, has become especially evident during the so-called Millennium Drought, which began around 2000. The drought has coincided with increases in water demand (Rajagopalan et al., 2009), and Lake Mead reached its lowest level on record in October 2016. Lakes Mead and Powell dropped precipitously from 2000 to 2004 due to very low flows in those years (71%, 74%, 41%, 71%, and 64% of average, respectively) and have not recovered, due to continued demands that have roughly equaled inflows and a lack of high-flow years. Indeed, only 4 of the last 18 years have had above-average river discharge, limiting reservoir refill opportunities.

The Colorado River is the largest river in the southwestern U.S. It is the source of drinking water for many of the Colorado River Basin's 40 million people and provides irrigation water to ~13,000 km2 of crops in the U.S. and Mexico (Cohen et al., 2013). It is a lifeline for the population and agricultural economy of parts of seven U.S. states (WY, UT, CO, NV, NM, AZ, and CA) and the Mexican states of Sonora and Baja California. The river's naturalized streamflow (see section 2.2 for a discussion of naturalized streamflows) at Imperial Dam (the most downstream long-term gauging station) has averaged about 20.7 km3/yr (16.8 maf/yr) over the last century, approximately 90% of which is generated in the Upper Colorado River Basin (McCabe & Wolock, 2007), defined as the ~289,000 km2 of drainage area upstream of the U.S. Geological Survey stream gauge at Lees Ferry, AZ (USGS 09380000). Snowpack stored in the high-elevation Rocky Mountain headwater basins contributes about 70% of the annual streamflow (Christensen et al., 2004).

Figure 2. Annual time series and linear regression trend plots for the Colorado River Basin above Lees Ferry: (a) annual (naturalized) runoff, (b) annual precipitation, and (c) annual average surface temperature calculated by VIC. Changes are calculated relative to the starting value of the fit. Note that precipitation (b) is from an extended version of the Hamlet and Lettenmaier (2005) data set at 1/16th-degree spatial resolution, while temperature (c) is calculated by VIC and is approximately 0.4 °C warmer than the Hamlet and Lettenmaier input temperature.

Figure 2a shows the annual time series of naturalized streamflow (NFL) and VIC simulations at Lees Ferry, AZ. The annual naturalized streamflows and VIC simulations agree well (r2 = 0.75), as do their trends over the period of record (NFL: −3.3 km3/yr; VIC: −3.4 km3/yr), suggesting that the VIC model provides a plausible representation of natural conditions (i.e., those responding primarily to climate forcings) and of long-term hydrologic change in the basin. Hereafter, we focus mainly on VIC results in our analysis of UCRB subbasin long-term (1916–2014) trends and in comparing the 1953–1968 drought with the Millennium Drought. The annual precipitation and average temperature time series (the latter calculated by VIC as noted in section 2.1) are also presented in Figure 2.

Table 1 summarizes the long-term runoff contribution percentages from nine major subbasins for which naturalized streamflows are available and for which we also produced VIC simulations. The runoff contribution percentages from the model and the naturalized flows are generally in good agreement. The Upper Basin (UCRB; defined as the drainage area above Lees Ferry, AZ) produces more than 90% of the flow at Imperial Dam. We therefore focus mainly on the UCRB here, acknowledging unusual Lower Basin (LCRB) conditions where noteworthy.

Our analysis excludes the Gila River, given its distinct hydrological and legal characteristics. The Gila joins the Colorado River below Imperial Dam, just upstream of the U.S. border with Mexico, and in recent years has been mostly dry at its mouth due to upstream uses by Arizona. Since 1964, the U.S. Supreme Court has excluded it from administration under the Colorado River Compact. Although the Gila is an important basin, its exclusion from this study is consistent with its unique status.

We performed our analyses for the Colorado River above Imperial Dam, as well as for the 20 subbasins delimited by USGS WaterWatch gauges (see Figure 1), which are a subset of the 29 naturalized streamflow points noted above. The river channel network data set we used is from Wu et al. (2012), from which we derived the masks for each of the 20 subbasins. The Wu et al. subbasins are similar to, but slightly different from, the more familiar six-digit Hydrologic Unit Codes normally used in the basin. Detailed information about each subbasin is reported in the supporting information.

To evaluate our model simulation results, we used naturalized streamflow data for the Colorado River produced by the U.S. Bureau of Reclamation (USBR); see https://www.usbr.gov/lc/region/g4000/NaturalFlow/current.html for details. The naturalized streamflows are derived from USGS historical streamflow observations through a process of adjustments that compensate for anthropogenic effects, including consumptive uses of water, reservoir storage, transbasin diversions, and other effects (see USBR, 1983). The naturalized streamflow data sets are produced for 29 well-distributed tributary stations across the CRB (as well as the main stem) for the period 1906 through 2015. Prairie and Callejo (2005) noted that USBR improved the quality of the naturalized flow data set after 1971, so the estimates may be somewhat better after that time.

The VIC model simulates surface hydrological processes with parameterizations of subgrid vegetation, soil variability, and topography and has provided plausible representations of CRB surface water conditions in the above-referenced studies. We forced the model with an updated version of the Hamlet and Lettenmaier (hereafter H&L) data set (Hamlet & Lettenmaier, 2005) at 1/16° resolution for water years 1916–2014. We chose the H&L data set because its long-term variability is indexed to the U.S. Historical Climatology Network (HCN; Easterling et al., 1996) stations in the region, which have been carefully quality controlled for effects that could otherwise produce spurious trends, such as station moves and instrument changes (e.g., the shift to maximum-minimum temperature system (MMTS) sensors in the 1980s). As described in Hamlet and Lettenmaier (2005), the H&L data set uses HCN station data to constrain decadal variability (and hence long-term trends) and is therefore, in our view, the most appropriate forcing data set for exploring the causes of century-scale streamflow declines over our 1916–2014 study period.

Similar to other land surface models, the fundamental water balance in VIC can be summarized as runoff (RO) = precipitation (P) − evapotranspiration (ET) − change in soil moisture (ΔSM) − change in snow water equivalent (ΔSWE). Groundwater is not represented in the version of VIC we used; Rosenberg et al. (2013) found that including a groundwater parameterization had little effect on the model's streamflow simulations in the CRB. It is important to note that VIC represents snowpack sublimation within its winter ET. Sublimation is sparsely measured but nonetheless is important to some aspects of our study (Andreadis et al., 2009); we describe the model's performance with respect to sublimation in section 4.2. The VIC model has been applied successfully in a number of previous hydrological studies over the CRB and the U.S. Southwest (Christensen et al., 2004; Christensen & Lettenmaier, 2007; Mote et al., 2005, 2018; Vano et al., 2012, 2014).
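The water balance above reduces to a one-line computation of runoff as the residual of the other terms. A minimal sketch follows; the numeric values are illustrative placeholders, not VIC output or UCRB observations.

```python
# Sketch of the VIC-style annual water balance stated in the text:
# RO = P - ET - dSM - dSWE (all terms in km3 over the same period).

def runoff_km3(P: float, ET: float, dSM: float, dSWE: float) -> float:
    """Runoff as the residual of the annual water balance."""
    return P - ET - dSM - dSWE

# Illustrative year: a small soil moisture gain (dSM > 0) reduces runoff,
# while snowpack drawdown (dSWE < 0) releases water and adds to runoff.
ro = runoff_km3(P=110.0, ET=90.0, dSM=1.5, dSWE=-0.5)
print(ro)  # 19.0
```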

The Variable Infiltration Capacity (VIC) model is a physically based, semidistributed hydrological model that represents the land surface water and energy budgets over a grid mesh (here, 1/16th-degree spatial resolution) and routes runoff through a prescribed river network to produce streamflow estimates at specified river nodes (Liang & Lettenmaier, 1994). We applied the model at a daily time step in what is termed full-energy-balance mode, meaning that the model iteratively solves the surface energy budget by estimating the effective surface temperature at each time step. The daily average surface temperature produced by VIC is therefore not the average of the forcing temperatures, that is, 0.5 × (daily maximum + daily minimum). Unless stated otherwise, the temperatures we report here are outputs from the VIC simulations.
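To illustrate why a surface temperature obtained by solving an energy budget differs from the mean of the forcing air temperatures, here is a toy budget (net shortwave + incoming longwave = outgoing longwave + sensible heat) solved for Ts by bisection. The budget form and all parameter values (emissivity, a lumped exchange conductance, the forcing numbers) are illustrative assumptions and are far simpler than VIC's actual formulation.

```python
# Toy surface energy budget: solve for the effective surface temperature Ts
# such that absorbed energy balances outgoing longwave plus sensible heat.
# All parameters here are illustrative, not VIC values.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m-2 K-4
EMISSIVITY = 0.98

def surface_temperature(sw_net: float, lw_down: float, t_air_k: float,
                        conductance: float = 15.0) -> float:
    """Solve sw_net + lw_down = EMISSIVITY*SIGMA*Ts**4 + conductance*(Ts - Ta)
    for Ts (K) by bisection; conductance (W m-2 K-1) lumps turbulent exchange."""
    def residual(ts: float) -> float:
        return (sw_net + lw_down
                - EMISSIVITY * SIGMA * ts**4
                - conductance * (ts - t_air_k))
    lo, hi = 200.0, 350.0  # residual is positive at lo, negative at hi
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0:   # surplus incoming energy: Ts must be higher
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# With these forcings, Ts settles several degrees above the air temperature,
# so its daily mean need not equal 0.5*(Tmax + Tmin) of the forcings.
ts = surface_temperature(sw_net=150.0, lw_down=300.0, t_air_k=278.0)
```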

3 Results

3.1 Basin-Wide Trend Analysis

Table 2 summarizes long-term linear (regression) trends for the UCRB for four hydrological variables (precipitation, evapotranspiration, runoff, and 1 April snow water equivalent) from the baseline VIC simulation and the temperature-detrended (T-detrend) simulation. We also computed trends using the nonparametric Theil-Sen slope estimator (Sen, 1968; Theil, 1950) and found that they are generally in close agreement (Table S1); we therefore refer to the linear trends hereafter for convenience. The T-detrend simulation uses the same forcings as the baseline, except that the annual linear trends in the daily temperature maxima and minima are removed. We also computed each variable separately for the winter (October–March) and summer (April–September) seasons (all winters and summers mentioned hereafter are so defined).

Table 2. UCRB Annual and Seasonal Changes in Water Balance Variables Over Water Years 1916–2014, in km3/yr (km3 for SWE; T in °C) and Percentages Relative to the Starting Value of the Fit. "-D" columns are from the T-detrend simulation.

         P              T     ET             ET-D           RO              RO-D            SWE             SWE-D
Annual   1.5 (1.4%)     1.8   4.2 (4.7%)     2.3 (2.6%)     −3.4 (−16.5%)   −1.6 (−7.7%)    −9.1 (−39.0%)   −5.6 (−23.9%)
Winter   −0.1 (−0.2%)   1.9   4.9 (30.5%)    2.9 (18.0%)    0.4 (10.4%)     0.4 (9.0%)      NA              NA
Summer   1.6 (3.0%)     1.7   −0.8 (−1.1%)   −0.6 (−0.8%)   −3.8 (−23.3%)   −1.9 (−11.9%)   NA              NA

Over the simulation period 1916–2014, UCRB annual precipitation increased by +1.5 km3 (1.4%), whereas winter precipitation, the main source of 1 April snow water equivalent and of spring and summer streamflow, had only a very small (not statistically significant) negative trend (long-term ΔP of −0.1 km3, −0.2%). In our baseline simulation, the long-term linear change in annual runoff (ΔRO) in the UCRB is −3.4 km3 (−16.5%) and the long-term change in annual evapotranspiration (ΔET) is +4.2 km3 (+4.7%).
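The two trend estimates compared above (an ordinary least squares fit and the Theil-Sen median of pairwise slopes), along with the percent change computed relative to the starting value of the fit, can be sketched as follows. The synthetic runoff series below is illustrative only, not UCRB data.

```python
import numpy as np

# Synthetic century-long annual "runoff" series with an imposed decline
# (-0.034 km3/yr) plus noise; purely illustrative.
rng = np.random.default_rng(0)
years = np.arange(1916, 2015)
runoff = 20.0 - 0.034 * (years - 1916) + rng.normal(0.0, 1.5, years.size)

# Ordinary least squares linear trend.
ols_slope, ols_intercept = np.polyfit(years, runoff, 1)

# Theil-Sen estimator: median slope over all point pairs (robust to outliers).
i, j = np.triu_indices(years.size, k=1)
ts_slope = np.median((runoff[j] - runoff[i]) / (years[j] - years[i]))

# Percent change over the record, relative to the starting value of the fit
# (the convention used in Table 2).
start = ols_slope * years[0] + ols_intercept
total_change = ols_slope * (years[-1] - years[0])
pct_change = 100.0 * total_change / start
```

With well-behaved series like this one, the two slope estimates agree closely, which is the behavior reported for the UCRB trends (Table S1); Theil-Sen mainly matters when a few extreme years would otherwise pull the least squares fit.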
The 1 April SWE decreased significantly (ΔSWE of −9.1 km3, −39.0%), which reduces warm season streamflow from the Upper Basin, as evidenced by summer RO decreases (−3.8 km3, −23.3%) even given a positive trend in summer precipitation (summer ΔP of +1.6 km3). Because summer RO makes up more than three quarters of annual RO in the UCRB, the long-term annual ΔRO is negative as noted above, although the summer RO decreases are slightly compensated by increasing winter RO (winter ΔRO of +0.4 km3, 10.4%).

We performed the T-detrend simulation using the same precipitation as the baseline simulation but with the temperature trend removed from the forcing data set on a grid-cell-by-grid-cell basis. In this no-warming-trend scenario, the long-term decreasing trend in annual runoff is reduced from −3.4 km3 to −1.6 km3 (−7.7%), but not eliminated. This suggests that 53% (1.8/3.4) of the annual runoff trend is attributable to warming temperatures. The increase in ET in the T-detrend simulation is smaller by 1.9 km3 (baseline: +4.2 km3; T-detrend: +2.3 km3), which explains the runoff difference (1.8 km3) to within 0.1 km3. The numbers in Table 2 also show that the effects of the temperature trend on winter RO (baseline: +0.4 km3; T-detrend: +0.4 km3) and summer ET (baseline: −0.8 km3; T-detrend: −0.6 km3) are small. Increasing temperatures cause a decrease in summer RO (baseline: −3.8 km3; T-detrend: −1.9 km3) and an increase in annual ET (baseline: +4.2 km3; T-detrend: +2.3 km3) that comes mostly in winter (baseline: +4.9 km3; T-detrend: +2.9 km3). On a percentage basis, both of these increasing winter ET trends are substantial over the 1916–2014 period: a 30% increase in the baseline simulation and an 18% increase in the T-detrend simulation. The summer ET changes of −1.1% and −0.8% are comparatively small. It is worth noting that the long-term trend in UCRB winter ET is positive in the T-detrend simulation even given no significant trend in winter precipitation.
The positive trend in winter ET is mainly caused by increased snow sublimation. Although sublimation is strongly controlled by surface temperature, other factors contribute as well (see section 4). The remaining −1.6 km3 (−7.7%) decrease in RO in the T-detrend simulation is curious given the increasing summer precipitation (summer ΔP of +1.6 km3, 3.0%) and negligible winter precipitation change (winter ΔP of −0.1 km3, −0.2%). In addition, although the SWE decline in the T-detrend simulation is smaller than in the baseline simulation (baseline: −9.1 km3; T-detrend: −5.6 km3), the long-term 1916–2014 SWE trend is still negative in the T-detrend simulation (−23.9%). Winter ΔET in the T-detrend simulation is only +2.9 km3 (Table 2), which cannot explain all of the SWE decline. One possible explanation is that while basin-wide precipitation changes over time are small, precipitation has declined in the most productive subbasins while increasing in the less productive ones. We explore the effects of such spatial variations below.

3.2 Subbasin Conditions

Figure 1 shows that four subbasins in the upper CRB (denoted by red numbers) produce most of the UCRB runoff: the Yampa River, Colorado River near Cameo, Gunnison River, and San Juan River (from north to south). The most productive subbasin is the Colorado River near Cameo (USGS 09095500) in the northeastern part of the UCRB, which produces almost one quarter of the total naturalized runoff of the UCRB. It contains not only the mainstem but also several large tributaries, including the Eagle, the Roaring Fork, and the Blue. A little more than 30% of the UCRB flow is produced by the other three subbasins, so in total about 55.5% of the total discharge of the UCRB is attributable to these four tributaries. Below, we discuss the nature of the long-term changes in these critical subbasins.

Figure 4 shows annual precipitation, ET, and runoff changes for all subbasins over the 1916–2014 study period. The top row is extracted from our baseline simulation and the bottom row from the T-detrend simulation. Although some subbasins appear similar between the baseline and T-detrend maps, the numbers differ more than visual inspection of the maps might suggest (Tables S3 and S4). As in Table 2, we calculated the changes relative to the initial value of each linear fit. Figure 4a shows a noteworthy east-west dipole in the precipitation changes over time in the UCRB: precipitation decreases have occurred mainly in the high-runoff-generating northeastern part of the basin, while several subbasins in the northwestern part of the UCRB show long-term annual precipitation increases. Precipitation declines have also occurred in the LCRB, where little runoff is generated; these decreases led to declines in ET and little change in subbasin runoff (Figures 4c and 4f), with negligible impact on total basin runoff (e.g., at Imperial Dam).
Two subbasins in the northeastern part of the UCRB have relatively large annual precipitation decreases of −2.3 km3 (Colorado River above Cameo) and −0.7 km3 (Gunnison River), with a combined runoff decrease of −2.9 km3 (supporting information). These are the same highly productive subbasins shown in Figure 1 and are a major driver of the overall annual runoff decline. Four basins in the northwestern part of the UCRB with increasing precipitation (the downstream portion of the Green River along with its San Rafael River and Duchesne River tributaries; colored in deeper blues in Figure 4) have partially offset these long-term runoff declines, by about 1.0 km3.

Figures 5 and 6 are similar to Figure 4 but for winter (October–March) and summer (April–September), respectively. Winter runoff changes are small for both the baseline and T-detrend simulations, as most runoff occurs during the summer season. Although total precipitation amounts are similar in the warm and cold seasons, winter precipitation is much more important to the UCRB's runoff. Summer precipitation mainly contributes to ET rather than runoff, as high summer temperatures lead to large ET, especially at lower elevations. Winter precipitation in mountain headwater regions accumulates as snowpack and, when it melts, contributes mostly to RO rather than ET. The 1 April SWE trend plots for all subbasins (Figures 3a and 3b) show that the four highly productive subbasins (Yampa River, Colorado River near Cameo, Gunnison River, and San Juan River) in the northeastern part of the basin, which contribute much of the runoff losses in the UCRB, have all experienced substantial SWE decreases. These subbasins are also snow-dominant regions, as indicated by Figure 3c. Figure 5a shows that winter precipitation has declined in all of the northeastern UCRB subbasins except the San Juan River, which shows a positive winter precipitation trend.
Nonetheless, both SWE (Figure 3a) and annual RO (Figure 4c) in the San Juan Basin are decreasing. The reason is that winter ET has increased substantially: winter ΔP is +0.4 km3, while long-term winter ΔET is +1.1 km3, with SWE decreasing by −0.7 km3, or −30.1%. Declines in SWE in the other three basins, all of which experienced declines in precipitation, are more severe, ranging from −46% to −49%. The increased winter ET, along with the reductions in precipitation in these basins, explains the strongly decreasing SWE and substantially explains the declines in subbasin runoff.

Figure 3. Spatial plots of 1 April SWE trends for (a) the baseline simulation and (b) the T-detrend simulation over each subbasin. The changes over 1916–2014 are calculated relative to the starting value of the linear regressions. (c) Long-term average 1 April SWE.

Figure 4. Spatial changes of (a) annual precipitation from gridded observations, (b) ET, and (c) runoff from the baseline VIC simulation over 1916–2014 for the CRB above Imperial Dam. Changes are calculated relative to the starting value of the linear fits. Panels (d)–(f) are the same as (a)–(c), but variables are extracted from the T-detrend simulation. Panels (a) and (d) are identical.

Figure 5. Same as Figure 4 but for winter (October–March).

As noted above, 53% (1.8 of 3.4 km3) of the long-term runoff trend in the UCRB is related to warming temperatures. To dissect the remaining −1.6 km3 (47% of the trend) in the T-detrend simulation, we performed a P&T-detrend experiment, in which we removed both the temperature and the winter precipitation trends from the original input data set. Importantly, under this experiment the northeastern UCRB basins see increased winter precipitation, while the northwestern basins see decreased winter precipitation, relative to the baseline and T-detrend simulations.
Note also that we did not modify summer precipitation, which increased over the study period. Under the P&T-detrend simulation, the UCRB's long-term runoff loss becomes −0.6 km3 (1.0 km3 less than in the pure T-detrend simulation and 2.8 km3 less than in the baseline). The residual −0.6 km3 loss over the 1916–2014 period is attributable to increased winter ET; section 4.2 below evaluates why winter ET shows a positive trend given no P trend and no T trend. The total runoff decline of −3.4 km3 can thus be attributed to warming (−1.8 km3), insufficient P in the northeastern part of the basin (−1.0 km3), and increased winter ET (−0.6 km3).

Summer precipitation and summer ET trend spatial plots (Figures 6a and 6d versus 6b and 6e) show similar patterns for both the baseline and T-detrend simulations: negative trends over the LCRB and the eastern UCRB, with some increases in the northwestern headwaters. The spatial patterns confirm that in summer, when surface air temperatures are relatively high, increases in precipitation drive increases in ET, and decreases in precipitation drive decreases in ET, over both the LCRB and the UCRB.

Figure 6. Same as Figure 4 but for summer (April–September).

In the UCRB, the baseline simulation April–September runoff (Figure 6c), which constitutes almost three quarters of the CRB annual total, shows spatial patterns similar to the SWE plots in Figure 3. Taken together, the figures show where water is stored as snow during winter in the cold, high-elevation headwater regions of the UCRB and how that SWE contributes to runoff in the following spring and summer. Over the last century, warming temperatures, reduced winter precipitation in the most productive mountain subbasins of the UCRB, and slight increases in winter ET (Figure 5b) have led to reduced SWE and consequently reduced runoff.
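The attribution above reduces to simple differencing of the experiment results: the baseline-minus-T-detrend difference isolates warming, the T-detrend-minus-P&T-detrend difference isolates the winter precipitation trend, and the P&T-detrend residual is the winter ET effect. A sketch of that arithmetic, using the km3 values quoted in the text:

```python
# Partition the baseline runoff decline using the detrend experiments.
# Values (km3 over 1916-2014) are the figures quoted in the text.
baseline_decline = -3.4     # baseline simulation
t_detrend_decline = -1.6    # temperature trend removed
pt_detrend_decline = -0.6   # temperature and winter precipitation trends removed

warming_effect = baseline_decline - t_detrend_decline          # -1.8 km3
winter_precip_effect = t_detrend_decline - pt_detrend_decline  # -1.0 km3
residual_winter_et = pt_detrend_decline                        # -0.6 km3

# Warming's share of the total trend (~53%, as stated in the text).
warming_share = warming_effect / baseline_decline
```

By construction the three components sum back to the baseline decline, which is the internal consistency check behind the "−1.8, −1.0, −0.6" breakdown.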
In the LCRB, the annual precipitation, ET, and runoff plots (Figure 4) show mostly P decreases, ET increases, and small RO changes. In winter (Figure 5), some P increases occur in the northwestern portion, ET increases everywhere except in the south, and RO changes little. Summer (Figure 6) shows decreasing P, increasing ET, and little RO change.