Growing degree days/heat accumulation

Growing degree days (GDD) accumulated over the annual period, averaged for 1900–2014, are presented in Fig. 1a. The definition of GDD we employed does not impose an upper limit on daily GDD and hence includes the contribution of extreme heat days in the accumulated GDD magnitudes. Generally, the agricultural GDD (AGDD) follows a clear latitudinal pattern, with magnitudes increasing from north to south, with some exceptions: the Rocky Mountain ranges have lower AGDD magnitudes than their surroundings, while the Mojave and Sonoran deserts in the west have naturally higher AGDD magnitudes than their surroundings due to high air temperatures. The maximum and minimum station-observed average AGDD were at Key West International Airport, Florida (5722 °C) and Dillon, Colorado (249 °C), respectively. Figure 2 presents the long-term average AGDD for different months of the year. On a national average basis, monthly GDD progresses from a minimum of 15 °C in January to a maximum of 423 °C in July and decreases thereafter. Site-specific values vary widely from the national average; nevertheless, they follow similar monthly trends.
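A minimal sketch of the daily accumulation described above follows. As stated, no upper cap is applied to daily GDD, so extreme heat days contribute fully; the base temperature of 10 °C used here is an illustrative assumption, not necessarily the value employed in the study, and the function names are ours.

```python
def daily_gdd(tmax, tmin, tbase=10.0):
    """Daily growing degree days (degC) from daily max/min temperature.

    Note: no upper limit is imposed on daily GDD, so extreme heat days
    contribute fully to the accumulated total.
    """
    tmean = (tmax + tmin) / 2.0
    return max(tmean - tbase, 0.0)  # negative values truncated to zero

def accumulated_gdd(tmax_series, tmin_series, tbase=10.0):
    """AGDD over a period (month or year): sum of daily GDD values."""
    return sum(daily_gdd(tx, tn, tbase)
               for tx, tn in zip(tmax_series, tmin_series))
```

Summing `accumulated_gdd` over a calendar year at each station, then averaging over 1900–2014, would yield the long-term AGDD fields mapped in Figs 1a and 2.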

Figure 1 Map showing (a) spatial distribution of long-term average annual accumulated growing degree days (AGDD); (b) temporal trends in annual accumulated growing degree days during the period 1900–2014 across CONUS. We created the maps using ESRI ArcMap 10.4.1 software http://desktop.arcgis.com/en/arcmap/.

Figure 2 Spatial distribution of long-term average accumulated growing degree days (AGDD) for different months of the year. We created the maps using ESRI ArcMap 10.4.1 software http://desktop.arcgis.com/en/arcmap/.

The annual AGDD trends (deviation from mean annual AGDD) in the time domain (1900–2014) are presented on a national scale in Fig. 3. The deviation was initially close to zero, rose to a positive maximum in 1939, and declined thereafter into negative deviations until the end of the study period in 2014. The national time series is derived from observed data at 1218 sites and presents national changes in AGDD, but it should be acknowledged that the constituent sites show highly variable trends, and relying on a national series can conceal regional variations. These regional differences in annual AGDD trends can be seen in Fig. 1b. Overall, the AGDD trends can be divided into two regimes: positive trends in the western U.S. and negative trends in the eastern U.S., with the major exception of positive trends in the northeastern U.S. The southwestern U.S. generally shows relatively higher positive trends than the rest of the west. Several regions throughout the nation show trends in the opposite direction from their surroundings (pink spots in yellow regions and vice versa in Fig. 1b), which implies that the geographic patterns in temporal trends are highly variable. The site-specific extremes in temporal trends were 1276 °C century−1 (in California) and −632 °C century−1 (in Mississippi), while the national average temporal trend was only 19 °C century−1 because of the countering positive and negative changes in AGDD. These trends in annual AGDD result from varying trends in AGDD during different months, which could be distributed in a particular manner over the year. To examine this, Fig. 4 presents the temporal trends in AGDD on a monthly time scale. When averaged nationally, positive (increasing) monthly AGDD trends were observed for all months except January, September, and October, which had negative (decreasing) trends.
These trend values, however, cannot be compared fairly among months because the AGDD climatology differs considerably from month to month. In other words, trends in summer would be larger due to greater accumulation of GDDs from higher air temperatures and smaller in winter for the opposite reason; hence it would not be valid to make assessments by comparing these trends directly. To resolve this, we normalized the observed trend for each month by the average AGDD during that month, which enables comparisons among trends during different times of the year. The resulting maximum increasing trends were found in February, followed by November and December, whereas the decreasing trends were highest in January (2.5 times greater in magnitude than the maximum increasing trend), followed by October, while the rest of the months show comparable trend magnitudes (Figure S1). Although this analysis provides valuable insights into national-level monthly AGDD trends, it still does not provide information about regional-level dynamics in monthly AGDD trends. Figure 4 shows that significant spatial variability in monthly AGDD trends exists within the nation. In such scenarios, national-level trends tend to mask the finer-scale variability observed in Fig. 4. In other words, an indicator may show considerable spatial variability when studied at finer scales, but this variability can be concealed when the same indicator is studied on a coarser national scale. Thus, relying on national-level interpretations can be misleading. Instead, we recommend consulting the developed maps for site-specific trends, rather than relying on national-scale information for assessments.
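The normalization described above can be sketched as follows: a least-squares trend (°C per year) is scaled to °C per century and then divided by that month's mean AGDD, so that months with very different climatologies become comparable. This is a hedged illustration of the procedure as we read it; the function and variable names are ours, and the study's exact trend estimator is not specified here.

```python
import numpy as np

def normalized_monthly_trend(years, monthly_agdd):
    """Return (trend in degC per century, trend normalized by mean AGDD).

    years        : sequence of calendar years (e.g., 1900..2014)
    monthly_agdd : AGDD for one calendar month in each of those years
    """
    years = np.asarray(list(years), dtype=float)
    agdd = np.asarray(list(monthly_agdd), dtype=float)
    slope, _intercept = np.polyfit(years, agdd, 1)  # degC per year
    trend_per_century = slope * 100.0
    # Dividing by the month's climatological mean makes trends
    # comparable between, e.g., January and July.
    return trend_per_century, trend_per_century / agdd.mean()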

Figure 3 National-level trends in first fall frost (FFF), last spring frost (LSF), climatological growing season (CGS), and annual accumulated growing degree days (AGDD). The trends are shown as the deviation of each series from its 1900–2014 average. Each series represents a 10-year running average.

Figure 4 Map showing temporal trends in accumulated growing degree days (AGDD) for different months of the year during the period 1900–2014. We created the maps using ESRI ArcMap 10.4.1 software http://desktop.arcgis.com/en/arcmap/.

Annual frost dates

The occurrence of the first fall frost (FFF) and last spring frost (LSF), in terms of day of the year, across the CONUS is presented in Figs 5a,b, respectively. Both FFF and LSF follow a north-to-south latitudinal spatial trend, except for some extremes in the Rocky Mountain ranges. FFF occurs later in the year moving from north to south, while LSF occurs earlier in the year moving from north to south. The national average occurrences of FFF and LSF were day of year (DOY) 286 and DOY 115, respectively. The magnitudes of FFF vary by about 125 days across the CONUS, with extremes in Wyoming (DOY 214) and California (DOY 339). Similarly, the occurrence of LSF varies by about 146 days across the nation, with extremes in Colorado (DOY 187) and Florida (DOY 41).
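A sketch of how frost dates could be derived from a year of daily minimum temperatures follows. It assumes a frost day is one with Tmin ≤ 0 °C and splits the year at mid-year to separate spring from fall; the study's exact threshold and split rule may differ, and the names are illustrative.

```python
def frost_dates(tmin_by_doy, mid_doy=182):
    """Derive (LSF, FFF) in day-of-year from daily minimum temperatures.

    tmin_by_doy : dict mapping day of year -> daily Tmin (degC)
    Assumption: a frost day has Tmin <= 0 degC; LSF is the last such
    day in the first half of the year, FFF the first in the second half.
    Returns None for a half-year with no frost days.
    """
    frost_days = sorted(d for d, t in tmin_by_doy.items() if t <= 0.0)
    lsf = max((d for d in frost_days if d <= mid_doy), default=None)
    fff = min((d for d in frost_days if d > mid_doy), default=None)
    return lsf, fff
```

Averaging the resulting DOY values per station over 1900–2014 would produce the long-term fields mapped in Figs 5a,b.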

Figure 5 Spatial distribution of long-term average (1900–2014) (a) first fall frost (FFF); (b) last spring frost (LSF) and (c) climatological growing season (CGS) across CONUS. We created the maps using ESRI ArcMap 10.4.1 software http://desktop.arcgis.com/en/arcmap/.

The national-level deviations in the occurrence of FFF and LSF are shown in Fig. 3. The dates followed trends that are variable in time, with two periods of considerable deviations from the national average. Between these two periods, both dates occurred closer to the average, especially FFF (LSF occurred mostly later than average). One of these periods was before 1930, when the fall frosts occurred earlier (by 2–3 days) and the spring frosts occurred later (by 3–4 days). The other period was post-1990 for FFF and post-1970 for LSF, when FFF started occurring later and LSF started occurring earlier. On a national basis, the temporal trends in FFF and LSF were about 5 days century−1 (later occurrence) and −7 days century−1 (earlier occurrence), respectively. To segregate and decipher the national trends, we computed these trends throughout the CONUS at a spatial scale, and the results are presented in Figs 6a,b. For FFF, the majority of the nation has positive trends of up to 20 days century−1, which implies that FFF has occurred up to 20 days later over the century, while even greater trends (up to 40 days century−1) exist in small parts of the western U.S. In contrast, some regions, for example in the Midwest, southeast, south, and central southwest (colored green), show negative trends (up to −19 days century−1), implying that FFF has occurred earlier in these regions by up to 19 days over a century. On the other hand, LSF occurrence is shown to be earlier over most of the nation (by up to −19 days century−1), with some parts of the southwestern U.S. showing even greater negative trends (up to −39 days century−1). Delays in LSF (by up to 20 days century−1) were observed in scattered parts of the west, southwest, south, and southeast.
The maps developed in this section can be an invaluable resource for the scientific community, as well as for decision- and policymakers and resource managers, and can be employed to generate quantitative information on FFF and LSF trends at any site in the CONUS.

Figure 6 Temporal trends in (a) first fall frost (FFF); (b) last spring frost (LSF), and (c) climatological growing season (CGS) across CONUS during 1900–2014. We created the maps using ESRI ArcMap 10.4.1 software http://desktop.arcgis.com/en/arcmap/.

Climatological growing season

The climatological growing season, by definition, is the difference between the FFF and LSF dates (FFF minus LSF) and is reported in days. Hence, any spatial or temporal change in one or both of the annual frost dates will trigger a change in CGS as well. Figure 5c presents the spatial patterns associated with CGS across the CONUS; it follows a north-to-south (increasing) trend, similar to the latitudinal patterns of AGDD, FFF, and LSF. The national average CGS is about 170 days, ranging from a minimum of 29 days in Wyoming to a maximum of 289 days in California.

The national time series for the deviation of CGS from the national average (Fig. 3) exhibits a sharp increase from about a 5-day shorter CGS around 1915 to a 2-day longer CGS around 1940, a gradual decrease thereafter until a 3-day shorter CGS around 1975, and finally a sharp increase to a 10-day longer CGS in 2014. Overall, considering the temporal changes in CGS on a spatial level (Fig. 6c), the majority of the CONUS experienced lengthening of the climatological growing season by about 25 days century−1, while some regions, such as along the west coast and in the northern plains, showed even greater lengthening trends of up to 75 days century−1. However, some regions in the southeast, south, and central southwest show shortening of the CGS by about 24 days century−1. The extremes in observed station trends in CGS were in Montana (a lengthening of 96 days century−1) and Washington (a shortening of 47 days century−1), while on a national scale, the CGS lengthened at a rate of about 12 days century−1.
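The deviation series of Fig. 3 can be sketched as follows: subtract the long-term (1900–2014) mean from each annual value, then smooth with a 10-year running average, as stated in the Fig. 3 caption. The function name and implementation details are illustrative.

```python
import numpy as np

def deviation_series(annual_values, window=10):
    """Deviation of an annual series from its long-term mean,
    smoothed with a running average of the given window length."""
    x = np.asarray(list(annual_values), dtype=float)
    dev = x - x.mean()                      # deviation from 1900-2014 mean
    kernel = np.ones(window) / window       # moving-average weights
    return np.convolve(dev, kernel, mode="valid")
```

Applying this to the national-average CGS (or AGDD, FFF, LSF) series would reproduce the kind of smoothed deviation curves plotted in Fig. 3.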

Trends based on agricultural belts and geographical zones

Crop-specific and climate-region-level statistics were extracted from the spatial information on trends in FFF, LSF, and annual and monthly AGDD and are presented in Tables 1 and 2, respectively. Based on these statistics, we found a lengthening of the CGS for all of the agricultural belts; the highest of these trends were for the spring wheat and cotton belts, while the maize belt experienced the lowest rate of CGS lengthening. These increases were due to delays in FFF and earlier occurrence of LSF, but the more dominant of the two shifts was that in LSF, which shifted at a higher rate than FFF for all of the agricultural belts (Fig. 7). Although crops such as winter wheat and spring wheat have different growing seasons than the rest of the summer crops, frost-free days are still very important attributes of the growing periods of all these crops for growth and development and for the physiological functions of producing grain. Annual AGDD was observed to have positive trends for the spring wheat and cotton belts, but negative trends for maize, soybean, and winter wheat. Heat accumulation during the crop growing season (between the planting and harvesting dates of each crop, presented in Supplementary Table S1) decreased for all crops except spring wheat. This decrease in crop GDD was highest for winter wheat, followed by sorghum. Further, considering the monthly AGDD trends within the crop-specific growing season, it is interesting to note that for both maize and soybean, AGDD shows increasing trends during the early part of the growing season (April, May, June) and decreasing trends for the rest of the season until near harvest in October. These negative trends are greater for the soybean belt than for the maize belt. For cotton and spring wheat, however, the AGDD trends were positive for the entire growing season.
Lastly, the winter wheat belt showed variable trends during its growing season, with dominantly negative trends during the initial part, from September until March, and positive trends thereafter until near physiological maturity in July–August.

Table 1 Trends in agroclimate indices for the major CONUS agricultural belts.

Table 2 Trends in agroclimate indices for the CONUS climate regions.

Figure 7 Trends in first fall frost (FFF), last spring frost (LSF), and climatological growing season (CGS) during 1900–2014 representative of each U.S. agricultural belt.

The climate-region-based analyses revealed that, except for the southeast region, all regions showed positive trends in CGS, positive trends in FFF, and negative trends in LSF. The highest rates of CGS lengthening and LSF advance were observed in the west, while the highest rates of FFF delay were in the northwest. Positive trends in annual AGDD were found in the west, southwest, northwest, northern Rockies and Plains, and northeast, while negative trends were found in the Upper Midwest, southeast, south, and Ohio Valley. Detailed statistics on monthly AGDD temporal trends are listed in Table 2. To provide an overview of the agroclimate climatology, Supplementary Table S2 lists the long-term mean magnitudes of the agroclimate indices for the various U.S. climate regions. Using this information, the trends that have occurred in the agroclimate indices can be related to the long-term mean spatial patterns.

Crop yield-agroclimate relationships

We explored relationships between the inter-annual variability of crop yields and the GDD accumulated during each crop's growing season on a county scale for the CONUS. We regressed crop yield residuals (from yield-versus-time trends) against seasonal GDD to characterize these relationships. Figure 8 presents these functions for each crop for different numbers of site-years. The slopes of these relationships were negative for all crops except cotton (both Pima and Upland varieties), which showed positive slopes. Hence, maize, soybean, sorghum, and wheat (spring and winter), on a pooled national scale, demonstrated reduced yields in higher-GDD site-years, whereas cotton showed increased yields in higher-GDD site-years. However, it has to be recognized that these national-level relationships can mask finer-scale relationships due to data aggregation and the normalization of any potential location-specific trends. These relationships aggregate geographic (sites) and temporal (years) information into a single linear function and hence pool significant variability, potentially causing loss of information, which is demonstrated by high scatter and low coefficients of determination (R2) (<0.01 for all crops, hence not shown on the curves). It is also likely that the nature of these relationships varies among counties in the same region as well as between regions, as a given crop can have different sensitivities and responses to increasing GDD in different regions due to numerous factors, including geographic differences; soil type; crop varieties grown; climate; soil, crop, and water management practices; and nutrient management. This is similar to the differential sensitivities of crop yields to changes in temperature and precipitation demonstrated in the literature6,7. If this is the case, the nationally pooled curves would moderate these opposing effects.
One way to approach this is to conduct a similar exercise on a single county, which fixes the analysis in space while allowing it to vary temporally. To investigate this further, a representative county with the maximum available data records was chosen for each crop, and similar analyses were conducted. Supplementary Figure S2 presents crop yield residuals regressed against seasonal GDD for all crops, but for a single representative county for each crop species (maize: Antelope County, NE; soybean: Lawrence County, IN; sorghum: Montgomery County, KS; cotton-pima: Pinal County, AZ; cotton-upland: Tulare County, CA; winter wheat: Laramie County, WY; spring wheat-durum: Spink County, SD; and spring wheat-non-durum: Flathead County, MT). One striking difference in this analysis is the increased R2 values, which range from 0.06 to 0.33. This signifies that crop yields for individual counties have a more pronounced response to GDD than what is interpreted from the national-level curves. The nature of the relationships remained the same at both county and national scales (negative for all crops except cotton).
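The detrend-then-regress procedure described above can be sketched as follows: yields are first detrended against time, and the residuals are then regressed on seasonal GDD; the slope sign and R2 characterize the relationship. This is a hedged illustration under our reading of the method; the function name and estimator choices are ours, not the study's code.

```python
import numpy as np

def yield_gdd_relationship(years, yields, seasonal_gdd):
    """Slope and R2 of crop-yield residuals (vs. time) regressed on GDD."""
    years = np.asarray(list(years), dtype=float)
    yields = np.asarray(list(yields), dtype=float)
    gdd = np.asarray(list(seasonal_gdd), dtype=float)

    # Residuals from a linear yield-versus-time trend
    trend = np.polyval(np.polyfit(years, yields, 1), years)
    resid = yields - trend

    # Regress the residuals on seasonal GDD
    slope, intercept = np.polyfit(gdd, resid, 1)
    fitted = slope * gdd + intercept
    ss_res = np.sum((resid - fitted) ** 2)
    ss_tot = np.sum((resid - resid.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot if ss_tot > 0 else 0.0
    return slope, r2
```

A negative slope, as found nationally for maize, soybean, sorghum, and wheat, indicates reduced yields in higher-GDD site-years; pooling many counties into one such fit is what drives the reported R2 values below 0.01.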

Figure 8 County-level relationships among yield and growing degree days (GDD) pooled nationally for maize, soybean, sorghum, cotton (pima and upland), winter wheat, and spring wheat (durum and non-durum). Each regression curve includes n number of site-years in the CONUS during 1900–2014.

In a similar manner, we also explored relationships between county-level crop yield residuals and climatological growing season (CGS) length for the available site-years. Figure 9 presents these relationships, pooled for the CONUS under the various crops. Spring wheat and winter wheat were excluded from this exercise because their growing seasons generally include frost/dormant periods, unlike those of maize, soybean, and sorghum; hence it would not be worthwhile, at least for this study, to determine their relationships with the frost-free period/CGS. For all crops, yields responded positively to increasing CGS length. The R2 values, as with GDD, were low (<0.01) and hence are not presented. This was, again, attributed to the spatial and temporal pooling of pairwise data, leading to aggregation of county-specific yield responses to CGS. When randomly selected individual counties (maize: Pawnee County, NE; soybean: Vermillion County, IL; sorghum: Custer County, OK; cotton-pima: Pima County, AZ; and cotton-upland: Kern County, CA) were investigated for these relationships (Supplementary Figure S3), a similar pattern was observed as with GDD. The relationships yielded higher R2 values (up to 0.24), and we even found that maize yields showed a negative response to increasing CGS, in contrast to the inference from the nationally pooled maize yield-CGS relationship, which had a positive response. A very limited number of studies have examined relationships of crop yield vs. agroclimate (GDD and CGS); one study4 reported correlation coefficients of −0.013 and 0.318 for Nebraska maize yield vs. growing season length and Nebraska maize yield vs. GDD, respectively. Their statistics (from Nebraska data) are somewhat comparable to our estimates from the representative county analysis because of the similarity of scales, although they used state-level data as opposed to our county-specific approach.
Moreover, they did not attempt to conduct their analysis on a national scale, which is a knowledge gap our study has filled. This finding further strengthens our argument that the responses of individual counties to changes in agroclimatic variables can vary spatially, in both the nature and the magnitude of sensitivity. Thus, while generalized assessments from national, continental, and global-scale data can provide important inferences for broader-level policy and decision-making, they would not accurately represent individual-county or finer-scale trends and magnitudes in agroclimate vs. crop yield relationships for local policy and decision-making or strategy development.

Figure 9 County-level relationships among yield and climatological growing season (CGS) pooled nationally for maize, soybean, sorghum, and cotton (pima and upland). Each regression curve includes n number of site-years in the CONUS during 1900–2014.

The relatively weaker correlation between crop yields and CGS than between crop yields and GDD, as inferred from Figures S2 and S3, can be due to several reasons. First, in practice, the spatial variation in CGS has led producers to adopt crop hybrids that suit a given site's environment. For example, the relative maturity of a maize hybrid planted in North Dakota may be around 80 days, whereas for a hybrid planted in Texas, it may be up to 125 days. This explains why there is no observed north-south yield trend similar to that of CGS (Fig. 5). This, however, does not necessarily mean that producers have also adapted to the temporal changes in CGS. Although it has been demonstrated that maize planting dates have shifted approximately 2 weeks earlier relative to the early 1980s8, this is not true for all crops and all regions considered in this study. Furthermore, even within a given state, producers may or may not plant different maturity groups of crops as a function of climatic gradients. Hence, crop yields might be only slightly affected by variability in CGS in regions that have not adopted suitable varieties, as shown in Fig. 9. This low magnitude of crop response to CGS is expected because the crop growing season tends to be narrower than the actual CGS in a region, so CGS variability does not strongly affect crop yields. Even when crop failure occurs due to an early or late frost, producers usually opt for replanting and hence may still obtain reasonable yields. GDD shows a stronger relationship with crop yields because it represents the actual daily growing conditions of the crop, while CGS merely represents a window around the crop growing season and hence affects crop yields only in extreme scenarios (an early or delayed frost).