Weeds are a fact of life for farmers around the world, and they influence many farming decisions either directly or indirectly. If left uncontrolled, weeds could reduce world food production by as much as 20–40% (ref. 20). To control weeds and protect marketable yield, farmers have increasingly turned to herbicides. When viewed in isolation, the increase in herbicide reliance is troubling. Use of glyphosate herbicide in particular has received increased scrutiny due to its association with the most dominant genetically engineered (GE) crop trait. A dramatic increase in glyphosate use21 has justifiably generated concern among scientists, policy-makers and the general public. As this analysis shows, however, the increased use of herbicides may not be inherently bad, as sometimes these changes corresponded with lower toxicity. This analysis provides only a small component of the potential impacts related to herbicide use, and does not account for risks to the environment (or any potential benefits).

A variety of risk assessment methods can be used to compare herbicides, including the ‘risk cup’ method used by the US Environmental Protection Agency and other regulatory bodies. Regulatory agencies typically consider a wide variety of environmental and human health endpoints in their risk analysis processes. Risk analysis is complex even when considering only a single active ingredient, since multiple endpoints must be weighed (applicator health, aquatic organisms, birds, insects, etc.). Risk analysis becomes far more complex when looking at multiple herbicides used across multiple crops. The results of a full environmental analysis are likely to be similarly mixed, since the soil persistence, leaching potential, and wildlife toxicity of these 118 herbicides differ at least as much as their mammalian toxicity does.

To fully understand the impacts of herbicide use changes, meaningful metrics that represent actual risk must be used. Previous analyses have attempted to quantify the environmental and health impacts of herbicide use over time, especially as it relates to adoption of GE herbicide-resistant crops. Unfortunately, many of those efforts relied on fundamentally flawed metrics. In particular, the summed weight of herbicides applied with no regard for their relative toxicity is uninformative at best and misleading at worst7. Simply counting the kg applied is insufficient. Herbicide use rates range from grams to kilograms per hectare and depend on many factors, including the effectiveness of the active ingredient and the environment where it is applied. A large increase in the weight of herbicide applied could simply be due to a switch from a herbicide that is active at low doses to a less bioactive herbicide. Likewise, a reduction in the total weight of herbicide applied may not actually indicate reduced herbicide use, as a single herbicide may be replaced by several herbicides with lower use rates that collectively pose substantially greater risk to applicators and the environment.

This analysis corrects this deficiency of previous work by using area-treatments as a more informative indicator of herbicide use intensity. An upward trend in herbicide area-treatments was observed in all six crops that were analyzed, although the upward trend was preceded by a downward trend in soybean. This result is consistent with the ‘herbicide treadmill’ criticism suggesting that US crop production has become increasingly dependent on herbicides for weed control. No causal relationships can be determined from these data, however, and there are many factors that may have driven increased herbicide use over time. Use of tillage in the US has steadily decreased in most crops since 1996, though the rate of tillage reduction depends on the crop and growing region22. Whether or not tillage is used explicitly for weed control, most tillage operations provide weed control benefits such as killing emerged seedlings and burying weed seed. When tillage is reduced, farmers become more reliant on other weed control practices, including herbicides. At least some of the widespread increase in herbicide use is certainly attributable to adoption of conservation tillage practices. It is important, then, to weigh concern over increased herbicide use against the benefits that may have accrued alongside it.
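To make the contrast between the two metrics concrete, the sketch below compares a hypothetical low-dose program and a hypothetical glyphosate program. It assumes area-treatments can be approximated as kilograms applied divided by a typical single-application rate; the exact derivation from USDA survey data may differ, and all rates shown are illustrative rather than values from the data set.

```python
# Minimal sketch (hypothetical numbers): why summed weight misleads,
# and how area-treatments normalize for per-application rate.
# Assumed definition: area_treatments = kg applied / typical
# single-application rate (kg/ha); the survey derivation may differ.

herbicides = {
    # name: (total kg applied across a field, typical rate in kg/ha)
    "low-dose sulfonylurea": (2.0, 0.02),   # 100 ha treated once
    "glyphosate":            (100.0, 1.0),  # 100 ha treated once
}

for name, (kg_applied, rate) in herbicides.items():
    area_treatments = kg_applied / rate  # hectare-applications
    print(f"{name}: {kg_applied} kg applied -> "
          f"{area_treatments:.0f} area-treatments")

# Both programs treat 100 ha exactly once (100 area-treatments each),
# yet the glyphosate program applies 50x more herbicide by weight.
```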

Although no new major herbicide sites of action have been discovered in the last 25 years23, many new herbicide products have entered the market. Many of these new products contain multiple active ingredients. Increased marketing and use of these multi-ingredient products may have contributed to increased herbicide area-treatments, though this data set did not include commercial formulation information, so it is unclear whether this was the case.

Some researchers have blamed glyphosate-resistant crops and the resulting evolution of glyphosate-resistant weeds for increasing herbicide use in maize, soybean, and cotton2,6. While this explanation is plausible for these three glyphosate-resistant crops, it cannot explain the similar trends of increasing herbicide intensity in rice and wheat, since no glyphosate-resistant cultivars are commercially available for those crops. In fact, herbicide area-treatments increased at a faster rate in rice and wheat than in the glyphosate-resistant crops, so the claim that glyphosate-resistant crops are the primary driver of increasing herbicide use is at odds with the empirical data. The broader problem of herbicide-resistant weeds (rather than the artificially narrow focus on glyphosate) may well have played a role in increasing herbicide use for all of the crops in this analysis. The most likely explanation, though, is a combination of inter-related factors far more complex than any single driver.

The environmental impact quotient (EIQ) commonly used in previous analyses of herbicide use over time suffers from severe methodological flaws12,24 that are even more pronounced when comparing herbicides13. The hazard quotient approach used here, while certainly not perfect, is a far more defensible metric with which to compare herbicide toxicity and the relative impacts of herbicide use changes, albeit for a small subset of potential toxicity endpoints. The hazard quotient, as applied here, does not take into account potential interactions between multiple herbicides. As an increasing number of herbicides are applied per hectare, the risk of negative interactions necessarily increases, although there is little evidence to date that negative interaction effects are of major concern to applicators or to the environment.
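As a rough illustration of the hazard quotient logic, the sketch below computes screening-level acute and chronic quotients for a hypothetical two-herbicide program. It assumes exposure scales with application rate and that quotients are summed across active ingredients; the scaling used in this analysis may differ, and every number in the example is a placeholder rather than a value from the data set.

```python
# Minimal sketch of a screening-level hazard quotient: application
# rate divided by a toxicity endpoint (LD50 for acute, NOEL for
# chronic), summed across active ingredients. All numbers are
# hypothetical; the paper's exact rate-to-dose scaling may differ.

def hazard_quotient(rate_g_ha: float, endpoint_mg_kg: float) -> float:
    """Lower toxicity endpoint => higher hazard quotient."""
    return rate_g_ha / endpoint_mg_kg

program = [
    # (name, rate g/ha, acute LD50 mg/kg, chronic NOEL mg/kg/day)
    ("herbicide A", 840.0, 5000.0, 400.0),
    ("herbicide B", 20.0, 500.0, 5.0),
]

acute = sum(hazard_quotient(r, ld50) for _, r, ld50, _ in program)
chronic = sum(hazard_quotient(r, noel) for _, r, _, noel in program)
print(f"acute HQ: {acute:.2f}, chronic HQ: {chronic:.2f}")
# Herbicide B is applied at 1/42 the rate of A, yet it dominates the
# chronic HQ (4.0 vs 2.1) because its NOEL is 80x lower: weight-based
# metrics would rank these two herbicides exactly backwards.
```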

This analysis was limited to mammalian toxicity, and therefore is most relevant to chronic and acute risks faced by pesticide applicators and, to a much lesser extent, consumers. This analysis should not be extrapolated to draw conclusions about non-mammalian systems, and should be interpreted with caution even for human health risks. However, Peterson25 demonstrated that lower-tier risk assessment approaches such as the hazard quotient used here are indeed correlated with more in-depth risk analyses, and therefore the hazard quotient results are likely representative of actual risk.

Acute herbicide toxicity is relatively simple to quantify and interpret, since the endpoint of interest in acute toxicity testing is mortality. To put it bluntly, it is simple to determine whether a rat is dead or alive. The herbicide dose resulting in death of 50% of test animals (LD50) is a standard measure of acute toxicity, and is required as part of a standard set of pesticide safety studies to obtain regulatory approval. Chronic toxicity is more difficult to quantify and standardize, since the endpoint of interest can vary widely; liver deformations, cancers, reduced body weight, or any other departure from a healthy test population can indicate chronic toxicity issues. Chronic studies also have greater variation in study design, test species, duration, and endpoints measured, adding to the complexity. When making pesticide registration decisions, regulators evaluate a variety of chronic studies conducted on a variety of test organisms in an attempt to determine the most relevant endpoints and to set residue tolerances, acceptable use rates, and acceptable daily intakes. All of this makes direct comparisons of chronic toxicity between herbicides somewhat difficult.

Of the chronic toxicity data that are readily available for herbicides, the no observable effect level (NOEL) from 24-month chronic rat studies is the most consistent, and was therefore chosen to compare the chronic toxicity of herbicides in this analysis. This choice has the benefit of allowing an ‘apples to apples’ comparison of various herbicide active ingredients, since the chronic studies were conducted on the same test species for the same amount of time. However, rat NOEL values do not necessarily relate directly to human health risk. For some chronic effects, the rat is not an ideal test model for humans, and rabbit or dog studies may provide results more relevant to applicator health risks. Selecting different test species for different herbicides would be a potential source of bias in this analysis, so the same test organism (rat) was used for all active ingredients.

Of particular note in these results is that the acute toxicity hazard (which is commonly cited by proponents of GE technology) did not always follow the same trend as the chronic toxicity hazard. For example, the acute toxicity hazard decreased in cotton while the chronic toxicity hazard increased. Overall, acute mammalian toxicity of herbicides used in the US has decreased over the last 20 to 25 years for four of the six crops, while chronic toxicity has decreased for two of the six crops. It is important to note that the Mann-Kendall statistical test in Figs 3 and 4 only evaluates monotonic trends over the entire 25-year period. In some cases, more recent trends may be important even where the overall trend is non-significant (e.g. chronic toxicity in spring wheat, Fig. 3), or may even be reversed compared with long-term trends (e.g. acute toxicity in cotton, Fig. 4). The largest decreases in both hazard quotients were a result of the discontinuation of several products with relatively high toxicity, including alachlor, cyanazine, and molinate. In this regard, the EPA's decisions to discontinue these products appear to have had a beneficial effect on applicator health risks.
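For readers unfamiliar with the test, the sketch below is a minimal Mann-Kendall implementation (omitting the tie correction a production implementation would include). It illustrates the caveat above: because the statistic measures only monotonic association with time, a series that declines and then rebounds can yield an S statistic near zero even though both sub-trends are strong.

```python
# Minimal Mann-Kendall trend test (normal approximation, no tie
# correction). S counts concordant minus discordant pairs over time;
# a decline followed by a rebound largely cancels out.
import math

def mann_kendall(x):
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = 0.0 if s == 0 else (s - math.copysign(1, s)) / math.sqrt(var_s)
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return s, z, p

rising = [1, 2, 2.5, 4, 5, 5.5, 7, 8]   # monotonic: S = 28, p < 0.001
u_shape = [8, 6, 4, 2, 1, 2, 4, 6]      # reversal: S = -7, p ~ 0.46
print(mann_kendall(rising))
print(mann_kendall(u_shape))
```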

Because adoption of GE herbicide-resistant crops was so rapid and so widespread, the temporal component confounds the ability to define causal relationships between adoption of GE crops and the herbicide use trends described here. Brookes and Barfoot10 convincingly explain that extrapolating recent non-GE herbicide usage to represent what all non-GE crop growers would be doing in the absence of GE technology is problematic for several reasons. The minority of growers not using GE technology today are probably not representative of all growers, and therefore their pesticide use is almost certainly not an accurate basis for comparing overall pesticide use between GE and conventional crops. For example, farmers might forgo glyphosate-resistant crops because weed densities on their farms are relatively low or because they are not managing herbicide-resistant weeds. Herbicide use is likely to be lower for these non-adopters regardless of which technology they use for weed control, so such comparisons would likely be biased toward higher apparent herbicide use in GE crops.

Increased use of glyphosate was an obvious result of US farmers adopting glyphosate-resistant maize, soybean, and cotton. Increased glyphosate use has spurred debate about the safety of glyphosate, with the World Health Organization International Agency for Research on Cancer (IARC) declaring that glyphosate is ‘probably carcinogenic to humans’26 while the US Environmental Protection Agency (EPA) recently concluded that glyphosate is ‘not likely to be carcinogenic to humans at doses relevant to human health risk assessment’27. Neither the IARC nor the EPA analysis assesses whether glyphosate use is better or worse than the herbicides (or other weed control strategies) that would be used in its place.

Although USDA data do not allow direct comparison between herbicide use in glyphosate-resistant versus conventional varieties, some general conclusions can be drawn in this regard. Glyphosate has an approximate acute LD50 of 5,037 mg kg−1, with some variation depending on which salt is applied. This makes glyphosate less acutely toxic than 94% of the herbicides in this data set. Although glyphosate is considered a relatively safe herbicide with respect to acute toxicity, it is not an outlier in this regard. The median acute LD50 for herbicides in this analysis was 3,556 mg kg−1, and only five herbicides had an acute LD50 of less than 500 mg kg−1, placing them in EPA’s toxicity Category II (Fig. 2). Therefore, the contribution of glyphosate to acute toxicity was nearly the same as its contribution to herbicide use as measured by area-treatments; that is, if glyphosate made up 20% of area-treatments, it typically contributed just under 20% of the acute hazard quotient.

Chronic toxicity was a different story, however. Glyphosate has a lower chronic toxicity than 90% of all herbicides in this analysis, and it falls much further from the median chronic toxicity value than it does from the median acute toxicity value (Fig. 2). In the last year of survey data for each crop, glyphosate made up 26% of maize, 43% of soybean, and 45% of cotton area-treatments, but contributed only 0.1%, 0.3% and 3.5% of the total chronic hazard quotients in those crops, respectively. So although the chronic hazard quotient increased in two of the three glyphosate-resistant crops, if glyphosate were not used the chronic hazard quotient would almost certainly be even greater, since other herbicides with greater chronic toxicity would have been used instead. Similarly, if glyphosate use were discontinued (as was recently proposed in the EU), the resulting displacement of glyphosate by other herbicides is likely to have a negative impact on the chronic health risks faced by pesticide applicators28.
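A back-of-envelope calculation shows why glyphosate's share of the chronic hazard quotient can be so much smaller than its share of area-treatments. The glyphosate and median LD50 values below come from the text; the NOEL values and application rates are hypothetical placeholders chosen only to illustrate the arithmetic.

```python
# Sketch: glyphosate's share of acute vs chronic hazard quotients in
# a two-herbicide program with equal area-treatments. LD50 values are
# from the text; rates and NOEL values are hypothetical placeholders.

glyphosate = {"rate_g_ha": 1000.0, "ld50": 5037.0, "noel": 400.0}  # NOEL assumed
other      = {"rate_g_ha": 1000.0, "ld50": 3556.0, "noel": 10.0}   # NOEL assumed

def glyphosate_share(endpoint: str) -> float:
    g = glyphosate["rate_g_ha"] / glyphosate[endpoint]
    o = other["rate_g_ha"] / other[endpoint]
    return g / (g + o)

print(f"share of acute HQ:   {glyphosate_share('ld50'):.1%}")  # ~41%
print(f"share of chronic HQ: {glyphosate_share('noel'):.1%}")  # ~2.4%
# With 50% of area-treatments, glyphosate contributes a bit under half
# of the acute HQ but only a small slice of the chronic HQ, mirroring
# the pattern described above: its acute LD50 is close to the median,
# while its chronic NOEL sits far from the rest of the distribution.
```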