Every year, the Centers for Disease Control and Prevention issues an estimate of the number of people nonfatally injured by guns, but even by the agency’s own standards, its recent numbers may not be trustworthy. This year’s estimate is less reliable than ever.

According to the CDC’s most recent figures, somewhere between 31,000 and 236,000 people were injured by guns in 2017. That range, which represents the confidence interval — the high and low ends of a range of estimates that probably contains the real number, whatever that number is — is almost four times wider than the one given in the agency’s 2001 estimate.

“When I looked at the 2017 numbers, I went, ‘Oh, my god,’” said David Hemenway, the director of the Harvard Injury Control Research Center. “You just can’t use those numbers.”

The CDC acknowledges its estimates are unreliable, but as it’s the nation’s premier public health agency, its figures are still widely used by researchers, journalists and the general public. That the latest numbers have become even more uncertain suggests that the CDC can’t be counted on to accurately estimate the number of gun injuries in the U.S. right now.

The agency’s 2016 and 2017 estimates are flagged with a note cautioning that the figures are “unstable and potentially unreliable.” The coefficient of variation, a measure of an estimate’s uncertainty where higher values indicate larger potential errors, rose from 30.6 percent in 2016 to 39.1 percent in the most recent estimate; for comparison, it was 22.1 percent in 2001, the earliest full year available.
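The coefficient of variation is the estimate’s standard error expressed as a share of the estimate itself, and it is what drives the width of the confidence interval quoted above. A rough back-of-the-envelope sketch, using the article’s own 2017 figures (the CDC’s full methodology involves complex survey weighting not reproduced here):

```python
# Rough sketch: how a coefficient of variation (CV) translates into a
# confidence interval, using the 2017 figures cited in the article.
# This is the textbook definition only, not the CDC's actual procedure.

estimate = 134_000   # CDC's 2017 nonfatal gun injury estimate (approx.)
cv = 0.391           # coefficient of variation: 39.1 percent

standard_error = cv * estimate       # CV = standard error / estimate
margin = 1.96 * standard_error       # ~95 percent confidence level
low, high = estimate - margin, estimate + margin

print(round(low), round(high))
```

Plugging in the numbers gives an interval of roughly 31,000 to 237,000 — close to the 31,000-to-236,000 range the CDC reported, which suggests the published interval reflects a standard error of around 52,000 injuries.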

“We accept that there are some unstable estimates,” CDC spokesperson Courtney Lenard said in an email. “CDC continues to look into various ways to strengthen the estimates for nonfatal firearm injuries.”

Last year, FiveThirtyEight and The Trace, a nonprofit news organization covering gun violence in America, reported that the rising trend in the number of nonfatal gunshot wounds in the CDC’s estimates was out of step with trends reported by other public health and criminal justice databases, which found flat or declining numbers of these injuries. The CDC’s most recent estimate — nearly 134,000 injuries — suggests that the upward trend in its data is accelerating, with injuries jumping by more than 57 percent between 2015 and 2017.

But that number is suspect, in part because the agency sources its data from a small number of hospitals: just 60 in 2017, according to data obtained in a public records request by The Trace and FiveThirtyEight. Drawing data from such a small pool means that a single hospital that treats a disproportionate number of gun injuries has the potential to drastically skew the entire estimate. In contrast, the Healthcare Cost and Utilization Project (HCUP), another database under the Department of Health and Human Services, uses data from more than 950 hospitals to create its own gun injury estimate, which contains much less uncertainty than the CDC’s. Unlike the CDC, HCUP’s website prevents users from accessing any estimate with a coefficient of variation greater than 30 percent — a threshold the CDC’s 2016 and 2017 estimates would fail.

Despite the issues with the CDC’s data, many academics have cited it in their work. Our previous reporting identified at least 50 research papers that have cited the CDC’s gun injury numbers since 2010.

“I would not cite these estimates,” Guohua Li, editor-in-chief of the medical journal Injury Epidemiology and director of Columbia University’s Center for Injury Epidemiology and Prevention, told The Trace and FiveThirtyEight. “As an editor, I would not publish any manuscript that is based on these estimates.”

Li says that the CDC could reduce the uncertainty of its estimates by incorporating data from a much larger and more reliable source like HCUP. “If they want to fix it, I think it is definitely doable,” Li said.

Lenard, the CDC spokesperson, said that HCUP’s data sets have their own limitations and that making any changes to the database that underlies the CDC’s estimates would depend on congressional funding.

Li doesn’t believe that the limitations of HCUP’s data are significant enough to keep the CDC from using it.

“The data quality has become more important than ever, so they should really pay immediate attention to this issue and get it improved,” Li said.