Earlier this week, FiveThirtyEight collaborated with The Trace to report that the CDC is publishing unreliable data on nonfatal firearms injuries.
Fatal firearm injury data from the CDC is based on death certificates. Nonfatal injury data comes from a survey – the National Electronic Injury Surveillance System (NEISS) – conducted by the Consumer Product Safety Commission. The survey is small: only 100 of the nation's 5,534 registered hospitals are surveyed, and just 66 provide the relevant data. That is less than 2 percent of all hospitals in the country. The results are harrowing.
“The agency’s most recent figures include a worrying uptick: Between 2015 and 2016, the number of Americans nonfatally injured by a firearm jumped by 37 percent, rising from about 85,000 to more than 116,000. It was the largest single-year increase recorded in more than 15 years.”
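A quick sanity check on those figures, using only the numbers quoted above (a rough sketch; the rounded estimates give roughly 36 percent, and the article's 37 percent presumably comes from the unrounded figures):

```python
# Back-of-the-envelope check on the figures quoted above.
surveyed = 100        # hospitals in the CPSC survey
reporting = 66        # hospitals providing the relevant data
registered = 5_534    # registered hospitals nationwide

print(f"Surveyed:  {surveyed / registered:.1%} of hospitals")   # ~1.8%
print(f"Reporting: {reporting / registered:.1%} of hospitals")  # ~1.2%

est_2015, est_2016 = 85_000, 116_000   # the article's rounded estimates
jump = (est_2016 - est_2015) / est_2015
print(f"Year-over-year jump: {jump:.0%}")  # ~36% with rounded inputs
```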
This trend is at odds with four other estimates of nonfatal firearm injuries based on hospitalization and crime data. In other words, the CDC's nonfatal injury data is completely unreliable – as the CDC itself admits. Noted anti-gun researcher David Hemenway is quoted as saying, “No one should trust the CDC’s nonfatal firearm injury point estimates.” The analysis comes too late for the 50 or more academic papers published since 2010 that have relied on CDC estimates of nonfatal firearm injuries.
The Consumer Product Safety Commission responded to the authors’ questions by claiming, “Although visually, the [CDC] estimates for firearm-related assaults appear to be increasing from 2015 to 2016, there is not a statistically significant difference between the estimates.” There is so much variance in the data that the 95 percent confidence interval for the 2016 estimate of nonfatal firearm injuries runs from 46,524 to 186,304. As Hemenway put it, “Basically, the confidence intervals are enormous. So you have no idea about trends.”
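To see how little that interval pins down, here is a minimal sketch using only the 2016 point estimate and interval quoted above (the article does not give a comparable interval for 2015):

```python
# How much does the reported 95% confidence interval actually constrain?
point_2016 = 116_000
lo, hi = 46_524, 186_304          # reported 95% CI for 2016

half_width = (hi - lo) / 2
print(f"Margin of error: about ±{half_width:,.0f} "
      f"(±{half_width / point_2016:.0%} of the point estimate)")
# -> about ±69,890, or roughly ±60%

# The 2015 point estimate falls well inside the 2016 interval, which is
# consistent with the statement that the jump between the two years is
# not statistically significant.
point_2015 = 85_000
print(lo <= point_2015 <= hi)     # True
```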
A survey this small is prone to exactly these problems. With so few hospitals reporting, variation in which hospitals participate can swing the estimate sharply in either direction: hospitals near high-crime neighborhoods in Baltimore or Chicago likely see far more nonfatal gunshot injuries than hospitals in rural Vermont. Because the national estimate is extrapolated from whichever hospitals happen to participate, it is easily skewed.
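A toy simulation illustrates the mechanism. Every parameter below is hypothetical, invented for illustration rather than drawn from the survey's actual design; the point is only that a national total extrapolated from 66 hospitals swings wildly depending on how many high-volume trauma centers happen to land in the sample:

```python
import random

random.seed(0)

# Hypothetical hospital population: most hospitals see a handful of
# nonfatal gunshot injuries per year, while a small share of urban
# trauma centers see hundreds. These numbers are illustrative only.
N_HOSPITALS = 5_534
n_high = int(N_HOSPITALS * 0.05)     # high-volume trauma centers
n_low = N_HOSPITALS - n_high         # everyone else
counts = [max(0.0, random.gauss(5, 2)) for _ in range(n_low)] + \
         [max(0.0, random.gauss(400, 100)) for _ in range(n_high)]
true_total = sum(counts)

# Repeatedly draw 66-hospital samples and scale each one up to a
# national estimate, as a small survey must.
estimates = sorted(
    sum(random.sample(counts, 66)) / 66 * N_HOSPITALS
    for _ in range(1_000)
)

print(f"True total: {true_total:,.0f}")
print(f"Middle 95% of estimates: "
      f"{estimates[25]:,.0f} to {estimates[974]:,.0f}")
```

Even though the underlying total never changes, the simulated national estimates swing wildly from sample to sample, purely because of which hospitals happen to be drawn.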
While The Trace should be commended for this analysis, the authors could not pass up the opportunity to complain about the lack of federal funding for “gun violence research.” They note that the Dean of the Boston University School of Public Health believes we have “lost a generation of firearms research” and cite several articles bemoaning the state of “gun violence research.”
We’d like to point out that thousands of studies related to guns, crime, and violence have been conducted in the last 20 years. Not all of that research passes methodological muster, and some is clearly biased as well as seriously flawed. Moreover, anti-gun researchers have acknowledged in Science magazine that federal funding is not an obstacle to such research, and the federal government spent more than $11 million in grants funding gun violence research between 2014 and 2017.
This analysis raises important questions. Perhaps chief among them: why is the Consumer Product Safety Commission running this survey for the CDC when other sources already gather similar data? Hospitals keep careful records of their patients, and we’d wager they have precise counts of gunshot injuries. In many states, hospitals are also required to report gunshot injuries to law enforcement agencies, so this specific type of injury is already being recorded elsewhere. We just have to hope that efforts to collect better data will be put to good use, and not used in convoluted attempts to undermine our Constitutional rights.