A provocative study that linked police killings of unarmed black people with health problems in black infants has been retracted due to problems with the data used in the analysis.

A preliminary effort to repeat the analysis with corrected data found no significant link between police violence and infants’ chances of being born early or having a lower birth weight, said author Joscha Legewie, a sociologist at Harvard University.

Legewie said he was alerted to the problem by a reader who noticed mistakes in the way some of the killings had been classified.

“Seeing that is basically the nightmare of every scientist, so I immediately started digging into that,” he said.

The study, published this month in the journal Science Advances, combined California birth records and information from the Fatal Encounters database, a journalist-led project that uses public records to document fatalities involving law enforcement officers. It concluded that black women who lived near the site of a police fatality involving an unarmed black person during their first or second trimester of pregnancy were more likely to give birth earlier and have smaller babies compared with black women who did not live near such an incident.

The reader who contacted Legewie had found multiple cases in which people who were armed had been categorized as unarmed.

The message came a day after the study was posted online, and it prompted Legewie to request a retraction.

Editors at Science Advances issued the retraction on Thursday.

“I apologize that errors were not discovered before publication,” Legewie said in a series of tweets. “I am grateful that someone found the classification errors allowing me to investigate the issue and correct it quickly.”

https://twitter.com/jlegewie/status/1204973768004816901

Legewie had used the police violence data from Fatal Encounters to create his own database. He said he cleaned up and coded the entries to distinguish between cases involving armed and unarmed people.

After he was prompted to reevaluate his data, he broadened his search and discovered more misclassification errors. They were the result of two issues, he said: mistakes carried over from the original Fatal Encounters data — which he said he should have caught — and a bug in his own code.

The Fatal Encounters database has been a great resource, and its administrators are quick to fix any errors that are pointed out, Legewie said.

Once he repeated his analysis with the corrected data, the findings did not match those of his original study. The effect size was cut roughly in half and did not appear to be statistically significant, he said.

Those calculations were based on the subset of incidents that involved deaths of black people. Legewie said he planned to check all of the data, repeat the analysis and publish the revised findings — an effort that would take several weeks at least, if not months.

“Science is a self-correcting process; part of the purpose of placing research into the scholarly record is so that other scientists can attempt to replicate, confirm, or refute it,” Meagan Phelan, spokeswoman for the Science family of journals, which includes Science Advances, said in a statement. “This is the way in which science advances. This retraction is a clear example of that process in action.”

Jonathan Katz, a Caltech professor and the Science Advances deputy editor who handled the study manuscript, said in a tweet that he appreciated “how professionally and quickly” Legewie handled the retraction, calling the process “a triumph of research transparency.”

https://twitter.com/Jonathan_N_Katz/status/1205263568452505600

This was the journal’s second retraction since its inception in 2015; the first one, published in April 2016, came seven months after the original study’s publication.

Dr. Ivan Oransky, co-founder of the website Retraction Watch, called the incident a “lightning-fast turnaround,” given that retractions can sometimes take months or even years to finalize. During that time, the study may continue to be cited by other researchers who aren’t aware that the findings have been questioned.

Though still relatively rare, retractions are becoming more common across the scientific literature, Oransky said. The number has shot up from about 40 in 2000 to more than 1,400 so far this year — an increase that has vastly outpaced the growth of scientific journals.

Part of the reason for that jump may simply be that there are more people who can scrutinize published scientific results, he said: “That’s making a big difference.”

Alyasah Sewell, a sociologist at Emory University who was not involved with Legewie’s study, said the retraction should have no bearing on other work showing that police violence can have consequences that ripple through communities.

“I think I would be careful about dismissing this body of research just because one paper was done incorrectly,” Sewell said.