Judges are supposed to operate without bias and without letting their emotions influence how they make decisions. They are also human. And the idea that emotions in one realm shape decisions in another is not new. The stock market does better when the sun is shining, for instance. But the stakes, particularly for young people of color, are high.

The authors looked specifically at first-time offenders between the ages of 10 and 17 who were convicted of a single offense under one statute, like drug use or robbery, to “circumvent any potential confounding effects.” They excluded first- and second-degree murder and aggravated rape because those offenses carry mandatory sentences in Louisiana, and ultimately looked at about 8,200 records involving 207 judges.

Mocan and Eren found that the behavior of the children in court wasn’t a factor in sentencing. Economic background didn’t seem to play a role either. Cases in Louisiana juvenile court are randomly assigned by computer, so judge selection wasn’t an issue. And a placebo test showed that games not involving LSU had no effect on sentencing.

The research is obviously limited in scope, and the authors looked at a state where football culture runs deep. It’s unclear whether judges in, say, California, would hand down longer sentences after a University of Southern California loss. Jeffrey Butts, the director of the Research and Evaluation Center at the John Jay College of Criminal Justice in New York, said the study seemed like “academic clickbait.” What are judges supposed to do, he asked rhetorically, not handle cases in the week following each unexpected loss?

Butts is open to good data analysis, he said, and appreciates transparency, but he has concerns about what he sees as a movement toward using large data sets for things like predictive policing, in which police use math and data analysis to pinpoint potential criminal activity. That may be acceptable as long as it’s one tool among many, he said, but data shouldn’t drive the entire justice system.

Where some might argue that relying on data would eliminate human bias, Butts worries it would reinforce and hide bias. Consider, he said, a 16-year-old drug user who lives in a neighborhood where everyone has a car and a rec room or a basement where neighborhood kids gather to smoke or shoot up or whatever. Those kids are going from private space to private space, so the chances of being seen by a cop and arrested are low.

Now consider a 16-year-old drug user who lives in a two-room apartment where no one has a car. He and his friends wind up taking a bus or walking to a local park or alleyway, where the chances of being arrested are high. That second kid might get picked up more often, which might mean increasingly tough sentences. A human might be more aware of the context in which the kids are committing their crimes, Butts said, while an algorithm might not be.