Algorithms like COMPAS simply mimic the data with which we train them. COMPAS’s authors presumably fed historical recidivism data into their system. From that, the program ascertained which factors tend to make a defendant a higher risk. It then applied the patterns it gleaned to subsequent defendants, like Mr. Loomis, spitting out risk scores that comport with existing trends.
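To make that concrete, here is a deliberately toy sketch of the pattern-matching at work — invented data and a made-up `risk_score` function, not COMPAS’s actual (proprietary) method. The past is tabulated, and new defendants are simply scored by how similar defendants fared before:

```python
# Illustrative only: hypothetical historical records of
# (number of prior arrests, whether the person reoffended).
history = [(0, False), (1, False), (2, True), (3, True),
           (0, False), (4, True), (1, True), (2, False)]

def base_rate(records, min_priors):
    """Observed reoffense rate among defendants with >= min_priors arrests."""
    matches = [reoffended for priors, reoffended in records
               if priors >= min_priors]
    return sum(matches) / len(matches)

def risk_score(priors):
    """Score a new defendant by the historical rate among similar defendants."""
    return base_rate(history, priors)

# Patterns gleaned from the past are replayed, unchanged, onto new defendants.
print(risk_score(3))  # the rate among defendants with 3+ priors
```

Whatever trends the historical record contains — including unjust ones — the score faithfully reproduces.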

But an algorithm that accurately reflects our world also necessarily reflects our biases. A ProPublica study found that COMPAS predicts black defendants will have higher rates of recidivism than they actually do, while predicting white defendants will have lower rates than they actually do. (Northpointe Inc., the company that produces the algorithm, disputes this analysis.) The computer is worse than the human. It is not simply parroting back to us our own biases; it is exacerbating them.
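The kind of audit ProPublica performed can be sketched in a few lines: compare error rates across groups. The two groups and their labels below are invented for illustration, but they show how a tool can look even-handed while one group absorbs far more false alarms:

```python
# Hypothetical audit data: each row records the algorithm's label
# ("high_risk") and what actually happened ("reoffended").
def false_positive_rate(rows):
    """Share of non-reoffenders wrongly labeled high risk."""
    non_reoffenders = [r for r in rows if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

group_a = [
    {"high_risk": True,  "reoffended": False},
    {"high_risk": True,  "reoffended": False},
    {"high_risk": False, "reoffended": False},
    {"high_risk": True,  "reoffended": True},
]
group_b = [
    {"high_risk": True,  "reoffended": False},
    {"high_risk": False, "reoffended": False},
    {"high_risk": False, "reoffended": False},
    {"high_risk": True,  "reoffended": True},
]

# Same number of defendants, same number of true reoffenders —
# but one group is wrongly flagged at twice the rate of the other.
print(false_positive_rate(group_a))
print(false_positive_rate(group_b))
```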

Even if you think Mr. Loomis’s sentencing procedure arrived at the appropriate result, the possibility that the process the state used to get there was biased — in ways neither judges nor defendants nor prosecutors know — should alarm anyone.

Machine learning algorithms often operate on a feedback loop. If they are not continually retrained, they “lean in” to the assumed correctness of their initial determinations, drifting ever further from both reality and fairness. During my time as a software engineer in Silicon Valley, I saw this happen time and again: Google’s image classification algorithms mistakenly labeling black people as gorillas, or Microsoft’s Twitter bot almost immediately becoming a “racist jerk.”
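A minimal simulation shows how that drift works. Assume (hypothetically) a model whose belief is refreshed each round from its own hard labels rather than from fresh ground truth; the function name and the blending weight are mine, chosen purely for illustration:

```python
# A toy feedback loop: each round, the model "retrains" mostly on its
# own prediction instead of on new real-world data.
def retrain_on_own_output(belief, rounds, weight=0.9):
    """Repeatedly blend the belief with the model's own hard label."""
    for _ in range(rounds):
        prediction = 1.0 if belief >= 0.5 else 0.0  # its own confident guess
        belief = weight * prediction + (1 - weight) * belief
    return belief

# Whatever side of the line the initial guess falls on, the loop
# amplifies it toward certainty — regardless of the true rate.
print(retrain_on_own_output(0.5, 5))  # races toward 1.0
print(retrain_on_own_output(0.4, 5))  # races toward 0.0
```

A starting estimate of 0.5 and one of 0.4 — nearly identical beliefs — end up at opposite extremes after only a few rounds.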

Algorithms also lack the human ability to individualize. A computer cannot look a defendant in the eye, account for a troubled childhood or disability, and recommend a rehabilitative sentence. This is precisely the argument against mandatory minimum sentences — they rob judges of the discretion to deliver individualized justice — and it is equally cogent against machine sentencing.

For example, algorithms are often programmed to assume unidirectional causation: if A, then B. But is it truly that defendants with higher rates of recidivism warrant longer sentences, or is it that defendants with longer sentences are kept out of their communities, unemployed and away from their families longer, naturally increasing their recidivism risk? A judge could investigate this nuance.
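An algorithm cannot. The confound can be demonstrated with a simulation — entirely hypothetical numbers and world models of my own construction — in which two opposite causal stories generate the same observed pattern, leaving a pattern-matcher with no way to tell them apart:

```python
import random

def world_risk_drives_sentence(n, rng):
    """World 1: riskier defendants receive longer sentences."""
    rows = []
    for _ in range(n):
        risk = rng.random()
        sentence = 2 + 8 * risk                 # judges respond to risk
        rows.append((sentence, rng.random() < risk))
    return rows

def world_sentence_drives_risk(n, rng):
    """World 2: longer sentences themselves raise recidivism risk."""
    rows = []
    for _ in range(n):
        sentence = 2 + 8 * rng.random()
        risk = (sentence - 2) / 8               # disruption grows with time served
        rows.append((sentence, rng.random() < risk))
    return rows

def reoffense_rate(rows, long_sentence):
    """Reoffense rate among above- or below-average sentences."""
    cutoff = sum(s for s, _ in rows) / len(rows)
    group = [r for s, r in rows if (s > cutoff) == long_sentence]
    return sum(group) / len(group)

for world in (world_risk_drives_sentence, world_sentence_drives_risk):
    rows = world(5000, random.Random(0))
    # In both worlds, longer sentences co-occur with more reoffense —
    # the data alone cannot say which way the causation runs.
    print(world.__name__, reoffense_rate(rows, True) > reoffense_rate(rows, False))
```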