Photo : Peter Macdiarmid ( Getty Images )

Correctional facilities across the nation have been struggling with overcrowded prisons. That was a major problem all on its own, but it has become exacerbated by the coronavirus pandemic, which threatens inmates and prison staff with exposure to infection.


On Thursday, U.S. Attorney General William Barr ordered the federal Bureau of Prisons to release a number of inmates, specifically those who are sick or elderly, to home confinement.

There are “at-risk inmates who are non-violent and pose minimal likelihood of recidivism and who might be safer serving their sentences in home confinement,” Barr wrote in a two-page memo to the BOP.


It is a measure that is largely welcomed by corrections officials as well as at-risk inmates and their families, but reporters and researchers from The Marshall Project, a nonprofit news organization specializing in the U.S. criminal justice system, fear that Barr’s plan may exclude certain prisoners and that those prisoners will likely be disproportionately people of color.

That’s because it instructs the prison system to prioritize for release only those prisoners who receive the minimum possible score on a “risk assessment” algorithm called PATTERN. This computerized rating system, which has never been used before, deems white-collar offenders, who are disproportionately white, generally safe to be let out of prison. But it does not deem it safe to release drug offenders with a history of prior arrests, who are disproportionately black due in part to the biased policing practices of the War on Drugs. Only 7 percent of black men in federal prisons would be considered low-risk enough to get out using PATTERN, compared with 30 percent of white men, according to an internal assessment conducted by the Justice Department last year.

According to Barr’s memo, the Justice Department’s policy also excludes non-citizens convicted of immigration-related crimes, barring them from eligibility for home-confinement release. (And I think we all know that white immigrants aren’t the ones most likely to be targeted by law enforcement.)

Sakira Cook, director of the Justice Reform Program at The Leadership Conference on Civil and Human Rights, told TMP that she doesn’t believe enough is being done.


“The Trump administration keeps touting its commitment to criminal justice reform, but its attorney general is not using his discretion in any way to ensure the health and safety of people in the system—including prison guards, who are also at risk due to overcrowding,” said Cook.

The PATTERN risk-assessment algorithm was originally created as part of the First Step Act, a bill President Trump signed into law in 2018 that was aimed at reducing the federal prison population. Trump has touted the law as successfully executed prison reform and as the reason black voters should be falling in line behind his reelection.


But according to the National Institute of Justice, a research wing of the Justice Department, the algorithm was never fully tested or independently reviewed before it was put into use.

From TMP:

Risk-assessment algorithms are computerized versions of the appraisals that judges make every day: considering a person’s criminal history and demographics to score them on their likelihood of committing another crime. But these mathematical tools have faced criticism for giving the appearance of being colorblind—it is a computer making the decisions about whom to release, after all—while in fact exacerbating racial disparities. “There’s racial bias inherent at every step of the criminal justice system, from policing to prosecution to sentencing, so an algorithm that uses those things to determine whether to release you is not going to be fair to people of color,” said Kara Gotsch, director of strategic initiatives for The Sentencing Project.
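To make the mechanism concrete, here is a minimal sketch of how a point-based risk score can launder upstream bias. This is not PATTERN itself; the scoring weights, cutoff, and function names are all hypothetical. The point is that a formula counting prior arrests treats an arrest record as a neutral fact, even when that record reflects who was policed more heavily rather than who actually committed more crimes.

```python
# Hypothetical point-based risk score (NOT the actual PATTERN formula).
# More prior arrests and younger age push the score up; a low score
# makes someone "eligible" for release.

def risk_score(prior_arrests: int, age: int) -> int:
    """Toy score: 2 points per prior arrest, +3 if under 30."""
    score = 2 * prior_arrests
    if age < 30:
        score += 3
    return score

LOW_RISK_CUTOFF = 4  # hypothetical threshold for release eligibility

def eligible_for_release(prior_arrests: int, age: int) -> bool:
    """Release only people at or below the low-risk cutoff."""
    return risk_score(prior_arrests, age) <= LOW_RISK_CUTOFF

# Two people with identical conduct and identical age. If one lived in
# a more heavily policed neighborhood and accumulated more arrests for
# the same behavior, the "colorblind" formula still splits them apart.
print(eligible_for_release(prior_arrests=1, age=45))  # True  (released)
print(eligible_for_release(prior_arrests=4, age=45))  # False (held)
```

The algorithm never sees race as an input, yet its output tracks race anyway, because its main input (arrest history) already does. That is the disparity Gotsch describes.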


Recently, The Root reported on a study showing that the algorithms voice and facial-recognition systems learn from are based on predominantly white databases, causing a significantly larger error rate for automated systems transcribing black voices as well as more instances of misidentification. While this isn’t nearly as serious as racial bias in deciding who is and isn’t released from prison, it illustrates how relying on computerized assessments to make human decisions can put people of color at higher risk for unfair treatment than their white counterparts.

Gotsch believes that the use of the PATTERN assessment will likely ignore important factors such as age, health, vulnerability to infection, behavior while in prison and the length of time already served in prison. Some of these things are mentioned in Barr’s memo, but it is still believed that inmates will be unfairly ruled against and denied eligibility for release.


On top of all that, the new policy requires that inmates who are selected for home-confinement release first spend 14 days in quarantine before they are let out of prison, leaving prison release advocates and critics of the new policy with the same question many of you reading this are probably asking yourselves: How is that not just solitary confinement?

Despite concerns, Barr insists that the criteria for which inmates may become eligible for release are solid. Further, he implies that any expansion of the criteria may put the general public at risk.


“While we have an obligation to protect B.O.P. personnel and the people in B.O.P. custody, we also have an obligation to protect the public,” Barr wrote.

But, as long as it’s exclusively non-violent inmates who are being released, how could it hurt to scrutinize the process by which they’re selected and ensure we’re not allowing an algorithm to decide what’s most important?