After consulting the scores in a pretrial hearing, a judge decided to release Hubbard with monitoring. He was able to keep his job. “Relieved would be an understatement,” he told me, describing how he’d felt. “It was fantastic.”

As calls to abolish cash bail grow louder across the country—from politicians, activists, and even some in corporate America—an increasing number of jurisdictions are transitioning away from a monetary system to one dependent on risk-assessment algorithms like the PSA. California, a state that has historically set some of the highest bail in the country, passed a bill doing so last month. Risk algorithms can be used at various junctures—from sentencing to parole hearings—but their use pretrial has garnered the most attention. Some have heralded these algorithms as agents of change, tools that help abolish a system that, in the words of Senator Bernie Sanders, creates “modern-day debtors’ prisons.” New Jersey’s reforms have seemed particularly successful: The Pretrial Justice Institute, a national nonprofit that studies bail practices, has given the state an A score, a grade no other state received.

But there’s a problem: Even as the algorithms are praised for minimizing cash bail and the inequality it creates, an increasing number of civil-rights activists worry they perpetuate racial disparities within the criminal-justice system. In July, more than 100 civil-rights groups, including the ACLU, signed a statement of concern urging jurisdictions to stop using the tools. In the same missive, they outlined how to properly implement the algorithms if states do decide to use them. As more states turn a critical eye toward their own systems—and face public pressure to avoid certain reforms—they may have to decide between implementing a technology that could be corrupted, sticking with a (reformed) cash system, or pursuing a new form of justice that doesn’t depend on either.

Activists argue that the algorithms are fundamentally flawed because the data they use to predict a person’s risk could be influenced by structural racism: The number of times someone has been convicted of a crime, for example, or their failure to appear in court could both be affected by racial bias. As a result, they say, any bias that’s baked into the data is replicated by the algorithms, but with the veneer of scientific objectivity.

“My concern [about using the tools] is that what you could have is essentially racial profiling 2.0,” said Vincent Southerland, the executive director of the Center on Race, Inequality, and the Law at the New York University School of Law, which signed onto the statement. “We’re forecasting what some individuals may do based on what groups they’re associated with have done in the past.” Some activists also worry that even in jurisdictions that have adopted the tools in good faith, judges may not follow their suggestions in setting bail or other pretrial conditions, and the new systems may go unscrutinized because communities assume any problems have been fixed.