In Los Angeles, an algorithm helps decide who, out of 58,000 homeless people, gets access to a small amount of available housing. In Indiana, the state used a computer system to flag any mistake on an application for food stamps, healthcare, or cash benefits as a “failure to cooperate”; 1 million people lost benefits. In Pittsburgh, a child protection agency is using an algorithm to try to predict future child abuse, despite the algorithm’s problems with accuracy.

In a new book, Automating Inequality, Virginia Eubanks calls these examples of the digital poorhouse: tech-filled systems that grow out of a long history of cultural assumptions about what it means to be poor. In the 1800s, when actual, prison-like poorhouses were common, some politicians embraced the idea that people should receive assistance only if they were willing to live in the poorhouse. The conditions were so bad, the thinking went, that they would discourage the “undeserving” poor (those seen as not working hard enough) from supposedly taking advantage of the system. By the late 1800s, the “scientific charity” movement had started collecting data and opening investigative cases to decide who was deserving and who was not.

New technology used in public services, Eubanks argues, comes out of the same old thinking. “It’s really important to understand that these tools are more evolution than revolution, even though we talk about them often as disruptors,” she says.

She started thinking about this use of technology in 2000, while talking with a young mother on public assistance who was using an EBT card, a payment card for food or cash benefits. “At the time, they were fairly new, and people were pretty excited about them, rationalizing that they were easier to use, there’s less stigma, you look like every other shopper when you’re in the grocery store,” Eubanks says. “She said, ‘Yeah, all of those things are true, and it’s more convenient in a lot of ways, but also, my caseworker uses it to track all of my purchases.’”

The push to automate and computerize public services began early, in the late 1960s and the 1970s. Eubanks believes it was a direct response to a national movement for welfare rights: people who had been barred from getting welfare in the past, like people of color or never-married mothers, were suddenly able to participate. Some older rules were overturned in the courts, like the “substitute father” rule, which held that a mother on public assistance should lose that support if she was in a relationship with a man (the rule led to welfare workers invading homes in the middle of the night to check beds for boyfriends). As welfare rights grew, so did a backlash. Technology, touted as a way to distribute aid more efficiently, began to serve as a barrier to limit the number of people getting support.

“I think that these technologies certainly were used to create efficiencies and to ease administrative burdens, but they were also used to help us avoid the very difficult political conversation that we needed to have at that moment,” Eubanks says. Instead of talking about how to deal with economic inequality or the automation of jobs, “We replaced them with a set of questions that are really systems engineering questions: How do you get the most output out of the least input? How do you identify fraud and divert people from eligibility?”

In some cases, the intent to limit aid is more obvious. In Indiana, Republican governor Mitch Daniels launched a welfare reform program in 2006 that he argued would “clean up welfare waste” by automating and privatizing eligibility processes and rejecting any applications with errors. In one family, a six-year-old girl with cerebral palsy was on Medicaid. When her parents applied for health insurance of their own, and then temporarily put that application on hold, the system automatically considered that a mistake, and punished the family by canceling Medicaid for all their children. Because of the automated system, denials of food stamps, healthcare, and cash benefits in the state increased by 54%. Food banks ran out of food. Poor and working-class families organized, and the governor finally admitted that the program was a “flawed concept” and switched to a hybrid eligibility system that uses caseworkers along with automation (the new system was still flawed).