The most frightening thing about the Centrelink debt debacle is the verve with which the government embraced it.

What Morrison and Porter promised was an automated system that would issue Centrelink debt notices "better" than human beings. Humans did the job extremely well. A former Centrelink worker with 30 years' experience says they would "look at start dates for employment that customers had declared, see if it was the same for the employer [using Tax Office records] and roughly work out if it lined up."

"If it looked as if a person had possibly been overpaid they would write to the customer and ask them to call and tease out where the discrepancy was, and ask for proof, if it was still available, in the form of things such as payslips. If the customer didn't have them and it looked like there was a possibility of an overpayment, they would write to the employer to ask for the information. If evidence was collected that the customer had not declared the income correctly and a debt existed, then the debt calculator would raise the debt in accordance with the legislation and the customer would be written to."

What's important in this description is that the humans charged with applying the law didn't issue debt notices unless they had evidence that a debt existed. To do so without evidence would be to break the law.

But a wrongly-programmed computer need have no such scruples. Even better, its decisions can be presented as objective, and so hard to overturn. Data scientist Cathy O'Neil outlines scores of examples in her new book Weapons of Math Destruction: from the systems used by credit rating agencies in the lead-up to the global financial crisis, to systems that automatically select teachers for the sack on the basis of secret algorithms that grade performance, to systems that deny people job interviews on the basis of proxies for mental health, even though that's illegal.

They are used because they are quick rather than accurate. As an expert told O'Neil, the primary purpose of a workplace hiring system is "not to find the best employee, but to exclude as many people as possible as cheaply as possible". By necessity, they do it unfairly. People who are wise to the systems will mention the right words in job applications to get to the top of the pile. As she says, they are usually not from racial and ethnic minorities. Here, it's the persistent and well-resourced people who get the better of Centrelink. They are unlikely to be the hardest up.

Many of the automated systems are malicious, created to do harm in the guise of providing a service. The formula used by Centrelink produces consistently false estimates of debts: it divides the annual wages employers report paying by 26, and so overestimates the income received during the smaller number of fortnights in which claimants actually received benefits.
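The arithmetic of the flaw can be sketched in a few lines. The figures below are hypothetical, chosen only to illustrate how averaging an annual wage across all 26 fortnights manufactures "income" in fortnights when a claimant earned nothing:

```python
# A minimal sketch of the averaging flaw, using hypothetical figures.
# Suppose a claimant earned $26,000 across 10 fortnights of work, then
# was on benefits for the other 16 fortnights of the year, correctly
# declaring zero income in each of them.

ANNUAL_WAGES = 26_000        # what the employer reports for the year
FORTNIGHTS_PER_YEAR = 26
BENEFIT_FORTNIGHTS = 16      # the claimant's true income here was $0

# The averaging approach spreads the annual figure evenly across
# every fortnight of the year...
averaged_income = ANNUAL_WAGES / FORTNIGHTS_PER_YEAR   # $1,000

# ...which attributes $1,000 of "income" to each benefit fortnight,
# even though the declared (and true) income in those weeks was zero,
# producing an apparent undeclared amount and hence an apparent debt.
apparent_undeclared = averaged_income * BENEFIT_FORTNIGHTS

print(f"Averaged fortnightly income: ${averaged_income:,.0f}")
print(f"Falsely attributed to benefit fortnights: ${apparent_undeclared:,.0f}")
```

Run on these numbers, the averaging attributes $16,000 of phantom income to fortnights in which the claimant was entitled to every dollar received.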

The formulas used by for-profit colleges in the US to target internet advertising zero in on single mothers of colour who are poor enough to earn the colleges' valuable subsidies and ill-informed enough not to twig to the debt. The formulas O'Neil herself worked on in financial markets presented securities as safe that weren't.

O'Neil says that to be a "weapon of math destruction" a formula has to be used en masse (as the Centrelink formula will be), it has to be difficult to question (as the Centrelink formula will be for many people) and it has to cause damage (as the Centrelink formula is doing).

That isn't to say that credit risk and Centrelink and other software can't be designed to do the job better than humans. It's a worthy aim, one Morrison and Porter apparently thought they had achieved. The man Turnbull hired to prevent such stuff-ups describes what happened as "cataclysmic". Paul Shetler left the prime minister's Digital Transformation Office in November as the Centrelink debt collection program gathered pace.