The stocktake of algorithms in the public sector was designed to shine a light on the type of decisions that are now computer-assisted.

Computer algorithms are being used by government agencies to help make a wide range of potentially life-changing decisions about the treatment of Kiwis, according to a new report.

Police are using software to help predict whether people with a history of family violence are likely to commit crimes against their victims within two years.

Software is also being used by the Corrections Department to forecast the likelihood of inmates re-offending when they come up for parole hearings.

Computer forecasts help determine how vigorously visa applicants are screened and whether people are offered automatic tax refunds by Inland Revenue.

But humans, rather than computers, have the final say on "almost all" decisions made by the public service, the report found.

An exception appeared to be the Social Development Ministry's automatic referral of more than 20,000 people aged 15 to 24 for help with qualifications or training, after a computer algorithm called NEET assessed them as being at risk of becoming long-term unemployed.

The report said police used a "static risk" algorithm to calculate the probability that a "family violence perpetrator" would commit a crime against a family member within two years, based on data held by police that included the person's gender, past incidents of family harm and criminal history.

But all final decisions about "actions and interventions" were made at the discretion of police officers, it said.

Factors used to calculate the chances of offenders re-offending when they came up for parole included details of their prior offending, their age and gender, and the age at which they first offended.

"The risk scores generated by the algorithm are considered together with the opinions of relevant qualified professionals including case managers, probation officers and psychologists," the report said.


The algorithm "stocktake" was ordered by former Digital Services Minister Clare Curran in May in the wake of concerns – denied by Immigration NZ – that it had been using algorithms to prioritise the deportation of overstayers based on factors including race.

A new European privacy law, the General Data Protection Regulation, came into effect this year, giving Europeans the right to an explanation when automated decisions are made about them.

It also gives people the right to have a human involved in any "significant" decision affecting them, unless there are suitable legal safeguards.

Before Curran stepped down as minister in September, she had been leading a project by the D7 group of digitally advanced nations, which includes New Zealand, that could see Kiwis get similar rights.

Otago University professor Colin Gavaghan, who has been assisting the Government with that work, has cautioned that in order for such safeguards to be effective, people need to know algorithms are being used in decisions affecting them in the first place.

Government "chief data steward" Liz MacPherson said the report showed how algorithms were helping agencies deliver better policies and services "but it also reminds us of the need to take care in their use".

"There's plenty of scope to lift our game," she said.

Recommendations in the report included maintaining human oversight of decisions, "promoting transparency and awareness", regularly reviewing algorithms that informed significant decisions, and "monitoring for adverse effects".