Victoria Police are also not forthcoming about any plans to expand facial recognition technology more broadly through other types of surveillance equipment.

In July, Police Minister Lisa Neville unveiled a squad of 50 new “eye in the sky” drones, fitted with 360-degree cameras, that police can use for search-and-rescue missions, crime prevention and counter-terrorism. At the time police did not rule out combining the aerial devices with facial recognition software, saying there was “certainly the opportunity” to do so in the future.

A government spokeswoman said Victoria Police “have a range of methods in place to search for known offenders for investigative and intelligence gathering purposes” but there were currently no plans to use facial recognition technology on the new fleet of drones.

However, the growth of such technology appears inevitable, with the spokeswoman telling The Age: “There is ongoing work across law enforcement agencies and Australian jurisdictions, including the Commonwealth government, on the future use of emerging technologies to assist with community safety and law enforcement.”

Human rights and privacy advocates say there is not enough transparency surrounding the network, and some have called for tighter safeguards. Overseas trials have highlighted privacy concerns, as well as large numbers of mismatches known as “false positives” and a higher tendency to misidentify ethnic minorities and women.

London’s Metropolitan Police used facial recognition at the city’s Notting Hill Carnival in 2016 and 2017, and at a Remembrance Sunday event, but its system incorrectly flagged 102 people as potential suspects.

And in the US, Axon, which manufactures the body-worn cameras used by Victoria Police, recently considered fitting its products with artificial intelligence and facial recognition capabilities, until its own ethics board warned against it.

Debate over artificial intelligence and facial recognition reignited this week when Australia’s Human Rights Commission called for a moratorium on the use of some technologies until there is a legal framework to safeguard human rights.

Victoria Police declined to discuss the rate of false positives, other than to insist that “since the rollout of these [iFace] cameras in 2015, police have always had the ability to override any decisions made by the system at any time”.

“During offender processing at locations where an iFace camera is in use, there are a range of techniques in place to prevent someone being linked to the wrong image,” the spokeswoman said.

Victoria Police’s iFace program uses algorithms to measure features such as face width and the distances between the nose, eyes and mouth, before comparing the image against the facial characteristics of known offenders to generate a match. It currently has no capability to run searches against CCTV footage or video, police say, because “all searches require a still image and are performed after an offence has occurred”.

Advocates have also called for safeguards to ensure that vulnerable people are not unfairly targeted. “Surveillance and tracking technology is susceptible to existing biases and prejudices,” said Anthony Kelly, who heads the Police Accountability Project at the Flemington Kensington Community Legal Centre.
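The kind of geometric matching described above can be sketched in broad strokes. The following is a simplified illustration only, not Victoria Police’s actual iFace algorithm: it assumes facial landmarks (the hypothetical `left_eye`, `nose`, `mouth` points and so on) have already been detected by some upstream system, converts them into scale-invariant distance ratios, and compares the result against a gallery of known faces, returning no match when nothing falls under a chosen threshold.

```python
import math

def feature_vector(landmarks):
    """Turn detected facial landmarks (x, y points) into scale-invariant
    ratios, loosely analogous to the measurements the article mentions:
    face width and the distances between nose, eyes and mouth."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    face_width = dist(landmarks["left_cheek"], landmarks["right_cheek"])
    eye_gap    = dist(landmarks["left_eye"], landmarks["right_eye"])
    nose_mouth = dist(landmarks["nose"], landmarks["mouth"])
    eye_nose   = dist(landmarks["left_eye"], landmarks["nose"])
    # Divide by face width so the vector does not depend on image scale.
    return [eye_gap / face_width,
            nose_mouth / face_width,
            eye_nose / face_width]

def best_match(probe, gallery, threshold=0.05):
    """Return the gallery identity whose feature vector is closest to the
    probe (Euclidean distance), or None when nothing is close enough --
    a crude stand-in for the 'no match' decision that, per the article,
    a human operator can always override."""
    best_name, best_score = None, float("inf")
    for name, vec in gallery.items():
        score = math.dist(probe, vec)
        if score < best_score:
            best_name, best_score = name, score
    return best_name if best_score <= threshold else None
```

Real systems use far richer representations (learned embeddings rather than a handful of hand-picked ratios), but the threshold decision at the end is where false positives arise: set it too loosely and unrelated faces are flagged as matches.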

“It will inevitably be pointed towards and most impact those who are already targeted by police – the poor, the mentally ill and people who are politically active.”

As the use of AI technology expands, some point to the risks in China – which is creating a mass government surveillance system that can match faces to a database of 1.3 billion ID photos in seconds – as a cautionary tale.

“The potential human rights impact is enormous and unprecedented,” the Australian Human Rights Commission wrote in a recent discussion paper. “AI, for example, can have far-reaching and irreversible consequences for how we protect privacy, how we combat discrimination and how we deliver health care — to name only three areas.”