Australian Federal Police (AFP) officers trialled controversial facial recognition technology Clearview AI from late 2019, despite the agency initially denying any association with the company.

Key points:

- Australian Federal Police officers have used facial recognition technology Clearview AI
- The confirmation comes despite the agency earlier denying any links with the controversial company
- Labor leaders and digital rights advocates are concerned about the agency's lack of transparency

Founded by Australian Hoan Ton-That, the New York-based start-up claims to have created an unprecedented database that contains billions of photos scraped from platforms like Facebook and Instagram and even employment websites.

The tool allows those with an account to scan a photo of an unknown person and locate additional images and identifying information about them from across the internet.

In response to a question on notice from Shadow Attorney-General Mark Dreyfus in February, the law enforcement agency admitted on Tuesday that officers had used the face-matching software.

Between November 2, 2019 and January 22, 2020, members of the AFP-led Australian Centre to Counter Child Exploitation (ACCCE) registered for free trials and undertook searches, raising questions about how their activities were supervised and why AFP leadership were not aware.

Labor leaders including Mr Dreyfus called on Home Affairs Minister Peter Dutton to explain whether he knew AFP officers were using what they called "a deeply problematic service".

"The Home Affairs Minister must explain whether the use of Clearview without legal authorisation has jeopardised AFP investigations into child exploitation," they said in a statement.

"The use by AFP officers of private services to conduct official AFP investigations in the absence of any formal agreement or assessment as to the system's integrity or security is concerning."

Seven AFP officers used Clearview AI

This is the first official confirmation that any Australian law enforcement body has taken up Clearview AI's software.

The company founder Mr Ton-That told the ABC in January he had "a few customers in Australia who are piloting the tool, especially around child exploitation cases", but at the time state and federal police forces either denied they used the technology or declined to comment.

The company issued nine invitations to AFP officers for a "limited pilot", according to the agency. Seven activated the trial and conducted searches.

"These searches included images of known individuals and unknown individuals related to current or past investigations relating to child exploitation," the AFP said.

The Clearview AI app was provided to Australian law enforcement on a free trial basis "with the best of intentions", Mr Ton-That said in a statement.

"At this time, the app is no longer operational in Australia," he said. "We will respond to government inquiries concerning the use of our highly effective, ground-breaking technology, about which there has been worldwide interest."

In February, internal company documents obtained by BuzzFeed News reportedly showed several registrations linked to AFP email addresses, as well as to state police forces.

Hoan Ton-That is the founder of Clearview AI, which has supplied hundreds of police departments with facial recognition technology. (Supplied)

Concerns grow about facial-recognition transparency

In January, The New York Times reported that Clearview AI was in use by police forces across the United States, sparking a conversation about privacy and the collection, without consent, of images shared by users of Facebook and other platforms for face-matching systems.

Digital rights groups have also voiced concerns about the lack of regulation and limited transparency around police use of facial recognition technology in Australia.

The AFP rejected several Freedom of Information requests related to Clearview AI, saying no relevant documents existed, before later confirming the ACCCE did, in fact, hold information about the company.

David Paris, campaign manager at Digital Rights Watch, said it was "deeply concerning" the AFP was not previously aware of the conduct of its own staff.

It is not clear how the trial was overseen, or whether a privacy or security assessment of the software was undertaken.

"This suggests that the AFP has broken Australia's accountability and oversight mechanisms and is unfit to be trusted with our personal data," Mr Paris said.

Despite the trial undertaken by its officers, the AFP told Mr Dreyfus it had not formally adopted the software more broadly.

In a statement, an AFP spokesperson said it had no plans to do so.

"The AFP is in the process of undertaking an internal governance review, which may inform future trials of software-related capabilities," the spokesperson added.

The Office of the Australian Information Commissioner (OAIC) is also making inquiries with Clearview AI about whether it holds personal information about Australians, and if it is being used in Australia.

The Home Affairs Minister's office and the OAIC were approached for comment.