The federal government’s proposed new powers to undertake facial recognition identification and surveillance are dangerously overbroad, and could dramatically alter the freedom of ordinary people going about their daily lives, the Human Rights Law Centre told a Parliamentary committee today.

New laws proposed by the Department of Home Affairs would authorise the creation of a “dragnet database”, compiling images of innocent Australians – including children – from their drivers’ licences, identification cards and passport photos.

Dr Aruna Sathanapally, Director of Legal Advocacy, said the proposed database would, startlingly, include images of the vast majority of Australians.

“In other countries, there is serious debate about the police retaining the images of innocent people. Yet, here in Australia, our government is proposing letting not only police, but government departments, local councils, transport authorities and even private companies, access and search for matches across a database that will collate Australians’ personal information, linked to a biometric profile of their face.”

The proposed laws would allow the technology to be used for a broad range of purposes, including investigating offences as minor as a parking infringement. The Human Rights Law Centre questioned the lack of evidence justifying these expansive new powers and the absence of detail about how the government proposes to regulate facial recognition capabilities.

“Facial recognition and biometric technology threatens to outpace the laws we have in place. What the Government is proposing would effectively leave the rules governing new, powerful forms of surveillance to be worked out by the Home Affairs department and in the hands of the Home Affairs Minister. Frankly, this isn’t good enough for such a dramatic new set of powers, not in a democracy,” Dr Sathanapally said.

In addition to major privacy concerns, the Human Rights Law Centre highlighted significant risks to freedom of expression and other democratic rights.

“Those attending a public meeting, a protest or a vigil should not have to reveal their identities to exercise their democratic right to engage with others and gather peacefully,” said Dr Sathanapally.

Ultimately, it is not even clear how useful facial recognition technology is likely to be, even when used to identify serious criminals. International trials of facial recognition technology have produced high rates of mistaken identification – with up to 98% of results wrongly identifying people as criminals in a recent police operation at Notting Hill Carnival in London.

“We need to tread carefully with technology that carries high rates of false matches in general, and in particular, when faced with an individual belonging to a minority ethnic group. When false matches are made, innocent people will suffer the consequences, and already disadvantaged minorities may bear the brunt of this. These new tools risk eroding trust in government agencies, law enforcement and security agencies unless we have robust testing and safeguards in place.”

The Human Rights Law Centre’s submission is available here.