Armed with new research, civil rights organizations are urging the Justice Department to investigate law enforcement’s excessive use of face recognition technologies.

A coalition of more than 50 civil rights organizations asked the Department of Justice today to investigate the use of face recognition technology by local and state police and the FBI, after a report by the Georgetown Center on Privacy & Technology revealed that, in certain jurisdictions, half of adult Americans have had their images scanned by authorities’ face recognition software.

The Georgetown report also indicates that there is no significant regulation of the use of this technology, that the software can be inaccurate, and that the use of face recognition will disproportionately impact communities of color.

“Face recognition technology is rapidly being interconnected with everyday police activities, impacting virtually every jurisdiction in America. Yet, the safeguards to ensure this technology is being used fairly and responsibly appear to be virtually nonexistent,” the organizations wrote in a letter to the Civil Rights Division of the Justice Department.

In the letter, the groups urged DOJ to focus its investigation on police departments that are already under investigation for biased policing practices and to ensure that neither federal nor local databases are biased against people of color.

“This technology is not limited to serious criminals. It’s not limited at all, really,” said Alvaro Bedoya, the executive director of the Center on Privacy & Technology and a co-author of the report. “By standing for a driver’s license photo, you can be searched by the police or the FBI. This is not business as usual.”

The report, titled “The Perpetual Lineup,” draws its information from public records requests to 106 police departments across the country. The Georgetown researchers found only one department that had received legislative approval for its use of face recognition technology and that most departments do not audit their systems for accuracy or bias.

“The Perpetual Lineup” builds on 2012 research that showed face recognition systems were less accurate on people of color, women, and young people.

“Commercial face recognition systems are 5 to 10 percent less accurate on African Americans than Caucasians,” Bedoya said, adding that the National Institute of Standards and Technology had only tested for racial bias in facial recognition once and that two leading companies interviewed by Georgetown researchers could not identify any tests they had run on their own software to measure racial bias.

Despite these accuracy problems, the report points out that face recognition systems are already widely in use by American law enforcement agencies, some of which run recognition algorithms in real time on crowded streets, scanning people’s faces without their knowledge. “We have to ask ourselves, ‘Does that look like America?’” Bedoya said.

The ACLU, which is one of the organizations involved in the letter to the Justice Department, questioned the use of face recognition technology during protests.

An ACLU report last week showed that police in Baltimore had used face recognition software in combination with social media posts to identify and arrest people who participated in a Freddie Gray protest.

Neema Singh Guliani, the ACLU’s legislative counsel, said that such uses of face recognition technology could chill speech and explained that the ACLU is asking governments to “push a pause button and issue moratoriums on certain uses of this technology.”