At a congressional hearing on the use of facial recognition technology on Wednesday, Rep. Alexandria Ocasio-Cortez (D-N.Y.) drilled down into how the technology can “exacerbate” racial bias in the criminal justice system.

At the hearing before the House’s Oversight Committee, Ocasio-Cortez asked a series of pointed questions of one of the expert witnesses, Joy Buolamwini, founder of the Algorithmic Justice League, an advocacy group for “more inclusive and ethical” artificial intelligence.

“We saw that these algorithms are effective to different degrees,” Ocasio-Cortez said (starting at about 3:51 in C-SPAN’s video of the hearing). “So are they most effective on women?”

“So we have a technology that was created and designed on one demographic that is only mostly effective on that one demographic, and they’re trying to sell it and impose it on the entirety of the country?” Ocasio-Cortez continued. “Do you think this could exacerbate the already egregious inequalities in our criminal justice system?”

When tech companies develop algorithms that automate the assumptions of people from one demo, they start to automate subconscious bias. When those flawed algorithms get sold to law enforcement, it can be disastrous for anyone that doesn’t look like most Silicon Valley engineers. https://t.co/SKxhYUgZEj

On Wednesday, Amazon shareholders voted at the company’s annual meeting against a proposal to ban sales of its facial recognition technology, Rekognition, to governments and law enforcement.

Facial recognition technology has drawn scrutiny for its flaws, notably for frequently misidentifying darker-skinned people. In one high-profile test by the American Civil Liberties Union last year, Amazon’s facial recognition tool incorrectly matched the faces of 28 lawmakers with people in mug shots, disproportionately misidentifying people of color.

Last week, San Francisco became the first major U.S. city to ban the use of facial recognition technology by city agencies, including law enforcement, for this reason, among others.

“The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits,” the San Francisco ordinance reads. “And the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring.”

At the hearing, Buolamwini pointed to how facial recognition technology, in the hands of law enforcement, could worsen the racial biases already existing in the criminal justice system.

Black drivers, for instance, are about 20% more likely to be stopped by police than white drivers, per a study published in March by Stanford University’s Open Policing Project. And when it comes to searches, although police were more likely to find drugs, guns or other contraband in stops of white drivers, black drivers were searched about 1.5 to 2 times as often.

“If you have a case where they’re thinking about putting, let’s say, facial recognition technology on police body cams, in a situation where you already have racial bias, that can be used to confirm the presumption of guilt ― even if that hasn’t necessarily been proven,” Buolamwini said. “Because you have these algorithms that ... fail more on communities of color.”

Earlier this month, a police officer in Houston tried to arrest a black man in his own front yard after mistaking him for a different black man who allegedly had a warrant out for his arrest.

“Doesn’t that look a lot like you?” the officer asked the man in a video of the incident. In the video caption, the man explained that the officer had shown him a photo of another black man with dreadlocks who appeared to be older than he was.

“No, that don’t look like me!” he responded. “What are you trying to say, because I got dreads and I’m black, that’s me?”

The congressional committee is set to hold a second hearing next month on the use of facial recognition by government agencies and companies and the potential need for more oversight, according to a release from the office of the committee chairman, Rep. Elijah Cummings (D-Md.).