Rep. Rashida Tlaib this week toured the Detroit Police Department's real-time crime center, where she called facial recognition technology "broken."

The congresswoman also drew controversy by suggesting that only black analysts should review facial recognition cases, out of concern that black people might be falsely identified.


Here are the key points:

How does facial recognition work?

Facial recognition systems extract a face from a scene and compare it to a database of stored images in order to identify an individual.

The software systems can measure distinguishable landmarks on the face, known as nodal points. Each human face has approximately 80 nodal points. These include the distance between the eyes, width of the nose, shape of the cheekbones and length of the jawline. The measurements are made on a sub-millimeter scale.

Nodal points are used to create a numerical code called a faceprint, which represents the face in the database.
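As an illustration only (not the software Detroit uses), a faceprint can be thought of as a numeric vector of nodal-point measurements, and two faces compared by the distance between their vectors. The measurements, database records, and matching threshold below are all hypothetical:

```python
import math

# Hypothetical sketch: a "faceprint" reduced to a short list of
# nodal-point measurements (e.g. eye distance, nose width, jaw length).
def faceprint_distance(a, b):
    """Euclidean distance between two faceprints; smaller means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, database, threshold=2.0):
    """Return the closest stored record, or None if nothing is near enough."""
    name, dist = min(
        ((name, faceprint_distance(probe, fp)) for name, fp in database.items()),
        key=lambda pair: pair[1],
    )
    return name if dist <= threshold else None

# Toy database of stored faceprints (measurements are made-up numbers).
db = {"record_a": [63.1, 31.0, 118.4], "record_b": [60.2, 34.5, 121.0]}
print(best_match([63.0, 31.2, 118.0], db))  # closest stored face: record_a
```

The threshold is what separates "match" from "no match"; real systems tune it to trade false matches against missed matches.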

3D models can be captured from video. They are considered more accurate than 2D images and can even be used in darkness. A 3D model can identify an individual whose head is turned up to 90 degrees away from the camera; with 2D, the head must be turned no more than 35 degrees.

Read more: How Facial Recognition Systems Work

Facial recognition in Detroit

The Detroit Police Department has been using facial recognition technology for nearly two years, despite concerns from the ACLU and other civil rights groups.

Last month, the Board of Police Commissioners approved guidelines that endorse the use of the technology along with safeguards to prevent its misuse. Police use of facial recognition is now restricted to still photos connected to violent crime and home invasion investigations. Several layers of approval within the department are required before the technology can be used. Use of the technology to identify people at political events like protests is prohibited.

Police Chief James Craig says the technology has already been used to identify suspects in numerous violent crimes.

Rep. Tlaib raises concerns during tour with Detroit police chief

Rep. Rashida Tlaib toured the DPD's real-time crime center Monday night. Craig invited her after the two sparred on Twitter over police use of facial recognition technology, which Tlaib opposes.

This is how the facial recognition process works, according to one analyst at the center: "There was a shooting at the location. They said they have video of the incident, so we got a copy of the video and saw where we could get the best screenshot of the suspect. Once we run it through the software, it spits out all these mugshots. There is roughly about 178 mugshots that come up. Then it is up to us as analysts, being the human component, to look through and actually identify the person, if we can."
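The workflow the analyst describes (a probe image goes in, a ranked list of roughly 178 mugshots comes out, and a human makes the final call) can be sketched as a simple candidate-retrieval step. The `similarity` function and toy gallery below are placeholders, not the department's actual software:

```python
# Sketch of the retrieval step the analyst describes: the software only
# ranks candidates by similarity; a human analyst makes any identification.
def candidate_mugshots(probe, gallery, similarity, limit=178):
    """Return up to `limit` gallery entries, best match first.
    The output is a list of investigative leads, not identifications."""
    ranked = sorted(gallery, key=lambda shot: similarity(probe, shot), reverse=True)
    return ranked[:limit]

# Toy example: "faceprints" are single numbers, similarity is closeness.
gallery = [1.0, 4.8, 9.0]
closeness = lambda probe, shot: -abs(probe - shot)
print(candidate_mugshots(5.0, gallery, closeness, limit=2))  # [4.8, 1.0]
```

Capping the list (here at 178, the number the analyst mentions) is what forces a human review step: the software never returns a single definitive answer.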

During her tour, Tlaib suggested that only black analysts should review cases, concerned that black people might be falsely identified by the technology. "I think non-African Americans think African Americans all look the same," Tlaib told police. Tlaib said Wednesday she stands by her comments "that facial recognition technology is broken, and that relying on an analyst that might not be reflective of the suspect is flawed in itself as well."

Is facial recognition technology "broken"?

Tlaib's comments spurred accusations of racism against the congresswoman, but there is evidence that facial recognition technology is more likely to misidentify people of color.

US government tests have shown that facial recognition systems misidentify black people five to 10 times more often than white people, according to a July report published by the National Institute of Standards and Technology. Algorithms designed by the French company Idemia falsely matched white women's faces at a rate of one in 10,000, but falsely matched black women's faces at a rate of one in 1,000. The NIST report covered tests of algorithms from 50 companies and consistently found that the algorithms had a harder time recognizing people with darker skin. White males had the lowest false match rate and black females the highest, according to the report.
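To see why those per-comparison rates matter in practice, note that a single search compares the probe against every image in a gallery, so the expected number of false matches grows with gallery size. The gallery size below is an assumption for illustration, not the size of any actual police database:

```python
# Back-of-the-envelope sketch: a per-comparison false match rate (FMR),
# multiplied across a whole gallery, yields the expected false matches
# from a single search.
def expected_false_matches(fmr, gallery_size):
    """Expected number of gallery entries falsely matched in one search."""
    return fmr * gallery_size

gallery = 500_000  # assumed gallery size, for illustration only
print(expected_false_matches(1 / 10_000, gallery))  # rate reported for white women -> 50.0
print(expected_false_matches(1 / 1_000, gallery))   # rate reported for black women -> 500.0
```

At the same gallery size, the tenfold difference in per-comparison rates translates directly into ten times as many innocent candidates surfaced per search.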

An MIT study published last year found that gender was far more likely to be misidentified in darker-skinned females: it was misidentified in 35 percent of a set of 271 photos of darker-skinned females, but in no more than one percent of a set of 385 photos of lighter-skinned males. The reason may be that the data used to train facial recognition systems over-represents males and people with lighter skin, the study suggested.

In July, the journal Proceedings of the National Academy of Sciences published a study showing that the face-recognition part of white people's brains showed more activity when viewing white faces than when viewing black faces.

Facial recognition around the country