The majority of commercial facial-recognition systems exhibit bias, according to a study released on Thursday by a federal agency, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.

The systems falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. Among photos in a database used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.

The technology also had more difficulty identifying women than men. And it falsely identified older adults up to 10 times more often than middle-aged adults.

The new report comes at a time of mounting concern among lawmakers and civil rights groups over the proliferation of facial recognition. Proponents view it as an important tool for catching criminals and tracking terrorists. Tech companies market it as a convenience that can help identify people in photos or unlock smartphones in lieu of a password.