Washington, DC (CNN Business) Federal researchers have found widespread evidence of racial bias in nearly 200 facial recognition algorithms in an extensive government study, highlighting the technology's shortcomings and potential for misuse.

Racial minorities were far more likely than whites to be misidentified in the US government's testing, the study found, raising fresh concerns about the software's impartiality even as more government agencies at the city, state and federal level clamor to use it.

In a release, Patrick Grother, one of the researchers behind the report, said race-based biases were evident in "the majority of the face recognition algorithms we studied." Compared with their error rates on white faces, some algorithms were up to 100 times more likely to falsely match two different non-white people.
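The statistic behind that comparison is the false match rate: the share of comparisons between two different people that an algorithm wrongly declares a match, measured separately per demographic group, with the "100 times" figure being the ratio between groups. The sketch below illustrates the arithmetic only; the group names, scores and threshold are invented for illustration and are not drawn from the NIST report.

```python
def false_match_rate(scores, threshold):
    """Fraction of impostor comparison scores (pairs of different
    people) at or above the match threshold."""
    return sum(s >= threshold for s in scores) / len(scores)

# Hypothetical similarity scores for impostor pairs in two groups.
impostor_scores = {
    "group_a": [0.12, 0.31, 0.08, 0.95, 0.22],  # 1 of 5 exceeds 0.9
    "group_b": [0.91, 0.94, 0.15, 0.97, 0.93],  # 4 of 5 exceed 0.9
}

threshold = 0.9
fmr = {g: false_match_rate(s, threshold) for g, s in impostor_scores.items()}
print(fmr)                                       # {'group_a': 0.2, 'group_b': 0.8}
print("disparity ratio:", fmr["group_b"] / fmr["group_a"])  # 4.0
```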

Asians, blacks and Native Americans were particularly likely to be misidentified, said the National Institute of Standards and Technology, a branch of the Commerce Department, which published the report on Thursday.

In another test, black women were more likely than other groups to be falsely matched against a large database of mugshots maintained by the FBI, offering a glimpse of how the technology could be misused by law enforcement.
