Detectives in New York City have been misusing facial-recognition software, feeding it photos of celebrities and fragments of faces pulled from Google searches in order to coax the system into generating suspect matches.

NBC News reported on an analysis by the Georgetown Center on Privacy and Technology, which found that the NYPD is violating accepted standards for the use of facial-recognition tools, one application of artificial intelligence.

In one case, for example, detectives uploaded a photo of actor Woody Harrelson because the image they had of a suspect was too low in quality — but the suspect, the detectives believed, resembled Harrelson. The search on the actor's photo returned a number of possible matches.

In other cases, detectives pasted lips from photos found on the internet onto mugshots so that the suspects' mouths would not appear open in the images.

"It doesn't matter how accurate facial-recognition algorithms are if police are putting very subjective, highly edited or just wrong information into their systems," the report's author Clare Garvie told NBC. "They're not going to get good information out. They're not going to get valuable leads. There's a high risk of misidentification. And it violates due process if they're using it and not sharing it with defense attorneys."

Police across the country now use facial-recognition software to identify suspects in a range of crimes. The tool compares images of potential suspects against databases such as driver's license photos, mugshots, and other publicly available images.

San Francisco voted this week to ban government use of facial-recognition software citywide. Moscow, by contrast, will soon deploy the technology in tandem with the more than 160,000 surveillance cameras that keep watch over the city.