Clearview AI's platform has raised a number of ethical concerns. Police could theoretically use photos and videos from security cameras to uncover your online presence, including details you might not have realized were public. There's also evidence suggesting that some (if not many) facial recognition algorithms exhibit gender and racial biases. And that's assuming the technology works as promised: many security cameras capture footage at angles that make facial recognition difficult.

It's not certain that other law enforcement agencies and attorneys' offices will follow suit. Police have credited the technology with assisting in some past cases, and they may be reluctant to ditch Clearview AI's system even if it risks false matches or exposes sensitive personal details.