They call Amazon the everything store—and Tuesday, the world learned about one of its lesser-known but provocative products. Police departments pay the company to use facial-recognition technology, which Amazon says can “identify persons of interest against a collection of millions of faces in real-time.”

More than two dozen nonprofits wrote to Amazon CEO Jeff Bezos to ask that he stop selling the technology to police, after the ACLU of Northern California released documents shining light on the sales. The letter argues that the technology will inevitably be misused, accusing the company of providing “a powerful surveillance system readily available to violate rights and target communities of color.”

The revelation highlights a key question: What laws or regulations govern police use of facial-recognition technology? The answer: more or less none.

State and federal laws generally leave police departments free to search video or images collected from public cameras for particular faces. Cities and local departments can set their own policies and guidelines, but even some early adopters of the technology haven’t done so.

Documents released by the ACLU show that the city of Orlando, Florida, worked with Amazon to build a system that detects “persons of interest” in real time using eight public-security cameras. “Since this is a pilot program, a policy has not been written,” a city spokesperson said, when asked whether there are formal guidelines around the system’s use.

“This is a perfect example of technology outpacing the law,” says Jennifer Lynch, senior staff attorney at the Electronic Frontier Foundation. “There are no rules.”

Amazon is not the only company operating in this wide-open space. Massachusetts-based MorphoTrust provides facial-recognition technology to the FBI and also markets it to police departments. Detroit police bought similar technology from South Carolina’s Data Works Plus, for a project that looks for violent offenders in footage from gas stations.

The documents released Tuesday provide details about how Orlando and the sheriff’s department of Oregon’s Washington County use Amazon’s facial-recognition technology. Both had previously provided testimonials about the technology for the company’s cloud division.

Orlando got free consulting from Amazon to build out its project, the documents show. In a prior testimonial, Orlando’s chief of police, John Mina, said that the system could improve public safety and “offer operational efficiency opportunities.” However, a city spokesperson told WIRED, “This is very early on, and we don't have data to support that it does or does not work.” The system hasn’t yet been used in investigations, or on imagery of members of the public.

Washington County uses Amazon’s technology to help officers search a database of 300,000 mug shots, using either a desktop computer or a specially built mobile application. Documents obtained by the ACLU also show county employees raising concerns about the security of placing mug shots into Amazon’s cloud storage, and the project being perceived as “the government getting in bed with big data.”

There’s no mention of big data in the US Constitution. It doesn’t provide much protection against facial recognition either, says Jane Bambauer, a law professor at the University of Arizona. Surveillance technologies like wiretaps are covered by the Fourth Amendment’s protections against unreasonable search and seizure, but most police interest in facial recognition is in applying it to imagery gathered lawfully in public or to mug shots.

State laws don’t generally have much to say about police use of facial recognition, either. Illinois and Texas are unusual in having biometric privacy laws that can require companies to obtain permission before collecting and sharing data such as fingerprints and facial data, but they make exceptions for law enforcement. Lynch of the EFF says hearings by the House Oversight Committee last year showed some bipartisan interest in setting limits on law enforcement use of the technology, but the energy dissipated after committee chair Jason Chaffetz resigned last May.