At a time when Big Tech is under attack for cooperating with the military and police, Amazon is facing scrutiny for marketing its surveillance software to U.S. immigration officials.

Driving the news: Amazon has confirmed that it met with Immigration and Customs Enforcement in Silicon Valley over the summer to discuss Rekognition, its controversial facial recognition software. Emails obtained by a watchdog group show that Amazon followed up in an apparent attempt to close a sale of the technology.

Why it matters: Amazon’s entanglements with ICE are becoming increasingly public. Just yesterday, a report published by several advocacy groups linked Amazon’s cloud technology to the agency’s work.

Hundreds of Amazon employees have reportedly signed a letter calling on the company to stop selling Rekognition to police, and an Amazon employee published an anonymous op-ed last week calling for the same.

The internal revolt resembles an uprising inside Google that pushed the company away from a controversial contract with the Defense Department.

Details: The June meeting was first reported by the Project on Government Oversight in The Daily Beast, and Amazon confirmed in a statement that the meeting took place.

The two organizations met at a Bay Area office of McKinsey & Company, which at the time had a contract with ICE, POGO reported.

"Arming ICE with real-time facial recognition surveillance technology could supercharge the agency’s enforcement power, and make undocumented immigrants afraid to seek out vital services in places where cameras could be located," wrote POGO's Andrea Peterson and Jake Laperruque.

What they’re saying:

In a statement to Axios, an Amazon spokesperson confirmed that Amazon was one of several tech companies that participated in a McKinsey-organized "boot camp."

An ICE spokesperson declined to say how many times the agency discussed Rekognition with Amazon, but said that there is no contract between the two for facial recognition software.

POGO said that one concern is that facial recognition systems have been susceptible to gender and racial biases, often identifying women and people of color less accurately than men and white people.

This summer, the American Civil Liberties Union tested Amazon Rekognition and found that it misidentified 28 members of Congress, including six members of the Congressional Black Caucus, confusing them with people in a database of mugshots.

The company hit back in a blog post, writing that the ACLU had not used its software as intended.

Other law-enforcement agencies have already tried out Rekognition. An Oregon sheriff's department began using the software in 2017, and the Orlando Police Department in Florida extended a test of it in July.