When Amazon's controversial facial recognition system mistakenly matched 28 members of Congress with criminal mugshots, it caused quite a stir -- particularly among some of those members of Congress, who demanded immediate answers.

Now, Amazon has published an official blog post disputing the results of that test -- one which includes a rather interesting message at the very end.

A message that suggests that Amazon, like Microsoft, may believe that the US government should weigh in on facial recognition -- that it's not Amazon's job to police. But also a message that may require me to explain a few things.

On Thursday, Amazon had already told CNET that it didn't believe the American Civil Liberties Union's (ACLU) test of its Rekognition facial recognition service was completely fair -- in no small part because the ACLU used the tool at its default 80-percent confidence-level setting, as opposed to the 95-percent-plus setting Amazon recommends to law enforcement agencies.

(In plain English: Even if Amazon's system was only 80 percent sure that a Congressman looked like a criminal, it'd still say so when tested this way, unless you changed the setting.)
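To see why the setting matters, here's a minimal sketch (not Amazon's actual API -- the function and scores are invented for illustration) of how a confidence threshold changes which face matches a system reports:

```python
# Hypothetical illustration of a confidence threshold in face matching.
# The names and confidence scores below are made up.

def report_matches(matches, threshold):
    """Return only the candidate matches whose confidence meets the threshold."""
    return [name for name, confidence in matches if confidence >= threshold]

# Imaginary comparison results: (candidate, confidence the system assigns)
matches = [("Person A", 99.2), ("Person B", 86.5), ("Person C", 81.0)]

print(report_matches(matches, threshold=80.0))  # default-style setting: all three reported
print(report_matches(matches, threshold=99.0))  # stricter setting: only Person A reported
```

At the 80 percent threshold, all three candidates come back as "matches"; at 99 percent, only the near-certain one does. That gap is the entire dispute between Amazon and the ACLU.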

The ACLU countered that police might easily do the same thing it did: 80 percent is the default setting, there are no instructions telling a law enforcement agency to use a different one -- and besides, the ACLU just plain believes facial recognition software is too easy for governments to abuse.

(You can read a more detailed explanation here.)

But in its Friday memo, Amazon disputes the claim that it doesn't provide instructions -- the company says its documentation specifies a 99-percent confidence level. (CNET does see a mention of 99 percent in Amazon's documentation, but it's only a passing idea -- not exactly a clear instruction.)

Screenshot by Sean Hollister/CNET

The memo also points out that Amazon's facial recognition is actually rather good, judged by industry standards, if it only matched 28 out of 507 members of Congress. False positives are generally accepted in facial recognition software, because humans pick up where machines leave off.

OK, now here's the weird part. On Friday afternoon, Amazon pulled that memo off the internet -- and replaced it with one that added a new sentence at the bottom:

"It is a very reasonable idea, however, for the government to weigh in and specify what temperature (or confidence levels) it wants law enforcement agencies to meet to assist in their public safety work," the memo now reads.

Screenshot by Sean Hollister/CNET

That can be read at least a few different ways:

1) Amazon believes the US government should tell Amazon directly (not through regulation) how it believes its software should work.

2) Amazon believes the government should tell its own law enforcement agencies which confidence level to use when using facial recognition. (Basically, passing the buck.)

3) Like Microsoft, Amazon believes the government should regulate facial recognition.

We're not quite sure which one's accurate, but a source close to the matter says No. 2 is the right one -- Amazon thinks the government should make sure its own agencies are using the tool carefully.

Whichever way you read it, the ball now seems to be in Congress's court.

In a statement, the ACLU wrote: "In its five stages of grief over its dangerous face surveillance product, Amazon is clearly stuck at denial. In a matter of 48 hours, Amazon has gone from its own system default of an 80 percent match rate to saying yesterday it should be 95 percent, and then saying today it should be 99 percent. At no time has Amazon taken any responsibility for the very grave impact that their face surveillance product has on real people."

Amazon provided CNET with the following reply on Monday:

"The ACLU continues to distort the facts to suit [its] purposes. The facts are that Amazon Rekognition has helped organizations find missing children, fight human trafficking, reunite missing children with their families, reduce fraud for mobile banking, improve the impact of clinical trials, and help develop new treatments for autism -- and not once, been involved in any actual public safety mishap. When Rekognition is used as recommended for public safety (with 99 percent confidence levels), the same reports that the ACLU claimed contained 5 percent error rates yielded 0 percent error rates. This is inconvenient for the ACLU's rhetoric, but these are also the facts."

Originally published July 27.

Update, 3:38 p.m. PT: Added ACLU statement.

Update, July 30 at 11:30 a.m. PT: Added Amazon's reply to the ACLU.