Smith said Microsoft rejected the contract due to human rights concerns: it believes the technology's use for that particular purpose could lead to a disproportionately large number of women and minorities being held for questioning. Face recognition systems still struggle with gender and race bias, because they're mostly trained on photos of white male subjects. As a result, they're more likely to misidentify women and people of color. That said, the tech giant has been working on improving its technology's accuracy across skin tones and genders.

The company president made the revelation at a Stanford University human-centered artificial intelligence conference. While the company did sell its technology to an American prison after determining that its use in such an environment would be limited, Smith said Microsoft turned down a contract offered by an unnamed country. The nation, which democracy watchdog Freedom House didn't deem "free," wanted Microsoft to install face recognition on the cameras keeping a close eye on its capital city.

Smith's explanation for the company's decision echoes the reasoning behind its call to regulate the technology. He told Congress last year that as the technology of the moment, facial recognition has "broad societal ramifications and potential for abuse."