Microsoft reportedly funded an Israeli startup that makes facial recognition used to secretly monitor Palestinians living in the West Bank.

The investment, first reported by Forbes, seems to conflict with Microsoft's public pledge not to use facial recognition technology when it impinges on democratic freedoms.

Investigations by NBC News and Haaretz reveal the extent to which the startup's facial recognition technology is used to surveil Palestinians, with multiple former employees raising ethical concerns.

The firm also reportedly tested out experimental facial recognition functions on Palestinians, marketing the results to other clients across the globe.

AnyVision denied that NBC News' reporting is accurate in a statement to Business Insider.

Visit Business Insider's homepage for more stories.

As public scrutiny builds over facial-recognition technology and its impact on privacy and civil liberties, Microsoft has positioned itself as a moral leader, publishing six guiding principles on facial recognition and pledging never to use the tech in a way that threatens "people's democratic freedoms."

But Microsoft's investment in a controversial Israeli facial recognition startup is raising questions about its commitment to those principles.

Microsoft announced in June that its venture capital arm would join a $78 million Series A funding round for AnyVision, an Israeli company that makes facial recognition technology. The investment drew immediate backlash from human rights advocates including the ACLU, Forbes reported.

Since then, investigations by NBC News and Haaretz have shined more light on AnyVision's operations, describing how the firm's software is being used by the Israeli military to secretly conduct mass surveillance of West Bank Palestinians.

AnyVision has publicly said that its technology is used by the Israeli military at border crossing checkpoints, where it logs the faces of Palestinians crossing into Israel. But according to the news reports, the AnyVision technology is also secretly used in tandem with a network of cameras throughout the West Bank that the Israeli government uses to monitor the movement of Palestinian residents as part of its efforts to prevent potential terror attacks.

West Bank Palestinians are not allowed to vote in Israeli elections but face surveillance and law enforcement by the Israeli government.

The company uses the results of the facial recognition software being secretly tested in the West Bank to market the technology to other clients across the globe, according to the NBC report.

Former employees of AnyVision told NBC News that the company did not appear to act in accordance with Microsoft's ethical standards. "Ultimately I saw no evidence that ethical considerations drove any business decisions," a former employee told the network.

In a statement to Business Insider, AnyVision disputed NBC's reporting, claiming that the company does not use facial recognition for surveillance in the West Bank or Gaza.

"At AnyVision, we believe it is our duty to ensure our technology and products are used responsibly to benefit the safety of society. We do not, and will not, tolerate unlawful or unethical usage of our technology," AnyVision CCO Max Constant said in a statement.

A Microsoft spokesperson told Business Insider that it will conduct an audit to determine whether AnyVision has violated its principles.

"Microsoft takes these mass surveillance allegations seriously because they would violate our facial recognition principles. AnyVision has confirmed their compliance with our principles, and we are engaging a highly experienced, outside law firm to conduct an audit. We also asked AnyVision to implement a robust board level review and compliance process. AnyVision has agreed to both. If the audit discovers any violation of our principles, we will end our relationship," the Microsoft spokesperson wrote in an emailed statement.

Microsoft's principles on facial recognition include a pledge that the company "will advocate for safeguards for people's democratic freedoms in law enforcement surveillance scenarios and will not deploy facial recognition technology in scenarios that we believe will put these freedoms at risk."