Microsoft President Brad Smith told Business Insider that it would be "cruel" to stop government agencies from using facial recognition software.

It follows activists writing to Microsoft, Amazon, and Google last month demanding they stop selling facial recognition software to the public sector.

Smith said prohibiting sales could halt good work, such as diagnosing rare diseases.

He has, however, called for regulation of facial recognition tech to prevent things like bias and discrimination.

A top Microsoft executive has said that stopping government agencies from using facial recognition software would be "cruel in its humanitarian effect."

More than 85 human rights groups wrote to Microsoft, Amazon, and Google last month demanding they stop selling facial recognition software to the public sector, fearing it will lead to government surveillance.

Business Insider asked Brad Smith, Microsoft's president and chief legal officer, about the letter at the World Economic Forum in Davos.

He strongly rejected the idea that government agencies, including law enforcement, should step back from the technology.

"I do not understand an argument that companies should avoid all licensing to any government agency for any purpose whatsoever," he told Business Insider. "A sweeping ban on all government use clearly goes too far and risks being cruel in its humanitarian effect."


Smith referenced the fact that the National Human Genome Research Institute is using facial recognition to improve the diagnosis of DiGeorge syndrome, a rare genetic disease, in Africans, Asians, and Latin Americans. Healthcare providers have conventionally struggled to pinpoint the disease in diverse populations.

"These are disorders that result in heart failure or kidney problems. Why should we stop a government from helping identify patients who need medical care?" Smith asked.

He also pointed to reports last year that police in New Delhi, India, were using facial recognition software to try to track down 5,000 missing children. The success of that project was questioned this week, however, by the Delhi High Court, which said it was "unacceptable" that the software "has not borne any results," according to local reports.

Brad Smith, Microsoft's president and chief legal officer. Pedro Fiúza/NurPhoto via Getty Images

Smith has been among those calling for better regulation of facial recognition technology. He urged regulation of government use covering three areas: bias and discrimination, people's privacy, and democratic freedoms and human rights.

"There are certain uses of facial recognition that should cause concern and should cause everyone to proceed slowly and with caution. That's certainly what we're doing and we're very worried about situations where facial recognition technology could be used in a manner that would cause bias or discrimination," Smith explained.

"We're worried about certain scenarios by law enforcement or by governments in certain countries that you don't fully respect human rights. So we put in place principles and we put in place steps so that we don't license this technology in ways that we or the world would come to regret."

In a blog post last year, Smith laid out Microsoft's principles for how it would self-govern its facial recognition work. Smith said Microsoft will document the capabilities of the technology, and prohibit its use to engage in unlawful discrimination.

"Tech companies need to act proactively because we can't expect the whole world to respond to this call to action. So we need to put in place principles ourselves," he told BI.