From Mounties and small-town cops to a Rexall employee hoping to bust shoplifters — a facial recognition tool at the centre of international controversy has been used far more widely in Canada than had been previously known.

On Thursday, a dozen police forces and private businesses confirmed their officers or employees had used the technology, including nine forces from across the country that had previously told the Star they did not use Clearview AI. The U.S. company uses artificial intelligence to match people’s images against a database of billions of photos scraped from the internet, including social media sites, and has been called “reckless,” “invasive,” and “dystopian” by critics.

Privacy regulators opened an investigation into the legality of the company’s practices last week, and on Thursday evening the federal privacy commissioner announced it was launching an investigation into the RCMP’s use of the technology.

Among the newest known users are the Via Rail Police Service, cops in Halifax, Cornwall and London, and investigators with the Insurance Bureau of Canada.

All but the Mounties used free trial versions of the tool. Many police forces said that members had signed up without the knowledge or authorization of leadership, who were unaware officers were using Clearview AI until the Star’s inquiries prompted further digging.

The revelations were prompted by data obtained by BuzzFeed News. According to the data, which was shared exclusively with the Toronto Star, at least 34 police forces across Canada have obtained log-ins and searched Clearview AI’s database in recent months.

The widespread, often apparently unsanctioned use of Clearview AI shows how far the adoption of powerful artificial intelligence technology like facial recognition has outpaced regulation and oversight.

It also shows how rapidly the controversial technology has spread globally. BuzzFeed reported Thursday that 2,900 institutions in 27 different countries have collectively made about 500,000 searches through the company’s service.

According to the data, Canada is Clearview’s largest market outside of the U.S.

Clearview AI did not respond to repeated requests from the Star for comment on Thursday. A lawyer for the New York-based startup told other media a “flaw” gave someone unauthorized access to the company’s client list.

“Security is Clearview’s top priority,” Tor Ekeland told other media. “Unfortunately, data breaches are part of life in the 21st century. Our servers were never accessed. We patched the flaw and continue to work to strengthen our security.”

In a statement to BuzzFeed News, Ekeland said: “There are numerous inaccuracies in this illegally obtained information. As there is an ongoing Federal investigation, we have no further comment.”

It’s not clear what investigation Ekeland was referring to. Last Friday, Canadian regulators launched an investigation into whether the company violates privacy laws in this country.

“I think it is very questionable whether it would conform with Canadian law,” Michael McEvoy, British Columbia’s privacy commissioner, told the Star last week.

After previously refusing to confirm whether it used Clearview AI, the RCMP released a statement Thursday saying it had done so for months.

The RCMP’s statement said it was providing confirmation “in the interest of transparency.” The disclosure came a day after the Star approached the national police force for comment about data showing its extensive, paid use of the tool. According to the data obtained by BuzzFeed, officers have run more than 450 searches since last October.

The Mounties’ statement comes after the force made a commitment to the federal privacy commissioner in late January, saying it would submit an assessment of potential privacy impacts before deploying facial recognition technology of any kind, according to a spokesperson for the watchdog’s office.

“We were surprised by the RCMP’s news release today,” the spokesperson said Thursday. “In light of the RCMP’s acknowledgment of their use of Clearview’s facial recognition technology, we are launching an investigation. Given we are now investigating, no further details are available at this time.”

The RCMP’s statement Thursday said its child exploitation investigations unit had two licences for the app, and investigators have used it in 15 cases, resulting in the successful identification and rescue of two children. The national police force also said that “a few units” tested the app in criminal investigations.

Among the private businesses in the data obtained by BuzzFeed is Rexall, a pharmacy chain with 415 stores in Canada. A spokesperson for Rexall confirmed Thursday that the company’s “loss prevention unit” received a trial of the software from a contact at the Toronto Police Service.

“During the trial period, an employee used the software to search approximately seven individuals who were suspected of shoplifting from Rexall stores,” spokesperson Andrew Forgione said.

“No action was taken from the searches, and Rexall has discontinued use with no plans to reactivate,” Forgione said, adding that the trial ended 10 days ago.

According to the data obtained by BuzzFeed, Toronto police was the most prolific user of the app in Canada, with more than 3,400 searches in recent months.


Toronto police has said its officers had been “informally” testing Clearview AI since October without the knowledge of police Chief Mark Saunders. The service says it has stopped using the app.

“Toronto Police Service has been engaged in a comprehensive review of its use of Clearview AI,” spokesperson Meaghan Gray said.

“Without knowing or being able to verify the information the Star possesses, we can’t comment on it with any certainty. However, we can say that questions such as access and timing will be included in, and released in the findings of our review,” she said.

The nearly three dozen Canadian police forces in the data obtained by BuzzFeed include services from British Columbia to Nova Scotia; Ontario has the most of any province, with 21.

Nine police services that earlier in the week said they did not use Clearview AI subsequently confirmed they had done so after the Star asked why their members were listed in the data. Those services each said officers had signed up for the technology without the force's knowledge.

“Based on the information you provided us, we have identified Regina Police Service employees who have previously provided their work email addresses to Clearview AI,” a spokesperson for the Regina Police Service said on Thursday. “Consistent with inquiring about all types of law enforcement products, services and tools, these employees have provided their email information only to learn more about Clearview AI.”

“It has come to our attention that well intentioned officers may have accessed the technology that has been marketed and shared as a possible investigative tool amongst the law enforcement community,” a spokesperson for the London Police Service said in an email Thursday.

“Regardless of intention, it is recognized that this is [a] very complex issue that engages broad public policy concerns with respect to privacy interests,” she said, adding that London’s police chief has since ordered that any use by officers now cease.

“The senior leadership of the Cornwall Police Service had no prior knowledge until this morning that the tool had been used by members of the service,” a spokesperson for the eastern Ontario town’s police force said.

“Since becoming aware this morning, the Cornwall Police Service has immediately prohibited any usage of Clearview AI or any other facial recognition technology by our members,” said the spokesperson, adding that the force does not plan to use facial recognition “due to privacy concerns.”

Many police forces said their officers had either used the app on a trial basis and have since stopped or that senior leadership ordered that all testing cease. Ontario’s privacy commissioner demanded this month that all Ontario police forces stop using Clearview AI “immediately” and contact his office if they had.

Via Rail said that its police “took advantage of a free trial service using some team members’ photos (with permission). They learned more about the technologies and methodologies involved, and assessed the service. They concluded over the course of our review to not pursue ongoing access to this technology.”

A spokesperson for the Insurance Bureau of Canada said that “our investigators were offered and took a demonstration of the product by Clearview AI. IBC chose not to move forward with the product.”

A Halifax police spokesperson said: “We have looked into this further and found that a very limited number of specialized investigators have used this application very recently.” After learning about the data breach Thursday, “our use of the application has been suspended.”