Business representatives are poised to release draft privacy guidelines for the use of facial recognition technology, a plan completed after privacy advocates walked out of the negotiations in protest.

The draft rules, which are expected to be presented on March 29 and will likely go through further revisions, would govern when companies need to tell customers that their faces are being analyzed by the technology.

“There has been a bunch of really great work done by groups to help their members navigate the universe of facial recognition technology,” said Carl Szabo, policy counsel for industry group NetChoice, who is part of the smaller working group developing the draft.

“What we’re trying to do is take their good work and the work of everyone who has contributed so far and kind of expand it a little bit further to address public-facing uses of facial recognition technology.”

Facial recognition software could transform the way marketers target advertisements by letting them tailor ads to individual customers. Marketers could also combine facial recognition with data gathered from other sources, like social media.

Stores could use the technology to spot shoppers who are likely to spend more money or are otherwise important to the company.

Last year, Wal-Mart tested a system in some of its stores that scanned the face of everyone who entered to alert security about possible shoplifters.

The technology also allows companies like Facebook to recognize users in photos.

But the use of the software has alarmed privacy advocates, who fear that consumers won't have control over when information is gathered about their appearance.

Having voluntary guidelines to follow could help the companies defend against harsher regulatory efforts.

The current draft of the rules, according to Szabo, primarily focuses on three areas: disclosure guidelines for companies using the technology, limits on how widely data from the technology can be shared and a requirement that businesses take steps to secure the data.

The guidelines would encourage businesses to notify people when their image is being analyzed by a facial recognition system and to put those notices in places where they will be seen.

“When facial recognition technology is being used, you need to tell people that it’s happening,” Szabo said.

“The idea is, if I’m Macy’s, I can’t put the notice on the roof of the bathroom stall,” he said.

But he also said that, to avoid being overly prescriptive, the guidelines are unlikely to dictate exactly where businesses must post the notices.

Businesses would also be advised to seek users' consent before sharing information from their facial recognition databases, Szabo said.

The guidelines also suggest that businesses using facial recognition software take “reasonable measures” to keep their customer data safe, without specifying how, Szabo said.

The guidelines would still need to be voluntarily adopted by businesses. Companies such as Facebook and Microsoft have joined industry groups at the talks.

Businesses are forging ahead with the guidelines even though they will be the product of a fraught process that privacy advocates abandoned last summer as tensions boiled over.

The National Telecommunications and Information Administration (NTIA), part of the Department of Commerce, first convened the talks in 2014. The plan was to bring industry representatives and privacy activists together to reach a consensus, a process that the agency was already conducting for mobile applications.

The talks initially included some of the most prominent privacy groups in the country, including the Electronic Frontier Foundation and the American Civil Liberties Union. Over time, the advocates became more skeptical of industry’s motivations.

“We started getting the sense that a lot of the representatives on the other side weren’t there to cut a deal and were instead there to stop a deal from happening,” said Alvaro Bedoya, the executive director of the Center on Privacy and Technology at Georgetown University Law Center.

So, after an exchange at a June meeting over when users should be required to consent to the use of facial recognition software, the privacy advocates told organizers they would not return. Instead, they issued a joint statement saying they did "not believe that the NTIA process is likely to yield a set of privacy rules that offers adequate protections for the use of facial recognition technology.”

Bedoya said the focus on disclosure and data sharing would obscure the larger question of whether companies need someone’s permission to run their image through facial recognition programs in the first place.

“This is all a way of hiding the actual ball, which is consumer control,” he said. “And at the end of the day, that’s what the vast majority of Americans want: the ability to control their information.”

Szabo, defending the guidelines, said that, “by giving consumers and individuals notice and transparency, people can vote with their feet.”

“And if they see that a store is using this technology and they don’t like it, then go somewhere else,” he said.

In a statement, an NTIA spokeswoman defended the process, which the agency is also using to develop a privacy framework for the use of drones.

“Multistakeholder processes are stronger when all parties participate,” she said. “However, it is important to ensure that no stakeholder or group of stakeholders can shut down a process when others want to make progress.”

The privacy groups say they are unlikely to return to the talks.

“We’re not planning on reengaging unless there’s reason to believe that there’s a greater willingness to endorse essential privacy principles,” said the ACLU’s Neema Singh Guliani in a statement.

Bedoya said that “the main question is will people incorrectly perceive this to be a document that’s endorsed by the privacy community, and I think the answer’s no.”

“And if the document comes out and, as we expect it will, does little to protect privacy, we’ll make that eminently clear.”