But with questions swirling about the technology’s accuracy — as well as its potential to be used for mass surveillance — the Port of Seattle, which runs Sea-Tac Airport, halted Delta’s plans. The port commission voted last month to enact a moratorium on new uses of facial recognition technology at port facilities — at least for the next few months, until the port can decide on new, comprehensive regulations.

Controversy over the use of facial recognition technology extends far beyond the airport. Increasingly, companies have been using facial recognition technology to help them choose which employees to hire, a process that some experts say can disadvantage nervous interviewees or nonnative speakers.

Meanwhile, civil rights groups are concerned about law enforcement officials using the technology to identify and track people who take part in political demonstrations or to target immigrants for deportation.

This month, the conversation about facial recognition technology will move to the Washington state Capitol. In Olympia, state lawmakers are gearing up for a lengthy debate about the appropriate use of facial recognition technology by government agencies, as well as by private businesses.

It’s bound to be one of the thornier issues on the agenda when the Legislature convenes Jan. 13. Similar discussions about regulating facial recognition technology fell apart at the Capitol last year, as legislators and others disagreed about how to enact adequate consumer protections.

Shankar Narayan, a privacy and technology consultant who co-chairs the city of Seattle’s Surveillance Advisory Working Group, said policymakers need to think about whether “this is a technology that can co-exist with our democracy.”

“Face surveillance is really a technology that supercharges the government’s ability to track people — not only by their location, but by their identity, in ways that are likely to chill constitutionally protected activity,” Narayan said.

During last month’s Seattle port commission meeting, Stan Shikuma, an officer with the Seattle chapter of the Japanese American Citizens League, recalled when the U.S. government used a combination of census data and surveillance information to round up Japanese Americans and incarcerate them during World War II.

The use of census data for that purpose was a departure from previous practice, fueled by wartime hysteria, Shikuma said. That's something he said officials would do well to remember today as they consider how to regulate new types of surveillance technology.

“Good intentions are not good enough,” Shikuma said. “It really needs to be ironclad, because this can and could be violated in very serious ways.”

For critics, reports that facial-recognition technology fails to accurately identify people of color are of paramount concern.

Some think that, at least until the technology improves, banning its use entirely is the best option.

However, key legislators looking to tackle the issue argue that a ban or moratorium won’t help root out bias in the technology, nor help monitor its use.

The facial-recognition discussion is part of a larger debate about data privacy, and what rights consumers should have to access and control their personal data online.

State Sen. Reuven Carlyle, D-Seattle, said he is reworking the data-privacy bill he proposed last year so that it would give consumers greater control over the data companies collect about them. As part of that effort, the new legislation will also seek to establish new rules for using facial recognition technology, he said.

Carlyle said his overall goal is to establish data-privacy protections like the ones that the European Union recently adopted. That means allowing consumers to know what data companies are collecting about them, and giving the consumers new options to correct that information or ask for it to be deleted, he said.

Those rights would extend to biometric data or facial scans collected through facial-recognition technology, Carlyle said.

“Ultimately people are finding themselves in databases and they don’t know it,” Carlyle said. “If they were captured and end up in a database through facial recognition technology, they should have the right to get themselves out.”