Lawmakers and civil liberties advocates might be pressing law enforcement agencies to scale back their use of facial recognition software, but international travelers should only expect to see more of the tech in the years ahead.

It’s been almost two years since Customs and Border Protection began deploying facial recognition systems at U.S. airports, and despite the recent backlash against the software, the agency’s efforts show no signs of slowing down. But if you ask Deputy Executive Assistant Commissioner John Wagner, the agency’s use of facial recognition falls far short of the dystopian panopticon feared by many of the tech’s critics.

“This is not a surveillance program,” Wagner, who heads CBP’s biometric entry and exit initiative, said in a conversation with Nextgov. “We are not just hanging a camera in an airport and randomly identifying people ... as they're walking through.”

Under Wagner’s program, CBP agents use facial recognition to compare real-time images of international travelers to the photos on their passports or visas. For arrivals, people have their faces scanned while officers review their travel documents, and for departures, the tech captures images right at the boarding gate.

Today, the tech is deployed in some capacity at 16 airports across the U.S., and by 2021, CBP expects to scale up the program to cover more than 97 percent of the people flying out of the country. Ultimately, officials anticipate biometrics could render physical boarding passes obsolete.

The system is intended to help agents keep better tabs on who is entering and leaving the country. Instead of relying on traditional flight logs and manual document inspections to monitor international traffic, agents can now use the tech to verify passengers are who they claim to be with more than 98 percent accuracy in a matter of seconds, Wagner said. The agency is currently testing facial recognition at three checkpoints in Arizona to identify people crossing the U.S.-Mexico border.

And officials are already seeing the program bear fruit. Since August, agents have intercepted six individuals trying to illegally enter the U.S. through airports and another 125 “imposters” along the southern border, a CBP spokesperson told Nextgov. Wagner said the agency also identified more than 14,000 people who left the country after overstaying their visas, a violation that could prevent them from returning to the U.S. for up to a decade. The program has been so successful that the Partnership for Public Service nominated Wagner for one of its annual Service to America awards.

Outside the Homeland Security Department, however, the program’s reception has been mixed. It came under fire last week on Capitol Hill as lawmakers and legal experts bashed law enforcement agencies for their often dubious use of facial recognition. During the hearing, Neema Singh Guliani, a senior legislative counsel at the ACLU, said she had “lots of questions and concerns” about CBP expanding the use of biometrics beyond airport terminals.

But compared to the sweeping and often covert applications of facial recognition by the FBI and other agencies, Wagner sees CBP’s operations as pretty tame.

People are always aware their picture is being taken, and U.S. citizens have the ability to opt out of face scans, at least for the departure process, he said. The tech is also only used in sections of the airport where people would already need to show identification, he said, and the image itself is only compared to passport and visa photos that are already in the government’s possession.

“The biometric really becomes as simple as validating the information we've already received,” Wagner said. “There's no new information we're requiring of a person other than taking their photograph and comparing it to a photograph they've already given us.”

Instead of running images against a single trove of government IDs, the agency compares them to custom databases created for each individual flight, which significantly reduces the risk of misidentification, Wagner said. Those new airport photos are also deleted from CBP’s systems in less than a day, he added.

While he admitted some uses of facial recognition might be cause for concern, Wagner said the fears about CBP’s biometrics initiative are largely rooted in misconceptions about the program. In the years ahead, he said, the agency will work to better explain the program to the public and “further [clarify] how this is all working.”

For privacy advocates, however, the problem isn’t how facial recognition is being used today, but rather how it might be used years down the line.

Jeramie Scott, director of the Domestic Surveillance Project at the Electronic Privacy Information Center, said he worries CBP and other federal agencies might start using the systems deployed in airports for other purposes, like identifying people who committed petty crimes. Today, there are no federal laws regulating agencies’ use of facial recognition, and without those restrictions, government officials and vendors have incentives to expand the scope of the program, Scott told Nextgov.

“It’s just a powerful surveillance tool ... with no rules in place,” he said. “There is potential for abuse because there’s not really the rules in place to make sure [the program] remains narrow.”

During last week’s hearing, Guliani also said that without more regulations in place, she feared the program could expand into something far more intrusive than identifying travelers.

While face scans are optional for U.S. citizens today, Scott worries CBP could eventually make it de facto mandatory for people to use the system, either by making alternative ID checks more laborious or by explicitly requiring it for international travelers. Already, the Transportation Security Administration is planning to build off CBP’s work and stand up its own biometrics program for domestic travelers, and as the public gets more exposure to the tech, the agencies could continue expanding its reach in the name of safety, according to Scott.

“At the moment I think the program should be suspended until all of its privacy implications can be addressed and rules can be in place to ensure the program isn’t abused,” he said. “If the program goes forward, there needs to be a guarantee that people don’t have to participate and that they won’t be punished for not participating.”