Can Big Data Really Help Screen Immigrants?


Can an algorithm tell if you're a terrorist? Can it predict if you'll be a productive member of society?

U.S. immigration officials are trying to answer those questions. They hope to build an automated computer system to help determine who gets to visit or immigrate to the United States.

Immigration and Customs Enforcement, or ICE, wants to use techniques from the world of big data to screen visa applicants. The project would scour all publicly available data, including social media.

But the idea has some critics — including many tech experts — worried.

"This is creating this kind of open-ended algorithm which has an enormous potential to be discriminatory," said Faiza Patel, co-director of the Liberty and National Security Program at the Brennan Center for Justice.

President Trump wants to develop an immigration system that admits people who, as he puts it, contribute to "national interests," and keeps out people who are likely to commit a crime or an act of terrorism. ICE, in turn, asked software companies if they could build a computer system that makes those "determinations via automation."

"Any self-respecting data scientist should run away screaming from this," says Cathy O'Neil. She's the author of Weapons of Math Destruction, about how algorithms devised by government and the private sector can reinforce bias. She's among more than 50 of the nation's top computer science experts who signed a letter urging the Trump administration to drop the plan.


Algorithms need data from the past in order to make accurate predictions about the future, O'Neil said. The more data, the better the predictions. But compared to consumer transactions, terrorist attacks are extremely rare.

So O'Neil worries that this algorithm will make mistakes, without any accountability. That amounts to "a pseudo-scientific excuse to prevent a lot of perfectly good people from coming into our country as immigrants," O'Neil said. "And saying this is a scientific method. Since you're not an expert in science and math, you can't ask any questions. And you have to trust this."
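The statistical problem O'Neil describes is the base-rate fallacy: when the thing you are screening for is vanishingly rare, even a highly accurate classifier flags far more innocent people than genuine threats. A minimal sketch, using purely hypothetical numbers (the rates below are illustrative assumptions, not real screening statistics):

```python
def screening_outcomes(population, base_rate, sensitivity, false_positive_rate):
    """Count true and false positives when a classifier screens a population."""
    actual_positives = population * base_rate
    actual_negatives = population - actual_positives
    true_positives = actual_positives * sensitivity
    false_positives = actual_negatives * false_positive_rate
    return true_positives, false_positives

# Suppose 1 in 1,000,000 applicants poses a genuine threat, and the
# algorithm is very good: 99% sensitivity, only a 1% false alarm rate.
tp, fp = screening_outcomes(
    population=10_000_000, base_rate=1e-6,
    sensitivity=0.99, false_positive_rate=0.01,
)
print(f"True positives:  {tp:,.0f}")   # roughly 10 actual threats flagged
print(f"False positives: {fp:,.0f}")   # roughly 100,000 innocent people flagged
```

Under these assumed numbers, more than 99.9% of the people the system flags would be false alarms, which is the core of the accountability concern.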

Critics also say an algorithm couldn't possibly measure how a person might contribute to society. They think it would inevitably rely on something easier to measure, like income, that doesn't tell the applicant's full story.

Immigration officials say these concerns are overblown. They insist they are months, if not years, from actually building an algorithm like this. And they're backing away from the original idea of having a computer make the final determination on who gets a visa.

"That's just not the case," says Clark Settles, assistant director in charge of National Security Investigations at ICE, who oversees the project. "There's not going to be some A.I., artificial intelligence, making a decision on whether people can come to the country or whether they can stay here."


Settles says the government already collects lots of information about people applying for visas. What the agency needs, he says, is an automated tool that will help human analysts sort through all that data so they don't miss anything, such as signs of a planned terrorist attack.

"If there was information about that attack just sitting there in public forums, I would feel horrible if we hadn't done everything within the laws and rules to look at it," Settles said.

Companies are interested in building that tool. A number of software providers attended a conference hosted by ICE over the summer. Giant Oak was one of them — it's a data analytics firm in Virginia that already crunches data for the agency.

"Every day that technology gets better," said Giant Oak CEO Gary Shiffman. "We should be using the technology to help humans make better decisions."

ICE officials held another meeting with tech companies last month. But as the controversy surrounding the project grew, they changed its name from "extreme vetting initiative" to "visa life-cycle vetting."