One thing that makes ISIS so hard to fight is that the terrorist network is diffuse and scattered, with small cells of operatives all over the world. Not only does this make it hard for law enforcement to predict where the group might strike next; it makes it incredibly complicated to track activity on the network—activity like banking transactions. Small sums of money flow from foreign fighter to foreign fighter, yet banks struggle to identify it within their systems.

Banks have long used anti-money laundering systems to flag suspicious activity, and in the aftermath of September 11th, they have turned to those same legacy tools to catch terror-related transactions, too. But these legacy tools are not up to the job. They rely upon hard-coded “if-then” rules about predictably suspicious behavior. If the software spots a seven-figure transfer of funds from Miami to Bogota, for example, it knows to flag it. But as terrorist groups like ISIS recruit people internationally for smaller, targeted attacks, those tools become far less effective. There are just too many rules and possibilities to consider.
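The brittleness of those "if-then" systems is easy to see in miniature. Here is a minimal sketch of a rules-based filter (the rules are hypothetical, invented for illustration, not any vendor's actual logic): each check is hard-coded, so a transaction that matches no rule passes silently, no matter how it fits a larger pattern.

```python
# Illustrative sketch of a legacy rules-based AML filter.
# The rules below are hypothetical examples, not real bank logic.

def flag_transaction(tx):
    """Return the names of any hard-coded rules a single transaction trips."""
    alerts = []
    # Rule 1: seven-figure international wire (the article's example).
    if (tx["type"] == "wire" and tx["amount"] >= 1_000_000
            and tx["origin_country"] != tx["dest_country"]):
        alerts.append("large-international-wire")
    # Rule 2: deposit just under a reporting threshold (possible structuring).
    if tx["type"] == "deposit" and 9_000 <= tx["amount"] < 10_000:
        alerts.append("possible-structuring")
    return alerts

# A seven-figure Miami-to-Bogota wire trips a rule...
big_wire = {"type": "wire", "amount": 2_500_000,
            "origin_country": "US", "dest_country": "CO"}
# ...but an $80 ATM withdrawal in Brussels matches nothing.
small_atm = {"type": "withdrawal", "amount": 80,
             "origin_country": "BE", "dest_country": "BE"}

print(flag_transaction(big_wire))   # ['large-international-wire']
print(flag_transaction(small_atm))  # []
```

Every new evasion tactic demands another hand-written rule, which is exactly why the approach collapses as the number of possibilities grows.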

“It doesn’t take much to survive in a hostel in Belgium while waiting to be moved to another location,” says Dan Stitt, who's spent two decades in the financial crimes industry, with stints at the Drug Enforcement Administration and the Export-Import Bank of the United States. The pattern of small transactions a terrorist in hiding makes might not raise red flags for the usual anti-money-laundering systems.

Unless those systems use artificial intelligence.

Banks are increasingly turning to machine learning to mine vast quantities of bank data and find anomalies in accounts and transactions that might otherwise have gone unnoticed. “It’s a surgical approach to finding a needle in a haystack,” says Stitt, who now serves as director of financial crime analysis for the Wayne, Pennsylvania-based firm QuantaVerse, which developed the AI technology some of the world’s biggest banks use to identify money laundering, terrorist funding, and other financial crimes. The technology has already helped identify a Panamanian man the DEA called “one of the world’s most significant drug money launderers.”

The use of machine learning in this industry is still in its earliest days, and even QuantaVerse is unsure how many of its leads have actually turned out to be verifiable threats. But financial regulatory experts have high hopes for the potential of such tools. “Machines are able to take in multiple additional data points and analyze those data points in a way that may not seem obvious to human beings,” says Kevin Petrasic, a partner at the law firm White & Case, who specializes in financial regulation.

Banks Must Help Find Criminals

Ever since the Bank Secrecy Act of 1970, banks have been required to assist government agencies in detecting money laundering. Software has helped automate that process somewhat. Yet, the process is beset by false positives, in which the system flags behavior that is not actually criminal. A recent Dow Jones survey of more than 800 anti-money laundering professionals found that nearly half of them said false positive alerts hurt their confidence in the accuracy of the screening process.

Still, to comply with government regulations, banks invest billions of dollars in these systems every year. “That’s billions invested—a lot of humans investigating the flags a legacy system will generate, and a large majority of those turn out not to be financial crimes,” says David McLaughlin, who founded QuantaVerse in 2014. “Meanwhile, the real financial crimes are going unnoticed.”

The challenge, particularly for banks looking to stop the flow of money to foreign fighters, is that there are infinite possible permutations of transactions to hand code into a rules-based system. A person looking to join ISIS might take $80 out of an ATM in Brussels, receive a wire transfer in Algeria, and use a credit card in Lebanon. He might take out a payday loan or transfer money to family. On their own, these incremental activities might not trigger suspicion, but taken together, they create a pattern that a machine might identify as fishy.
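The aggregation idea can be sketched in a few lines. This is a hypothetical heuristic invented to illustrate the principle, not QuantaVerse's method: score an account's whole history at once, so that small sums spread across many countries in a short window stand out even though no single transaction would.

```python
# Hypothetical sketch of scoring a pattern rather than single transactions.
# The scoring formula is invented for illustration only.

def pattern_score(transactions):
    """Score one account's history: geographic spread times the density
    of small transactions per day. Any one transaction scores near zero;
    the combination does not."""
    countries = {t["country"] for t in transactions}
    small = [t for t in transactions if t["amount"] < 500]
    days = max(t["day"] for t in transactions) - min(t["day"] for t in transactions) + 1
    return len(countries) * len(small) / days

# The article's example: three innocuous-looking events, taken together.
history = [
    {"country": "BE", "amount": 80,  "day": 1},   # $80 from an ATM in Brussels
    {"country": "DZ", "amount": 300, "day": 4},   # wire received in Algeria
    {"country": "LB", "amount": 120, "day": 6},   # card purchase in Lebanon
]
print(pattern_score(history))  # 3 countries * 3 small txns / 6 days = 1.5
```

In practice a machine learning system would learn such features from data rather than have them hand-specified, but the shift in unit of analysis, from single transaction to whole-account pattern, is the same.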

“Any investigator is going to go for the shiny object in front of them,” Stitt says. “If I have an alert for $1 million for a wire transfer to Mexico or a series of transactions for $80 in Belgium, what am I going to look at? That’s where the system has failed on an investigative level.”

Pattern Recognition

Unlike those traditional systems, QuantaVerse's software learns these predictors on its own. The company’s team of data scientists trained its algorithms on several years’ worth of data from one of the five biggest banks in the world, whose name the company is contractually prohibited from sharing publicly. With Stitt’s input, the team trained the system on what good and bad behavior look like, so that it could go on identifying that behavior without human oversight.