Washington’s Ministry of Preemption

On April 7, an odd-looking jet landed at Kadena Air Base on Okinawa, Japan. Codenamed Constant Phoenix, it was a U.S. Air Force version of a Boeing 707, fitted with round pods on the fuselage designed to “sniff” the atmosphere for radioactivity. Eight days later, across the East China Sea, North Korea would be celebrating the “Day of the Sun,” marking the 105th birthday of its founder, Kim Il Sung. Because many in the Donald Trump administration were concerned that the festivities would include a very big surprise — the country’s sixth nuclear test — Constant Phoenix was on alert. But when the celebrations ended, the surprise was on the Koreans, whose missile launch failed.

The unexpected has always been the enemy of intelligence. That’s why a small group of Ph.D.s and research scientists is employed by a secretive organization in the Maryland suburbs of Washington, D.C., to take the surprises out of intelligence: the spy world’s premier research center, the Intelligence Advanced Research Projects Activity (IARPA), which reports directly to the Office of the Director of National Intelligence.

For decades, from the first World Trade Center bombing to 9/11 to the recent Syrian poison gas attack, U.S. intelligence agencies have consistently been caught off guard, despite hundreds of billions of dollars spent on spies, eavesdroppers, and satellites. IARPA’s answer is “anticipatory intelligence,” predicting the crime or event before it happens.

Like a scene from Minority Report, the 2002 film in which criminals are caught and punished by a “precrime” police force before they can commit their deeds, IARPA hopes to find terrorists, hackers, and even protesters before they act. The group is devising automated systems that can discover virtually everything about everyone and issue automatic “precrime” alerts.

That’s the idea behind the agency’s Open Source Indicators (OSI) program: Build powerful automated computers, armed with artificial intelligence, specialized algorithms, and machine learning, capable of cataloging the lives of everyone everywhere, 24/7. Tapping in real time into tens of thousands of different data streams — every Facebook post, tweet, and YouTube video; every tollbooth tag number; every GPS download, web search, and news feed; every street camera video; every restaurant reservation on OpenTable — would largely eliminate surprise from the intelligence equation. To IARPA, the bigger the data, the fewer and smaller the surprises.

If all this sounds familiar, it is. In 2002, the U.S. Defense Department created Total Information Awareness (TIA). Similar to IARPA’s OSI, TIA’s goal was to create a “virtual, centralized grand database” made up of unclassified, publicly available information. But following press reports and a public outcry, Congress killed it. However, the Pentagon secretly shifted some resources to the National Security Agency’s own research center, the Advanced Research and Development Activity (ARDA). Then, in 2007, ARDA quietly morphed into IARPA.

Even more troubling is IARPA’s secretive Mercury program, which focuses on mining the private communications collected by the NSA. Last year, for example, the NSA collected more than 151 million phone call records involving Americans, according to a U.S. intelligence community report released May 2. Worldwide, the number is likely in the billions.

Like OSI, Mercury is outsourced to private contractors, who develop automated systems to scan the ocean of NSA intercepts for signs of potential terrorism, hacking, social unrest, and war. According to IARPA, “The Mercury program seeks to develop methods for continuous, automated analysis of SIGINT in order to anticipate and/or detect political crises, disease outbreaks, terrorist activity, and military actions.” The program manager for Mercury, Kristen Jordan, had previously worked at the NSA as the deputy national intelligence officer for signals intelligence connected to weapons of mass destruction.

To process such mammoth amounts of information, both open and secret, IARPA is racing to develop the world’s fastest computer, one capable of “beyond exascale” speeds — 1 quintillion (a million trillion) operations per second — program manager Marc Manheimer told the Next Platform, a news site that covers high-end computing. Under IARPA’s Cryogenic Computing Complexity program, the agency is focused on moving from traditional semiconductors to an energy-efficient superconducting supercomputer able to crunch data and break encryption at unimaginable speeds.

But collecting the data is useless without analysis, and that’s where the dangers of anticipatory intelligence and “precrime” policing are myriad and growing. The shape of a subject’s face, for instance, is now touted as the latest determinant of his or her likelihood to be or become a terrorist. That, at least, is the assertion of Faception, an Israeli company that says its software uses “advanced machine learning techniques” and “an array of classifiers” to “match an individual with various personality traits and types with a high level of accuracy.” Thus, according to the company’s website, its program can simply pick out the likely terrorists, pedophiles, and white-collar criminals from “video streams (recorded and live), cameras, or online/offline databases.”

To its credit, IARPA claims that the open-source data it collects is anonymized to protect privacy — but the group makes no mention of the NSA intercepts. Moreover, the hardware, software, and algorithms are already in place, and the decision to anonymize can be reversed at any time by the Trump administration, which has shown little regard for privacy issues.

During his confirmation hearing last February, Dan Coats, the new director of national intelligence and the head of the office to which IARPA reports, expressed his support for the NSA’s warrantless overseas internet spying, which has also scooped up some domestic communications. The authority, contained in Section 702 of the Foreign Intelligence Surveillance Act, is due to expire in December, but Coats vowed to make reauthorizing it his “top legislative priority.” And, as a senator, Coats voted against the USA Freedom Act, the bill that prohibited the bulk collection of Americans’ phone records.

In Minority Report, the precrime program was shut down after the system was shown to be subject to manipulation. That plot point holds a lesson for IARPA: In December 2016, Sean Kinion, a scientist working on one of the agency’s programs, was sentenced to 18 months in prison after pleading guilty to faking data.

This article originally appeared in the May/June 2017 issue of FP magazine.