Whistleblower Explains How Cambridge Analytica Helped Fuel U.S. 'Insurgency'

Gabby Laurent/Penguin Random House

When Christopher Wylie first began working for the British behavioral research company SCL Group, the company used data drawn from a number of sources as a means of potentially altering outcomes for its clients, which sometimes included militaries.

But over time, Wylie's mission — and that of the company — expanded. Conservative strategist Steve Bannon, who later worked in President Trump's White House, became involved with the SCL subsidiary Cambridge Analytica. Wylie, who served as Cambridge Analytica's research director for a year and a half, watched as his group began to use data from Facebook and other online sources to target users for disinformation campaigns.

"They targeted people who were more prone to conspiratorial thinking," Wylie says. "They used that data, and they used social media more broadly, to first identify those people, and then engage those people, and really begin to craft what, in my view, was an insurgency in the United States."

Wylie adds: "The things that I was building on originally for the defense of our democracies had been completely inverted to really, in my view, attack our democracies."

In 2014, Wylie resigned from Cambridge Analytica. He later became a whistleblower, exposing the company's role in Donald Trump's presidential campaign and in the Brexit campaign, and revealing its links to Russia.

Wylie's new book, Mindf*ck, explains how Cambridge Analytica harvested the information of tens of millions of Facebook users, then used the data to target people susceptible to disinformation, racist thinking and conspiracy theories. Though Cambridge Analytica no longer exists, Wylie warns that the company's tactics continue to be a threat to democracy. He notes that some of its former employees are currently working on the next Trump campaign.

"One of the reasons I wrote the book is to serve as a warning, particularly to Americans," he says. "We have a completely unregulated digital landscape. There is almost no oversight. We are placing blind trust in companies like Facebook to do the honorable and decent thing. ... Even if Cambridge Analytica doesn't exist anymore, what happens when China becomes the next Cambridge Analytica?"

Interview Highlights

On the research Wylie initially did for SCL

I got recruited to join a research team at SCL Group, which, at the time, was a British military contractor based in London. Most of its clients were various ministries of defense in NATO countries. And what we were looking at was how to use data online to identify people who would be likely targets of different extremist groups. And from that, try to understand and unpack: How would a fairly extreme ideological message spread through different kinds of social networks? And what could we do in order to mitigate its effectiveness? When Steve Bannon got introduced to the company, he realized that a lot of that work could be inverted. And rather than trying to mitigate an extremist insurgency in certain parts of the world, he wanted to essentially catalyze one in the United States.

On Steve Bannon's role in Cambridge Analytica

He found us in London. He convinced a billionaire [Robert Mercer] to acquire the company, and then he transformed that company into a set of tools that he would be able to use to, in effect, manipulate a certain segment of the American voter population.

When Steve Bannon took over, he wasn't just concerned about particular elections. He followed this notion of the Breitbart doctrine, which is that politics exists downstream from culture. So don't just focus on the day-to-day politics. Try to actually make an impact on an enduring change in culture, because politics will just flow from that.

On how SCL and Cambridge Analytica approached personality profiling

Mindf*ck: Cambridge Analytica and the Plot to Break America, by Christopher Wylie (Random House, hardcover, 269 pages)

Originally, when we were looking at this for defense purposes, we wanted to figure out ... what were the psychological characteristics of those people that would make them more prone and more vulnerable to certain kinds of [extremist] messaging, so that we could engage them beforehand? That was based on a series of studies, many of which came out of the University of Cambridge, that looked at essentially how, particularly with Facebook data, you can quite accurately predict a person's personality profile. And from that, if you can understand how a person thinks and feels and engages in the world, and what kinds of biases they have, you can then figure out what's going to be most effective at engaging them in a particular objective — originally in some kind of counter-extremism or mitigation strategy.

Later, when it became Cambridge Analytica, the work essentially became identifying people in the same way that you'd be looking for people who'd be more vulnerable to ISIS messaging — people who were more prone to conspiratorial thinking or paranoid ideation. Effectively, it looked for the same kinds of people. But rather than discouraging them from joining ISIS, it would be to encourage them to join the alt-right.

On how Cambridge Analytica collected data

When the story blew up, one of the things that people often talked about is how it was a hack of Facebook, or some kind of data breach. And what actually happened was that Facebook authorized the applications that Cambridge Analytica ended up using to access the data. The company engaged professors at the University of Cambridge to create an application that then got put onto Facebook where people would go and fill out personality inventories, like surveys about who they are and their attributes. But the way the app worked was that they wouldn't just harvest the data of the person who responded to that survey, but it would go into their profile and look at all of their friends and harvest all of their friends' data as well.

So when you had one person fill out a survey, by default they effectively consented by proxy for hundreds of other people, simply because they were Facebook friends with them. So that scaled really quickly. And at the time, the way Facebook worked, they allowed applications to have that feature. They've since turned it off, and rightfully so, but at the time, you could acquire a lot of data really quickly.

On why people filled out the surveys that Cambridge Analytica used to harvest data

Different kinds of people have different motivations for filling out surveys. Sometimes you would have a group of people who just would fill it out because they're bored, and they don't have anything to do. Or they would just genuinely want to know what is their personality. Or if you had apps that were, "If you were on 'Game of Thrones,' who would be your character? Fill out this survey, and find out." Little do they know that actually it's taking all of that information and porting it over to an alt-right campaign. But in some cases with certain groups of people that were underrepresented in those samples, they would be paid one dollar, two dollars, around there, to fill out a survey. And people do a lot for a dollar, which is actually quite surprising.

On his decision to resign from Cambridge Analytica

It was a gradual process, sort of like boiling a frog in a way. Things change and you don't necessarily notice how much things have changed when you're inside of something. ... But really, one of the things that I remember is seeing videos of people from focus groups and events that Cambridge Analytica was doing who had been targeted and sort of massaged online into believing certain kinds of conspiracies. And just to see like the rage in their eyes — how angry these people were, how they started engaging in highly racialized thinking. ... To see their faces and what that looks like, what a manipulated person looks like, for me, it was really eye-opening.

Sam Briger and Joel Wolfram produced and edited this interview for broadcast. Bridget Bentz, Seth Kelley and Meghan Sullivan adapted it for the Web.