Why did Cambridge Analytica — the analytics and marketing firm that worked for the Trump presidential campaign — want to know what as many as 87 million people “liked” on Facebook?

The answer, as we’ve learned, is that the firm, which obtained the personal data of tens of millions of users without their consent, sought to develop “psychographic” profiles of them. In simpler terms, Cambridge Analytica hoped to leverage Facebook data to understand the personalities of its users and then, in turn, use that information to match advertisements with the people most likely to respond to them.

As the New York Times reports, it’s unclear to what extent the company used this “psychographic” targeting to influence the 2016 general election campaign. During the primary, Cambridge Analytica worked for Ted Cruz, whose staff was unimpressed with the firm’s tactics and technical know-how.

And it’s even more unclear whether this approach would be effective even if executed perfectly. So we’re left wondering: Is it plausible that personality data could be used to influence an election?

The research from psychology and political science actually has a few hints for us here. Overall, the story is this: Our digital footprints do paint a somewhat accurate picture of our personalities. However, it remains to be seen how political campaigns can leverage this information to actually influence elections.

Here are the four takeaways from psychological and political science research.

1) It is entirely plausible that people can build a psychological profile of you from your online activity

We all have personalities; they are the stable personal traits that inform how a person interacts with the world. For instance, a neurotic person may be more swayed by an ad depicting a home break-in, Cambridge Analytica CEO Alexander Nix once explained to potential clients. A more agreeable person may respond better to an ad that emphasizes family values. (The German Das Magazin first reported on Cambridge Analytica’s tactics in 2016. You can read the English version of the article at Vice.)

Personality is most commonly measured on a scale known as the “Big Five.” It’s based on five well-established traits: agreeableness, neuroticism, openness to new experiences, extroversion, and conscientiousness. Our levels on these traits tend to be relatively stable throughout our lifetime (with some exceptions).

And it’s no big mystery how levels of these traits are related to the things we like, buy, and spend time with.

In 2013, psychologists Michal Kosinski, David Stillwell, and Thore Graepel, all then researchers at the University of Cambridge in the UK, published a paper showing that the things we “like” on Facebook can be used to predict our personality traits.

“For example,” they write in the paper, “users who liked the ‘Hello Kitty’ brand tended to be high on ‘Openness’ and low on ‘Conscientiousness,’ ‘Agreeableness,’ and ‘Emotional Stability’” — emotional stability being the inverse of neuroticism.

Kosinski and his colleagues had come to these conclusions by deploying an app on Facebook. Once users opted in, the app assessed their personality traits (via a quiz) and other personal characteristics. It then correlated those answers with what they “liked.” The analysis also found that Facebook activity could be useful in predicting intelligence, sexual orientation, and other personal factors.
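In spirit, the approach is ordinary statistics: treat each user as a row of 0s and 1s over Facebook pages, treat the quiz score as the target, and fit a regression. The sketch below illustrates this with simulated data — the page weights, user counts, and the use of ridge regression are all illustrative assumptions, not the study’s actual method or numbers.

```python
import numpy as np

# Hypothetical illustration: rows are users, columns are Facebook pages,
# and a cell is 1 if that user "liked" that page.
rng = np.random.default_rng(0)
n_users, n_pages = 200, 50
likes = rng.integers(0, 2, size=(n_users, n_pages)).astype(float)

# Simulate quiz scores (say, openness) that depend on a handful of pages
# plus noise, standing in for real Big Five questionnaire responses.
true_weights = np.zeros(n_pages)
true_weights[:5] = [0.8, -0.6, 0.5, -0.4, 0.3]
openness = likes @ true_weights + rng.normal(0, 0.5, n_users)

# Ridge regression (closed form): learn to predict the trait from likes.
lam = 1.0
X = likes - likes.mean(axis=0)      # center predictors
y = openness - openness.mean()      # center target
w = np.linalg.solve(X.T @ X + lam * np.eye(n_pages), X.T @ y)

# Report accuracy as the correlation between predicted and actual scores,
# the same style of metric the prediction literature tends to use.
pred = X @ w
corr = np.corrcoef(pred, y)[0, 1]
print(f"predicted-vs-actual correlation: {corr:.2f}")
```

The key point is that once such weights are learned from users who took the quiz, the model can score any user whose likes are known, even if they never took the quiz — which is what made friends’ data so valuable.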

This is the model that Cambridge Analytica copied. As has been reported, Aleksandr Kogan, a fellow Cambridge academic, copied the methods Kosinski and his colleagues used, harvested similar data with an app of his own, and then sold that data to the company that would come to spawn Cambridge Analytica. Users did not consent to this.

And making matters worse, at the time, Facebook allowed apps like these to collect data on users’ friends as well. This is how Cambridge Analytica acquired data on millions of people.

The bottom line here is that our digital footprints leave impressions of ourselves. And as David Stillwell, who co-wrote the 2013 paper, warns on Twitter, Facebook is hardly the only source of this information. The programs that can collect it, and analyze what it all means, are only going to grow more sophisticated.

Plenty of advertisers collect cookie data, which follows us around the Internet recording all of the websites we visit. I'm sure that knowing what websites a person visits can be used to predict their psychology. 3/ — David Stillwell (@david_stillwell) March 18, 2018

2) There’s some limited evidence that personality info can be used to serve you ads you find to be more engaging

So it makes sense that our Facebook data can be used to predict personal information about us. But what about Cambridge Analytica’s pitch that this data could be useful for political campaigns?

The evidence here is very mixed and mostly is in the domain of consumer product research.

One study (also by Kosinski and Stillwell) found that ads could result in 40 percent more clicks if they targeted particular personality types. For a beauty advertisement, extroverts were targeted with the message “Dance like nobody’s watching.” Introverts saw the message “Beauty doesn’t have to shout.” These extra clicks led to more purchases for the retailer.

But making decisions in presidential campaigns is very different from buying makeup. It’s informed not just by our personalities, but by our partisan identities, our ideologies, and our personal history. (And yes, it’s true all these factors may be interrelated.)

Very few papers have addressed whether personality targeting works for political campaigns. Jay Van Bavel, a social psychologist at NYU, pointed me to one unpublished PhD dissertation that addresses the question. It finds mixed evidence both on whether personality traits can predict who is most likely to turn out to vote and on whether they can predict who is most likely to be persuaded by advertisements.

Other research has shown personality traits are only somewhat useful in predicting voting preferences. A 2009 paper from NYU found parents’ political orientation was much more predictive of voting behavior in the 2008 election than any of the Big Five personality traits. And no factor was more predictive than simply asking people if they were liberal or conservative.

It’s also worth noting: The American public was subjected to an enormous, Russian-backed misinformation campaign in the runup to the 2016 election. And even there, it’s very unclear what impact (if any) it had on the election results. Likewise, it’s very hard to assess the impact of microtargeting.

3) There’s nearly no evidence these ads could change your voting preferences or behavior

So it’s plausible that personality-matched ads could be more engaging. But what use is that? It’s much, much more likely that microtargeted ads work by reinforcing people’s preconceived notions. It’s very unlikely that they actually changed anyone’s mind about whether to vote for Donald Trump or Hillary Clinton.

Overall, there’s nearly no evidence that political campaigns have any power to persuade voters. Recently, political scientists Josh Kalla and David Broockman conducted a meta-analysis of 49 experiments that were designed to test whether voters are persuadable. The result: “These experiments’ average effect is also zero.”

Their study did find an important nuance, though. As Vox’s Dylan Matthews explained, they turned up evidence that voters are persuadable when it comes to primary campaigns and ballot measures. But by the time a general election comes along, people are pretty much set in their preferences.

Keep in mind, during the Republican presidential primary, Cambridge Analytica was working for Ted Cruz. We know how well that worked out for him. And for what it’s worth, a former Cruz staffer told the New York Times the service didn’t provide the campaign much value. And at Mother Jones, Andy Kroll outlines how Cambridge Analytica continually frustrated Cruz staffers with technical blunders and missed opportunities. One staffer called them “the least transparent company in the business.”

Furthermore, there’s potential for a microtargeting scheme to backfire. If a person is wrongly targeted with an ad, they may be offended. And there’s some reason to doubt the whole notion that voters respond better to ads tailored to personality traits.

A 2012 experiment out of the University of Massachusetts found that “voters rarely prefer targeted pandering to general messages.” The pandering in this study was on the basis of personal identities (like Latino, gun owner, born-again Christian), and not personalities. But still, it’s important to note that microtargeting’s effectiveness is not a given.

4) Just because you understand a personality type doesn’t mean you’ll be good at making ads to suit it. Our intuitions are often wrong about what we think others will find convincing.

Finally, it’s worth remembering that advertising can be more of an art than a science. Knowing a person’s personality doesn’t mean you’ll be able to craft the perfect ad to pull at their heartstrings.

Here’s an instructive example. Last year, I wrote about a research effort focused on persuading people to be less prejudiced against Muslims. The researchers tested eight videos on around 2,000 study participants and found that only one video moved the needle a bit on a measure of collective blame (the tendency to blame all Muslims for the actions of a few terrorists). That’s a lot of effort for one modest effect.

The authors of the paper did one final, very interesting thing in their experiments. Before they ran the test with the eight videos, they showed them to a wholly different group of 938 participants and basically asked: What do you think will work to change somebody’s mind about collectively blaming Muslims for terror attacks?

Overall, these participants didn’t pick the winning video, which suggests our intuitions about what might convince another person are often wrong.

This finding is common: We often are mistaken about what arguments other people will find convincing. People who design ads for a living are arguably better at this than the average person. Still, this work is hard. And success is not guaranteed.

Further reading on Cambridge Analytica