Facebook disinformation in the 2020 presidential election: What you can do to stop its spread

Jessica Guynn | USA TODAY

Video: Michael Posner, director of NYU's Center for Business and Human Rights, says Facebook and the Trump administration aren't doing everything they could to prevent Russian hackers from spreading disinformation on social media. (July 31)

We all consider ourselves experts on separating fact from fiction. Sharing disinformation on social media is something that happens to other people.

Let’s be honest, when we’re cruising through our feeds, sharing or liking baby pictures and cat videos, we don’t always have our critical thinking caps on, especially when confronted by emotional appeals.

On social media, we trust information we get from the people close to us. But there are bad actors out there who are increasingly exploiting the ways we share information with one another during election cycles. For example, in 2016 and again in 2018, Russian agents posed as people on both sides of hot-button issues to foment distrust and discord. And, according to a recent study by Avaaz, a nonprofit that focuses on social media disinformation, fabricated news stories got 86 million views in the past three months, more than three times as many as during the previous three months.

So how do we stop falling for and spreading bogus information?

Disinformation isn't just for dummies

Disinformation is the deliberate spread of verifiably false or misleading content to gain financially or to deceive the public. It exploits our biases and our behavior, drawing on what we read, what engages us and what we send to friends, to serve up social media posts, images and videos that are one-sided or divisive.

Researchers say this kind of disinformation is dangerous precisely because it can dupe anyone, with broad consequences for our society and democracy.

"If we look through timelines, most of us will find we are sharing disinformation on social media," says University of Washington associate professor Kate Starbird, who researches online disinformation. "It's hard to always make the right decision."

Social media users, young and old, are susceptible. On average, users over 65 shared nearly seven times as many articles from fake news domains as the youngest age group, one study found.

Young people are also easily tricked, according to another study. For instance, 52% of students believed a grainy video shot in Russia claiming to show ballot stuffing in the 2016 Democratic primaries was "strong evidence" of voter fraud in the U.S. Only three out of 3,000 tracked down the source of the video, despite a quick internet search turning up a number of articles debunking it.

Sharing is not always caring

The first rule of fighting disinformation? Think before you share. And, when in doubt, don’t share at all.

“If you see something you get really excited about, that’s the moment not to send it to other people,” says Paul Barrett, deputy director at NYU Stern Center for Business and Human Rights. “That’s the moment to stay calm and double check. That piece of content will still be there in five minutes or in five hours.”

So think of passing along disinformation as the digital equivalent of sharing an STD with your friends (and then their friends and so on). Sure, the disinformation may encounter some skeptical bumps along the way, but the reality is that just repeating something often enough that isn’t true – or mostly isn’t true – helps convince people that it is.

Beware ragebait and social media posts that traffic in fear

Does that Facebook post make your blood boil? Or does it make you fearful of the future? Those are two of the reddest flags out there. But there are more.

Does something strain credulity (oh hey, the pope endorsed Donald Trump) or conform a bit too neatly to your hyperpartisan beliefs (remember piling on the Covington students)? Or is something just so heartwarming and uplifting that you can’t wait to share it with all your friends?

Take a deep breath, put your phone down and ask yourself: Is someone deliberately trying to make me feel this way? And, if so, who are they and what is their agenda? If you have any uncertainty about the validity of a post, spare your friends and family.

"People are much easier to manipulate when they're angry or afraid. Con artists have known that for centuries," says disinformation expert Ben Nimmo, director of investigations at Graphika, a social media analytics company. "If you see a headline that makes you outraged or scared, ask yourself: Who's trying to trick me?"

Don’t trust and be sure to verify

Disinformation is often tough to spot because sometimes it’s not technically false. Instead, social media posts dangle half-truths, twist facts with something made up or offer up information that is completely out of context.

Peddlers of disinformation are banking that, when they target us with just the right internet meme, our innate biases will take over and call the shots. So before sharing anything on social media, the best strategy is to scrutinize the content the way your corner shopkeeper holds a phony $100 bill up to the light.

Ask yourself: Who is the person publishing this information? Is this person reliable? What else has this person posted? Are the claims in the post being backed up by reputable sources? (Your Aunt Sally does not count.)

Are the facts getting distorted? Are perspectives that are different from yours being left out? Does the person publishing this information have something to gain? Are other legitimate websites publishing the same information?

Be on the alert for posts about voting, such as warnings of long lines at polling places, voter fraud or malfunctioning voting machines. Disinformation peddlers also traffic in fake endorsements, so give those extra scrutiny, too, says the News Literacy Project, a nonpartisan, national education nonprofit.

And another tip: Never share a post if you've only read the headline. There are plenty of sensational and misleading headlines that drive clicks but don't accurately reflect the facts.

You can’t always trust what you see or hear

People instinctively trust images more than words, and misinformation peddlers often try to use this against you, the News Literacy Project warns.

Increasingly lawmakers, researchers and social media companies worry that digitally manipulated images, videos and even audio will emerge as insidious new threats in the 2020 presidential election. So-called deepfakes – videos that have been doctored with state-of-the-art artificial intelligence – are not yet widely used for disinformation, but the fear is that they will be next year.

The challenge: The technology is evolving quickly, making these videos harder to detect. There are also “cheap fakes,” videos that are doctored with more rudimentary software. Disinformation peddlers also tinker with images and audio.

So do some homework to avoid images and videos that are fake, hoaxes, outdated or taken out of context, the News Literacy Project advises. Sometimes that's as simple as a Google search. Or you can use Google reverse image search or TinEye to check where images come from and whether they’ve been manipulated. Just think: It could save you from the humiliation of being that person who shares the "Hurricane shark" during the next big storm.

Disinformation needs a crowd

When it comes to disinformation, we are not just talking about Kremlin-linked operatives who are paid to produce shady content or bots that pepper social media with automated posts. In the run-up to the 2020 election, these campaigns may be led by the Iranians, the Chinese or folks in our own backyard, disinformation researchers say.

Effective campaigns rely on recruiting “unwitting collaborators” who are unaware that they are amplifying and legitimizing messages that exist solely to inflame tensions over race, guns, abortion or immigration and undermine faith in our institutions, Starbird says.

Back in the day, before the internet and social media, that meant manipulating journalists. Now, as members of online communities, we all have bull’s-eye targets on our backs. And, because they hide among us, it’s much harder for social media companies and disinformation researchers to identify bad actors who may be working to, say, discourage African Americans from voting or to organize anti-immigration protests.

Especially if you are active in politics, exercise extra caution. Don't just pass on information without first checking it out. Remember that you are lending credibility and visibility to the person whose information you are sharing.

"That's how the (Russian) Internet Research Agency trolls got traction, by repeating what everyone else was saying and picking up on legitimate grievances," Starbird says. "They became part of the group and then it was hard to distinguish them from everybody else."

Avoid the 'tribal trap'

Whether you are worried about being hoodwinked by disinformation or not, here's a good tip: Don't forget your – or someone else's – humanity.

Nimmo calls it the "tribal trap." We dehumanize the person whose political views we reject. Challenge yourself to find one thing in common with the person who has posted something you disagree with, he advises.

"It's the first step in preventing the cycle of anger that makes an easy target for disinformation," Nimmo says. "Meeting people in real life is much more nuanced: You don't just hear someone's political views, you also hear their accent, and see their face and their body language; maybe they show what sports team they support, or where they went to school. Those are all things that make them more human."

Final tips to make yourself disinformation-proof

Change the mix of content you see in your feed by seeking out reliable sources of information that offer viewpoints different from your own.

If you do share something that turns out to be false, own it and correct the record.

And, finally, don’t get overwhelmed and let disinformation win. Keep engaging in the public discourse and debate that can bridge differences, forge compromises and help us understand one another, researchers say.