Cindy L. Otis

Opinion columnist

A series of fake websites appeared a few weeks ago and pretended to be official campaign websites of Democratic presidential candidates. Then, in the days before the first Democratic debate last month, I watched users on 4chan, an imageboard website with a large far-right member base, eagerly discuss plans to spread disinformation about the candidates on social media in what they see as the next information war. In one thread, hundreds of users discussed photos and memes they could make to turn voters against Democrats.

For those of us who analyze disinformation, it has started looking a lot like the run-up to the 2016 election. The surprising thing? Americans were behind both of these “fake news” operations.

While much of the public and media attention has focused on foreign attempts to influence the results of U.S. elections, the fact is that foreign actors such as Russia and Iran are not the only players we need to worry about in 2020. If anything, domestic actors are poised to be the bigger information threat.

Russia can build on US disinformation

This does not mean Russians and others will stay out of 2020. Rather, they will be able to amplify divisive narratives and false information already being created by domestic actors — giving them more time and resources to focus on other potentially disruptive efforts. We most recently saw this during the European Union’s parliamentary election in May. Russia determined it did not need to be the driver of disinformation campaigns in most EU countries and primarily looked to amplify false content already being publicized by far-right groups.

On another front, the Democratic National Committee has already told the presidential campaigns to immediately delete FaceApp, a photo-editing app that alters users' faces, for example by aging them. It was developed by Russians, the DNC said, and using it is risky.


In the 2020 campaign, Americans can expect to see six key kinds of disinformation attacks. First, fake news websites will continue to host outright falsehoods about candidates, and disinformation actors will use fake social media accounts to push out the content. There will also be fake websites pretending to be official campaign websites, both to attract donations and to spread false information about the candidates.

Second, American voters will be targets of misleading social media posts, such as memes that pair digitally altered or real photos with text that intentionally misrepresents a candidate's position or statement. Disinformation actors will also post links to legitimate news articles but mischaracterize what is reported in the text of the post. They do this counting on the fact that most social media users will not take the time to verify what was said or quoted.

Third, campaigns and journalists will be subject to derogatory posts and claims intended to attack a person’s character. For example, an approach I already see in use on social media is claiming Democratic presidential campaigns are being aided by specific foreign countries, like Ukraine, Israel and Qatar, to distract from Russia’s efforts to help elect Trump in 2016. This was a tactic users on 4chan and 8chan discussed months ago.

Promoting losers and division

Fourth, disinformation actors will focus on content that divides and inflames political and racial tensions, knowing that the best-case scenario for them is for America to enter 2020 with a fractured Democratic Party.

Fifth, disinformation campaigns will seek to boost lower-polling candidates who have a smaller chance of beating Trump. During the two-day Democratic debate, accounts on 4chan and Reddit directed users to vote for Rep. Tulsi Gabbard, entrepreneur Andrew Yang and activist Marianne Williamson in online polls because they believed the three candidates stood the least chance of winning in 2020.


And sixth, candidates will continue to be prime targets for hackers looking to gain access to campaign systems and to sensitive information they can make publicly available. As we saw in 2016, when Russia hacked the Clinton campaign and Democratic National Committee servers, the information does not even have to be damning for disinformation actors to use hacked content to paint a candidate as untrustworthy. We have not yet seen this happen to a 2020 Democratic candidate, but it is only a matter of time.

Disinformation actors have multiple tools at their disposal because information operations are cheap and relatively easy to conduct. We will see more fake social media accounts, including bots and trolls, across platforms. They will create fake social media groups and pages that try to attract real members using sensational content. They will also increase their use of platforms beyond Facebook and Twitter (which have heightened security measures), among them Medium, blogs, and news applications like Parler, in order to reach a larger audience.

Voters and candidates need vigilance

With no help from Congress on the way to fortify our election security, candidates running for office in 2020 need to take matters into their own hands. They should assume they are already targets of disinformation attacks and have a plan in place to protect their campaigns.

They must devote resources to understanding the disinformation threat landscape and monitoring potential attacks, enlisting the help of experts in the field. Campaigns must be transparent and expose attacks and attackers they have identified. They must also invest in network security and training for staff so that hackers cannot gain access to sensitive information and networks. Most importantly, candidates must work together to combat disinformation and agree not to help promote false information about each other.

With so many people running for president and so many bad actors trying to spread disinformation about them, it will be difficult to determine what is “fake news” and who created it. The question is not if or when there will be disinformation campaigns, because they have already started. Candidates — and voters — must be ready.

Cindy L. Otis, a member of USA TODAY’s Board of Contributors, is a former CIA officer who now works in cybersecurity. Her new book, “True or False: A CIA Analyst’s Guide to Identifying and Fighting Fake News,” will be published in May. Follow her on Twitter: @CindyOtis_