But are the social media masses being duped? Mounting evidence suggests that the president authors only some of his tweets. And those famous campaign phrases and hashtags of “drain the swamp” and “deep state”? According to Chris Wylie, a Cambridge Analytica whistleblower, Stephen K. Bannon directed the testing of these messages in 2014, long before Trump signed up for the presidential race.

Yes, the Kremlin’s manipulation of social media is a threat to U.S. democracy. But some of the most damaging efforts I’ve seen lately are American, not Russian, and they’re far more technically capable than those of the Internet Research Agency that was indicted by special counsel Robert S. Mueller III in February. It’s time we started paying attention to the political campaigns and public-relations firms exploiting social media to drive audiences apart online and pit constituencies against each other at the ballot box. Western opportunists will adopt the Kremlin’s art of information warfare but will apply a more devastating power — that of artificial intelligence — to sway audiences through social media assaults.

Cambridge Analytica’s much-touted use of social media-generated psychographic voter targeting may have been more aspirational than factual, a touch of digital snake oil in the pursuit of clients. But Bannon and Cambridge Analytica were naturally and logically pursuing the next advance in political influence — campaigning that is more science than art, manufactured populism guided by hidden influencers who understand microscopic audience preferences and the psychological vulnerabilities of voters.

Cambridge Analytica’s harvesting of Facebook accounts and pairing with voter profiles merely represents a small first step for social manipulation. Advanced public relations firms, propagandists and campaigns, now and in the future, seek a full digital pattern-of-life on each potential voter. Every like, retweet, share and post on all social media platforms will be merged and matched with purchase histories, credit reports, professional résumés and subscriptions. Fitness tracker data combined with social media activity provides a remarkable window into just when a targeted voter might be most vulnerable to influence, ripe for just the right message or a particular kind of messenger.
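The merging and matching described above is, mechanically, a record join across data sources. A minimal sketch (all voter IDs, field names and values here are invented for illustration):

```python
# Hypothetical data sources, each keyed by a made-up voter ID.
social = {"v001": {"likes": 342, "retweets": 58}}
purchases = {"v001": {"last_purchase": "fitness tracker"}}
fitness = {"v001": {"avg_sleep_hours": 5.2}}

def merge_profiles(*sources):
    """Combine per-voter records from each source into one
    "pattern-of-life" profile per voter."""
    profiles = {}
    for source in sources:
        for voter_id, record in source.items():
            # Later sources add fields to (or overwrite) earlier ones.
            profiles.setdefault(voter_id, {}).update(record)
    return profiles

profile = merge_profiles(social, purchases, fitness)
```

In practice the hard part is not the join itself but linking identities across platforms that use different identifiers — the step Cambridge Analytica's Facebook-to-voter-file pairing represented.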

Future campaigns will pick not just the issues and slogans a candidate should support, but also the candidate who should champion those issues. Dating apps, the aggregate output of thousands of swipes, provide the perfect physical composite, educational pedigree and professional background for recruiting attractive candidates appealing to specific voting segments across a range of demographics and regions. Even further in the future, temporal trends for different voter blocs might be compared to ancestry, genetic and medical data to understand generational and regional shifts in political leanings, thereby illuminating methods for slicing and dicing audiences in favor of or against a specified agenda.

Rapidly compiling social media data in pursuit of big-data reconnaissance on voters requires artificial intelligence (AI). Machine learning, an AI application in which machines learn without explicit programming, will rapidly pore over data troves and illuminate key insights with limited human intervention. Once audiences have been scoped, they’ll need to be prodded, and coming innovations will provide scary capabilities for social media audience manipulation.
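"Learning without explicit programming" can be illustrated with the simplest unsupervised method, k-means clustering: the grouping rule is derived from the data rather than hand-coded. A toy one-dimensional sketch over invented engagement scores:

```python
def kmeans_1d(values, iters=20):
    """Tiny 1-D k-means with k=2: split values into two groups around
    learned centers. Purely illustrative; real audience segmentation
    would use many features and far larger data."""
    centers = [min(values), max(values)]  # crude initialization
    clusters = [[], []]
    for _ in range(iters):
        clusters = [[], []]
        for v in values:
            # Assign each value to the nearest center.
            nearest = min(range(2), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Re-learn each center as the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical per-user engagement scores: two natural groups emerge.
scores = [0.1, 0.2, 0.15, 0.8, 0.9, 0.85]
centers, clusters = kmeans_1d(scores)
```

No human specified where the boundary between "low" and "high" engagement lies; the algorithm finds it, which is what makes the approach scale to millions of profiles with limited human intervention.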

False information in printed text, spread via news stories, outperformed true information in the run-up to the 2016 election. But fake video and audio can offer strikingly real impressions of world leaders appearing to be in places they’ve never been, saying things they never said. This forgery capability will offer nefarious social media manipulators the ability to inject powerfully engaging smear campaigns into political discussions — or an opportunity to cast doubt on the authenticity of genuine information by alleging that content might be doctored.

Russian interference in Western elections in 2016 heightened concerns about computational propaganda. False social media accounts that look and communicate like the target audience, known as social bots, repeated programmed messages and amplified political content, altering users’ perceptions of reality and influencing debate. The social bots of 2016 will appear crude in comparison with the AI-powered chatbots of 2018 and beyond. Newer chatbots, computer programs simulating real conversation, increasingly pass the Turing test, in which a machine exhibits behavior indistinguishable from a human’s. Bots might seamlessly chat with humans and each other, creating engaging bot communities.
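The 2016-era bots repeating "programmed messages" were essentially keyword-triggered scripts. A toy sketch of that crude pattern (every keyword and canned message below is invented for illustration), which is exactly what conversational AI now improves upon:

```python
import random

# Invented keyword-to-message playbook a scripted bot might follow.
PLAYBOOK = {
    "election": ["The system is rigged!", "Don't trust the polls."],
    "media": ["The press is hiding the real story."],
}

def bot_reply(post, rng=random.Random(0)):
    """Return a pre-programmed amplification message if the post
    matches the playbook, else None. No understanding, no dialogue."""
    for keyword, messages in PLAYBOOK.items():
        if keyword in post.lower():
            return rng.choice(messages)
    return None

reply = bot_reply("Did you see the election coverage?")
```

A bot like this fails the moment a human asks it a follow-up question; the chatbots described above replace the fixed playbook with a learned conversational model, which is what makes them so much harder to distinguish from real users.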

Brad Parscale’s promotion from Trump’s digital director in 2016 to campaign manager for the president in 2020 shows just how important social media campaigns will be in U.S. elections. Campaigns will employ social media not to broaden debate through open discussion, but to harden the views of their social media adherents through deliberate information partitioning. They’ll recruit supporters on mainstream social media platforms and push them to apps they design, control and leverage to harvest voter data.