This story is part of What Happens Next, our complete guide to understanding the future. Read more predictions about the Future of Fact.

If the world we live in today is already being described as “post-truth,” how will it be described in 10 or 20 years? It’s hard to imagine today’s major social media companies like Facebook and Twitter continuing their roles as our digital linchpins. Given their recent failures to protect democratic information flows, they risk becoming “legacy” social media companies as quickly as they became “new” media.

“Post-post truth” doesn’t exactly have a ring to it, but it’s the most literal moniker for our future communication ecosystem. On the one hand, the digital sphere could become an enforced environment: a realm of constant identity verification, with real-time social bot eradication and digital disinformation police. On the other, it may look like a post-modern spectacle where ground truths are always questioned and confusion reigns supreme. The reality will likely combine features from both, with control oscillating between vindicators and vigilantes.

Any changes to technology or media will have serious consequences for culture and consumers. If the future world is even less grounded in fact than it currently is—more bound up in partisan mythology and a digital marketplace that rewards lies with profit—how will society be affected?

Computational propaganda will be the fake news of the future.

The answer will not be decided democratically, but by computational propagandists. Computational propaganda is the attempt to manipulate public opinion through automation and algorithmic systems. On a basic level, nefarious actors use sites like Facebook, Twitter, and YouTube to artificially spread political junk. A lot of this is pure spam, intended to gum up online conversations among activists or the political opposition. During the recent Mexican election, for instance, political bots in the form of automated fake profiles were used to impair communication between activists. Other tactics are more targeted and sophisticated, using coordinated human users alongside automated social media accounts to promote particular ideas to various social groups. We most recently saw these techniques during the Russian and Iranian campaigns built to influence the 2018 US midterms.

Rather than grassroots politics, the future is made of Astroturf. The goal of these communication strategies is to amplify or suppress political information through lies and confusion. Those who use them manufacture false consensus and create the illusion of popularity or disapproval. This produces a bandwagon effect, and the more people who jump on the bandwagon, the harder it becomes to slow it down.

The continued rise of conspiracy movements like Pizzagate and QAnon in public consciousness makes it harder to reverse the flow of disinformation. Combined with the social fallout associated with disinformation, it has brought many tremendously powerful technology companies to a very public mea culpa moment. It seems that every week Twitter, Google, or Facebook is implicated in a new propaganda scandal. They are (rightfully) being forced to keep up with the Alex Joneses of the world, trying to anticipate how conspiracy peddlers will game their algorithms next. Any failure to mitigate this is their own fault: the tools that today's tech behemoths created grew so fast, and with such disregard for introspection, that they've become unmanageable.

Author's screengrab, 6/15/17: A screen capture from @DyanNations, one of several hundred accounts that conservative strategist Patrick Ruffini alleged were used to attack Ted Cruz on behalf of Donald Trump. In addition to tweeting pro-Trump messages, the account regularly sent out Russian memes and ads for fake followers.

As people become aware of the mélange of propaganda and scams on open social media networks like Twitter, they will move to closed networks like WhatsApp. But this exodus from public channels has its own problems, including the worsening of echo chambers. It’ll become harder to connect with verifiable information on the outside web as these apps become walled gardens.

In an attempt to stymie problems before they arise, platforms will become increasingly privacy oriented, barring people's connections to unknown users and imposing tougher and tougher verification methods. They will effectively be publicly regulated, if not by governments then by public opinion. While politics remains hogtied by its own partisanship and a lack of understanding of new media, citizens will play the role of a mercurial public jury. Companies' PR campaigns about what they are doing to address disinformation will continue to grow, and they will become more transparent about the groups working to manipulate their platforms and the algorithms that allow for said manipulation.

But if social media companies, governments, and the public don't begin to generate actual solutions to this problem—rather than one-dimensional transparency reports—then the next generation of children may be born into a world where it is nearly impossible to tell truth from fiction, both on and offline. Some groups are already working on ideas. My own lab at the Institute for the Future recently collaborated with game designer Jane McGonigal and the Omidyar Network to develop the Ethical OS Toolkit, a series of future-focused exercises geared toward helping tech designers build digital products that weigh their benefit to society. Groups like Witness put digital power in the hands of the people by helping citizens produce videos that work to protect human rights. More ideas are popping up everywhere—good, bad, and in-between: using blockchain for verification and voting; structurally banning social bots; and doing away with online anonymity.

Whatever happens, we cannot go back to where we were; social media and the internet are no longer blindly assumed to be vehicles for democracy and the open flow of information. In the vacuum that has been created, there is space for the next generation of actors to promote a new vision of online communication. In order for this to happen, however, they themselves must not get sucked into the vacuum of conspiracy and confusion.
