Modern artificial intelligence has moved well beyond playing chess; it has mastered Go and beats professional teams in Dota 2, among other games.

What started as a test-lab monkey has evolved into something akin to a child prodigy. Artificial intelligence, or AI, may still have to be fed information, but once it has gathered enough, it can produce results that mimic the original data. First came static images: AI managed to create perfectly convincing portraits of people who have never existed. Then it proved perfectly capable of transforming the same scene to show different seasons.

So we have locations that can be photo-manipulated and non-existent people who can populate them. All we need now is for AI to figure out how to put real people in fake situations and voila! — a perfect recipe for generating fake news.

Well, that has already been done. Take, for example, this video:

Using audio of former President Barack Obama, researchers from the University of Washington synthesized a high-quality fake speech with accurate lip-syncing, composited into a target video clip. After training on hours of footage from Obama’s weekly addresses, a recurrent neural network (AI) learned the mapping from raw audio features to mouth shapes.
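To make that last sentence concrete, here is a minimal sketch of the idea: a recurrent network consumes one audio feature frame at a time and emits mouth landmark coordinates for each frame. Everything here is an assumption for illustration — the feature type, the dimensions, and the vanilla RNN stand in for whatever the researchers actually used, and the weights are random rather than trained.

```python
import numpy as np

# Illustrative sketch only: a tiny recurrent network mapping a sequence of
# audio feature frames (e.g., MFCC vectors) to mouth-shape coordinates.
# All names and dimensions are assumptions, not the actual model.

rng = np.random.default_rng(0)

N_AUDIO_FEATURES = 13   # e.g., 13 MFCC coefficients per frame (assumed)
N_HIDDEN = 32           # recurrent state size (assumed)
N_MOUTH_POINTS = 18     # 18 mouth landmarks, each an (x, y) pair (assumed)

# Randomly initialized weights stand in for a trained model.
W_xh = rng.normal(0, 0.1, (N_HIDDEN, N_AUDIO_FEATURES))
W_hh = rng.normal(0, 0.1, (N_HIDDEN, N_HIDDEN))
W_hy = rng.normal(0, 0.1, (2 * N_MOUTH_POINTS, N_HIDDEN))

def audio_to_mouth_shapes(audio_frames):
    """Run a vanilla RNN over audio frames, emitting one mouth shape per frame."""
    h = np.zeros(N_HIDDEN)
    shapes = []
    for x in audio_frames:
        h = np.tanh(W_xh @ x + W_hh @ h)   # update recurrent state with new frame
        shapes.append(W_hy @ h)            # predict landmark coordinates
    return np.array(shapes)

# 100 frames of synthetic audio features -> 100 mouth shapes, one per frame
frames = rng.normal(size=(100, N_AUDIO_FEATURES))
mouth_shapes = audio_to_mouth_shapes(frames)
print(mouth_shapes.shape)  # (100, 36)
```

The point of the recurrence is that each predicted mouth shape depends not just on the current audio frame but on everything heard so far, which is what lets the lip motion stay smooth and in sync with speech.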

This level of content manipulation is impressive, but also rather scary, isn’t it? I can easily imagine someone using fake footage like this to incite unrest or spread misinformation among the general public. However, this also seems like something only big media could do, right? It looks too high-end and too complex for an ordinary person to get their hands on. Besides, if ordinary people could harness this power of AI, what would they do with it?


Create fake celebrity porn, as it happens. I won’t share any links here, but just a few weeks ago, the internet was in an uproar over fake porn footage, courtesy of horny Reddit users who (ab)used AI to put Emma Watson and dozens of other celebrities into a series of raunchy shoots. In the end, Reddit closed the thread, and porn sites tried (unsuccessfully) to remove the fake footage.

The larger issue isn’t that someone made a fake adult video of a celebrity. What’s worrying is the state of authentic information in modern society. Anyone can take a piece of content and mold it into something else — a school shooting that never took place, for example. A testimony that never happened. A press conference with words being put into a speaker’s mouth at crucial moments. The list goes on, and the result is a growing mistrust of news served to us, regardless of the rich media content behind it.

As AI becomes better at forging video (and audio!), it will be increasingly difficult to distinguish fabrications from real events. In the future, censorship will be unnecessary — instead of deleting information, regimes will simply flood the media with modified forgeries, sowing confusion and ambiguity among the public.

Finally, let me quote my fellow journalist Oli Franklin from Wired: “The biggest casualty to AI won’t be jobs, but the final and complete eradication of trust in anything you see or hear.”

P.S. Yes, AI could very soon be able to write articles like this one.