A video is circulating online featuring US President Donald Trump hailing the eradication of AIDS.

But all is not as it seems — the deepfake video was developed by French charity Solidarité Sida ahead of a meeting of world leaders in Lyon, France, as part of the Global Fund to Fight AIDS, Tuberculosis and Malaria.

The video, first posted on Monday by the charity, appeared to show Trump proclaiming: "AIDS is over."

In a statement published on its website titled "Fake today, true tomorrow?", Solidarité Sida addressed the heads of state who are set to announce their contributions to the Global Fund.

"If strong commitments are made favouring access to treatment for all ... the next generation will be able to live in a world without AIDS," it said.

"Solidarité Sida decided to take action to inform the general public and challenge the leadership of France and the United States ... by broadcasting a piece of fake news. The first piece of fake news that might eventually become true."

What are deepfakes?

"A deepfake is a way of essentially doing a face swap, making someone look like they say something they didn’t,’’ Sam Gregory, an expert on deepfakes and Program Director of non-profit Witness, which promotes the use of video to defend human rights, told Euronews.

Deepfakes are clips with manipulated audio and visuals, created using an artificial intelligence technique called machine learning, which makes alternative realities easier to create and share.
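This is not how Solidarité Sida's video was produced — real deepfakes are generated by trained neural networks. But the core "face swap" idea Gregory describes can be illustrated with a deliberately crude sketch: replacing one region of a video frame with the same region from another. All the names and frame data below are invented for illustration.

```python
import numpy as np

def naive_face_swap(target, source, box):
    """Paste a rectangular 'face' region from source into target.

    A real deepfake learns this mapping with neural networks, blending
    lighting, pose and expression; this toy version just copies pixels
    to show the basic swap. target, source: HxWx3 uint8 frames;
    box: (top, left, height, width).
    """
    top, left, h, w = box
    out = target.copy()
    out[top:top + h, left:left + w] = source[top:top + h, left:left + w]
    return out

# Two synthetic "frames": one all-dark, one all-bright.
frame_a = np.zeros((64, 64, 3), dtype=np.uint8)
frame_b = np.full((64, 64, 3), 255, dtype=np.uint8)

# Swap a central 32x32 "face" region from frame_b into frame_a.
swapped = naive_face_swap(frame_a, frame_b, box=(16, 16, 32, 32))
```

The gap between this pixel-pasting and a convincing deepfake is exactly what machine learning closes: the network learns to reconstruct one face with another's identity, seamlessly, frame after frame.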

The most commonly circulated deepfakes are pornographic or politically motivated, and their numbers are on the rise.

Research from cyber-security company Deeptrace released this week found there were 14,698 deepfake videos online, compared to 7,964 in December 2018.

The report said 96% were pornographic, often with a computer-generated face of a celebrity replacing that of an adult actor in a scene of a sexual nature.

Ahead of the 2020 US election, steps are being taken to combat deepfakes amid growing concerns over how they could influence voters.

Last week, California’s governor, Gavin Newsom, signed legislation making it illegal to create or distribute videos, images, or audio of politicians doctored to resemble real footage within 60 days of an election.

Should deepfakes be used by a charity?

Gregory told Euronews: "The reality is, most deepfakes are used maliciously, they’re used to target women, and are increasingly likely to disrupt our societies.

"I think we need to discuss the ethics of using deep fakes — this certainly isn’t the first deepfake used by a charity or political group."

Earlier this year, charity Malaria Must Die used deepfake technology to manipulate David Beckham’s voice to deliver an appeal to end malaria in nine languages.

However, there is a clear difference between the two deepfakes: Beckham gave consent, whereas Solidarité Sida's deepfake is self-proclaimed "fake news".

"The question of whether people should be using deepfakes for good, as in this context, is something we need to discuss," said Gregory.

What responsibility do platforms have?

Gregory highlighted that the deepfake video of Trump "clearly names itself as satire; it's also pretty clear it's a fake, it's not a completely convincing deepfake".

But it raises the question of how social media companies should moderate deepfakes and whether satire should be allowed.

"One of the big questions for platforms is how they should label things if they do share them.

"This one says halfway through that it's a deepfake, but should platforms really affirmatively be telling people, given that as this technology gets better, we may not be able to see it in the way we can with this one?" Gregory said.

The expert said he thinks social media companies have a responsibility "to tell us what is a deep fake, or what is one of these other manipulations, maybe a manipulation of the lips or audio.

"They are very well placed to do that, because they will see these deepfakes passing through their systems.

"So it's really important they develop detection techniques that are going to be available for ordinary people to be able to see signals of manipulation that might be invisible to the naked eye or ear.

"There are some encouraging early signs of that — companies like Facebook are investing in making training data more available. We need to see more of that: the building of a shared immune system of detection."
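One simplified intuition behind the "signals invisible to the naked eye" Gregory mentions: pixels pasted or rendered into a frame often lack the fine sensor noise present in genuine camera footage. The sketch below is far cruder than any real detection system — the frame data and the `high_freq_energy` helper are invented purely to illustrate the idea of a statistical signal a viewer cannot see.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Synthetic grayscale frame with camera-like sensor noise everywhere...
frame = rng.normal(loc=128.0, scale=5.0, size=(64, 64))

# ...except a pasted, perfectly smooth patch — a crude stand-in for a
# rendered face region that lacks natural noise.
frame[16:48, 16:48] = 128.0

def high_freq_energy(patch):
    """Variance of the residual after removing each row's mean —
    a very rough proxy for high-frequency (noise) content."""
    residual = patch - patch.mean(axis=1, keepdims=True)
    return float(residual.var())

pasted_energy = high_freq_energy(frame[16:48, 16:48])
natural_energy = high_freq_energy(frame[0:16, :])

# The pasted region carries far less noise energy than its
# surroundings — a statistical anomaly invisible to the naked eye.
```

Real detectors look at far richer signals — compression artefacts, blending boundaries, physiological cues — but the principle is the same: manipulated regions are statistically inconsistent with the rest of the footage.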