Symantec says it has seen three cases in which seemingly deepfaked audio of chief executives was used to dupe financial controllers into handing over company funds. Deepfakes are highly realistic and convincing doctored images, video or audio.

Using artificial intelligence (AI) that combines and superimposes material onto original content, deepfakes are made with a machine learning (ML) method known as a generative adversarial network (GAN).
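The adversarial idea behind a GAN can be sketched in a few lines. The toy below is purely illustrative (it learns a one-dimensional Gaussian, not audio): a generator turns random noise into fake samples while a discriminator tries to tell them apart from real data, and the two are trained in alternation.

```python
import numpy as np

# Illustrative toy GAN: the "real" data are draws from N(4, 1).
# Generator and discriminator are each a single affine function,
# trained against each other by gradient ascent on their own objectives.

rng = np.random.default_rng(0)

w_g, b_g = rng.normal(), rng.normal()  # generator: fake = w_g * noise + b_g
w_d, b_d = rng.normal(), rng.normal()  # discriminator: p(real) = sigmoid(w_d * x + b_d)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.01
for step in range(2000):
    real = rng.normal(4.0, 1.0, size=32)   # samples from the true distribution
    noise = rng.normal(size=32)
    fake = w_g * noise + b_g               # generator output

    # Discriminator step: maximise log D(real) + log(1 - D(fake))
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    w_d += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    b_d += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: maximise log D(fake), i.e. fool the discriminator
    fake = w_g * noise + b_g
    d_fake = sigmoid(w_d * fake + b_d)
    grad_fake = (1 - d_fake) * w_d         # gradient of log D(fake) w.r.t. fake
    w_g += lr * np.mean(grad_fake * noise)
    b_g += lr * np.mean(grad_fake)

print(f"generator offset after training: {b_g:.2f}")
```

Real voice-cloning systems replace these affine functions with deep networks and operate on spectrogram or waveform representations, but the adversarial training loop follows the same pattern.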

While the technology used to create deepfakes has been adopted by Hollywood for movies, it has also become a tool of cybercriminals and those looking to spread misinformation.

According to Symantec, deepfake audio can be created by training an AI system on freely available recordings of a chief executive. Audio tracks from media appearances and corporate video calls are just some of the sources cybercriminals can use to gather the input needed to build a model of a target's voice.

Attackers can also add background noise to conceal unnatural-sounding syllables and words, making the synthetic voice flow more realistically. According to Dr Alexander Adam, a member of the data science consultancy team at Faculty, a substantial amount of money and time would be required to produce high-quality deepfake audio.

“Training models cost thousands of pounds. This is because you need a lot of computing power and the human ear is very sensitive to a wide range of frequencies, so getting the model to sound truly realistic takes a lot of time,” he said.

It would take a vast amount of good quality audio to get the data needed to create a realistic-sounding version of the target’s speech patterns, he added.

The use of deepfake technology by unscrupulous individuals has sparked great concern among anti-revenge porn groups over its ability to create realistic fake nudes of women. Legal experts and political commentators have voiced worries that the technology could be used as a tool for propagandists to incite violence, influence elections or even trigger a public safety crisis.
