From unlocking smartphones to authorising payments, fingerprints are widely used to identify people.

However, a team of researchers have now managed to accurately copy real fingerprints and create fake ones called 'DeepMasterPrints'.

Researchers - who created the fake prints using a neural network - were able to mimic more than one in five fingerprints.

These developments suggest fingerprint identification may be less secure than previously thought.



'MasterPrints are real or synthetic fingerprints that can fortuitously match with a large number of fingerprints', researchers, led by Philip Bontrager from New York University, wrote in the paper presented at a security conference in Los Angeles.

'In this work we generate complete image-level MasterPrints known as DeepMasterPrints, whose attack accuracy is found to be much superior than that of previous methods.'

The method, called Latent Variable Evolution, works by training a Generative Adversarial Network (GAN) on real fingerprint images and then searching the network's inputs for ones that produce convincing prints.
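The search half of that idea can be sketched in a few lines: evolve a latent vector so the generated output scores well against a matcher. The real system uses a trained GAN and a commercial fingerprint matcher; the toy `generate` and `match_score` functions below are stand-ins invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def generate(latent):
    # Stand-in for the GAN generator: maps a latent vector to an "image".
    return np.tanh(latent)

# Toy stand-ins for a gallery of enrolled fingerprints.
TARGETS = rng.normal(size=(50, 8))

def match_score(image):
    # Fraction of enrolled "prints" the image matches (higher = better attack).
    sims = TARGETS @ image / (
        np.linalg.norm(TARGETS, axis=1) * (np.linalg.norm(image) + 1e-9)
    )
    return float(np.mean(sims > 0.5))

# Simple (1+1) evolutionary search over the latent vector: mutate,
# keep the mutant if it matches at least as many targets.
latent = rng.normal(size=8)
best = match_score(generate(latent))
for _ in range(500):
    candidate = latent + 0.3 * rng.normal(size=8)
    score = match_score(generate(candidate))
    if score >= best:
        latent, best = candidate, score
```

The design point is that the generator is never modified during the attack; only its input vector is evolved, which is what makes the search cheap.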

GANs 'teach' an algorithm about a particular subject - in this case fingerprints - by feeding it massive amounts of information.

A GAN consists of two neural networks that learn from looking at raw data.

One generates fake images, while the other compares them against the raw data (fingerprints) and tries to tell which are genuine; the competition pushes both to improve.
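That two-network tug-of-war can be shown with a deliberately tiny example: a one-dimensional generator and a logistic-regression discriminator learning a toy Gaussian instead of fingerprint images. Everything here (the data distribution, the model sizes, the learning rate) is an illustrative assumption, not the researchers' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy "real data": samples from N(4, 1.25), standing in for real prints.
def real_batch(n):
    return rng.normal(4.0, 1.25, size=n)

a, b = 1.0, 0.0   # generator: g(z) = a*z + b, with noise z ~ N(0, 1)
w, c = 0.1, 0.0   # discriminator: d(x) = sigmoid(w*x + c)

lr, batch = 0.05, 64
for step in range(2000):
    # Discriminator update: push d(real) toward 1 and d(fake) toward 0.
    x_real = real_batch(batch)
    z = rng.normal(size=batch)
    x_fake = a * z + b
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    # Gradients of the binary cross-entropy loss w.r.t. w and c.
    w -= lr * (np.mean((d_real - 1) * x_real) + np.mean(d_fake * x_fake))
    c -= lr * (np.mean(d_real - 1) + np.mean(d_fake))

    # Generator update: push d(fake) toward 1, i.e. fool the discriminator.
    z = rng.normal(size=batch)
    x_fake = a * z + b
    d_fake = sigmoid(w * x_fake + c)
    # Chain rule through g(z) = a*z + b.
    a -= lr * np.mean((d_fake - 1) * w * z)
    b -= lr * np.mean((d_fake - 1) * w)

# The generator's samples start centred at 0 and drift toward the
# real data's mean of 4.0 as the two networks compete.
samples = a * rng.normal(size=10000) + b
```

The fingerprint version is the same loop at much larger scale, with convolutional networks in place of the two linear models.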

Fingerprint systems do not generally read the entire fingerprint, but just record whichever part of it touches the scanner first, the Guardian reports.

This means they're easier to fake than complete prints.

The GAN created multiple fake fingerprints that matched real ones closely enough to trick the scanner as well as the human eye.

Researchers found it was able to imitate more than one in five fingerprints, when a biometric system should only falsely match one in a thousand.


'The underlying method is likely to have broad applications in fingerprint security as well as fingerprint synthesis', researchers wrote.

They hope their research will help develop more secure authentication systems in the future.

'Experiments with three different fingerprint matchers and two different datasets show that the method is robust and not dependent on the artefacts of any particular fingerprint matcher or dataset.

'This idea is surprisingly under-explored and could be useful in computational creativity research as well as other security domains', researchers found.