Aug 22, 2016 | By Alec

Security systems that unlock simply by showing your face – it’s just about the coolest thing ever. It makes you feel like Tom Cruise whenever you unlock your smartphone. But as researchers from the University of North Carolina revealed, our obsession with Facebook and Instagram could be our undoing. Through a series of experiments, they found that 3D models – and potentially even 3D printed masks – based on Facebook photos can be used to crack four out of five facial recognition security systems.

That is very disconcerting, as these biometric security systems have often been touted as the new solution for bolstering digital security, and are rapidly becoming a prominent authentication tool. In fact, many banks, including HSBC, Barclays and Citi, are already rolling out or trialling biometric security solutions. The concept itself is also very popular, with one Visa study revealing that more than 68% of customers in seven European countries are open to using biometric technology for payments.

The startling conclusion comes from the paper Virtual U: Defeating Face Liveness Detection by Building Virtual Models from Your Public Photos, by a team of researchers from the University of North Carolina, who presented it at the USENIX Security Symposium earlier this month. You might be one of those people who are very careful about posting photos online, but as it turns out, you have very little control over this issue. Friends and family might post photos of you, and even just three low-quality photos from a few years ago could be enough to produce 3D models capable of tricking your authentication systems.

As a result, the researchers went as far as calling this an “immediate and very serious threat” to security. They further argued that “VR-based spoofing attacks constitute a fundamentally new class of attacks that point to serious weaknesses in camera-based authentication systems: Unless they incorporate other sources of verifiable data, systems relying on color image data and camera motion are prone to attacks via virtual realism.”

Taken from the slides of the USENIX presentation.

This was illustrated at USENIX, where they demonstrated a system that uses digital 3D facial models to successfully unlock facial recognition software. They also underlined just how easy it was. A team of twenty volunteers (most of them security researchers who did not share everything on the web) allowed themselves to be ‘cyber-stalked’. Through Facebook, LinkedIn and similar websites, the researchers were able to find a few photos of everyone. One participant had uploaded only two photos over the previous three years, but even then the web provided enough data.

That data was subsequently used to build 3D models of the faces, with missing areas, shadows and textures filled in by hand. Even facial animations (frowning, smiling and so on) were added. These models are so detailed that the 3D faces move accordingly when the device rotates. “To an observing face authentication system, the depth and motion cues of the display exactly match what would be expected for a human face,” they say.
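As a rough illustration of those motion cues (this is a simplified sketch with made-up landmark coordinates, not the researchers' actual rendering pipeline), rotating a reconstructed 3D model in step with the motion a device reports comes down to a small trigonometric transform:

```python
import math

def rotate_y(point, theta):
    # Rotate a 3D point (x, y, z) about the vertical axis by theta radians
    x, y, z = point
    c, s = math.cos(theta), math.sin(theta)
    return (c * x + s * z, y, -s * x + c * z)

# Hypothetical facial landmarks in head-centered coordinates (illustrative only)
landmarks = {
    "nose_tip": (0.0, 0.0, 1.0),
    "left_eye": (-0.3, 0.3, 0.8),
    "right_eye": (0.3, 0.3, 0.8),
}

# If the phone's sensors report a 10-degree pan, the rendered model is turned
# by the same angle, so its depth and motion cues mimic a real head held still
theta = math.radians(10)
rotated = {name: rotate_y(p, theta) for name, p in landmarks.items()}
for name, (x, y, z) in rotated.items():
    print(f"{name}: ({x:.3f}, {y:.3f}, {z:.3f})")
```

Because every surface point shifts with a geometrically consistent depth, the parallax the authentication camera observes matches what a real, three-dimensional head would produce, which is exactly what a flat photo printout cannot fake.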

In total, five facial authentication systems that are readily available to consumers were tested: Mobius, KeyLemon, True Key, BioID and 1U App, all available via the iTunes Store or the Google Play Store. It turned out that four of the five systems could be hacked, with success rates between 55% and 85%. And when proper headshots were taken of each subject in an indoor environment, all five systems could be breached – underlining just what smartphone cameras can achieve. “Our exploitation of social media photos to perform facial reconstruction underscores the notion that online privacy of one's appearance is tantamount to online privacy of other personal information, such as age and location,” they concluded.

But the problem is that this threat cannot be completely neutralized in the age of social media. The only real solution is for face recognition tools to be beefed up to recognize fraud and keep pace with 3D developments. In particular, the ability to reject synthetic faces with low-resolution textures is absolutely necessary, while features such as projected light patterns, infrared illumination and the detection of minor skin tone fluctuations caused by the pulse could all help. “It is our belief that authentication mechanisms of the future must aggressively anticipate and adapt to the rapid developments in the virtual and online realms,” they argue.
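To make the pulse idea concrete (a hypothetical check sketched here for illustration, not any product's actual implementation), a liveness detector could track the average green-channel brightness of the face region over a few seconds and test whether its dominant frequency falls in the human heart-rate band; a static render or mask shows no such rhythm:

```python
import math

def dominant_frequency(samples, fps):
    # Naive discrete Fourier transform: return the frequency (Hz) of the
    # strongest bin, ignoring the DC component
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        re = sum(c * math.cos(2 * math.pi * k * i / n) for i, c in enumerate(centered))
        im = sum(c * math.sin(2 * math.pi * k * i / n) for i, c in enumerate(centered))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fps / n

def looks_alive(samples, fps):
    # A live face shows a periodic brightness change in the 0.7-3.0 Hz band
    # (42-180 beats per minute); a rendered model or printout should not
    f = dominant_frequency(samples, fps)
    return 0.7 <= f <= 3.0

# Simulated mean green-channel values: a faint 1.2 Hz "pulse" sampled at 30 fps
fps, pulse_hz = 30, 1.2
signal = [0.5 + 0.01 * math.sin(2 * math.pi * pulse_hz * i / fps) for i in range(300)]
print(looks_alive(signal, fps))  # prints True
```

A real detector would also have to survive compression noise and head movement, which is partly why the researchers argue that no single cue is enough on its own.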

While they did not actually test a 3D printed mask, the North Carolina team argued that such a mask should yield the same results, given the data goldmine we are all handing to hackers. There is thus a lot of work ahead for facial recognition software developers. “Even if a system is able to robustly detect a certain type of attack – be it using a paper printout, a 3D-printed mask, or our proposed method – generalizing to all possible attacks will increase the possibility of false rejections and therefore limit the overall usability of the system. The strongest facial authentication systems will need to incorporate non-public imagery of the user that cannot be easily printed or reconstructed, [such as a skin heat map],” they conclude. It seems security systems are definitely not yet up to par with the potential of smartphone technology.

