Facebook’s facial recognition research project, DeepFace (yes really), is now very nearly as accurate as the human brain. DeepFace can look at two photos, and irrespective of lighting or angle, can say with 97.25% accuracy whether the photos contain the same face. Humans can perform the same task with 97.53% accuracy. DeepFace is currently just a research project, but in the future it will likely be used to help with facial recognition on the Facebook website. It would also be irresponsible if we didn’t mention the true power of facial recognition, which Facebook is surely investigating: Tracking your face across the entirety of the web, and in real life, as you move from shop to shop, producing some very lucrative behavioral tracking data indeed.

The DeepFace software, developed by the Facebook AI research group in Menlo Park, California, is underpinned by an advanced deep learning neural network. A neural network, as you may already know, is a piece of software that simulates a (very basic) approximation of how real neurons work. Deep learning is one of many methods of performing machine learning; basically, it looks at a huge body of data (for example, human faces) and tries to develop a high-level abstraction (of a human face) by looking for recurring patterns (cheeks, eyebrows, etc). In this case, DeepFace consists of neurons arranged nine layers deep, plus a learning process that creates 120 million connections (synapses) between those neurons, trained on a corpus of four million photos of faces. (Read more about Facebook’s efforts in deep learning.)
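To make the “layers of neurons” idea concrete, here is a toy sketch in Python/NumPy. The layer sizes, random weights, and ReLU activations are all invented for illustration — this is not DeepFace’s actual architecture — but it shows the basic shape of the thing: data flows through a stack of nine layers, each building on the output of the one before it.

```python
import numpy as np

# Toy forward pass through a deep network (illustrative only — not
# Facebook's real DeepFace architecture). Depth is the point: each
# layer transforms the previous layer's output, so later layers can
# represent higher-level abstractions than earlier ones.
rng = np.random.default_rng(0)

layer_sizes = [256, 128, 128, 64, 64, 32, 32, 16, 8]  # nine layers of neurons
weights = [rng.standard_normal((m, n)) * 0.1          # random "synapses"
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    """Pass an input (e.g. a flattened face image) through every layer."""
    for w in weights:
        x = np.maximum(x @ w, 0.0)  # ReLU non-linearity between layers
    return x

face = rng.standard_normal(256)   # stand-in for pixel data
representation = forward(face)    # compact, high-level descriptor
print(representation.shape)       # (8,)
```

In a real system the weights aren’t random, of course — they’re the product of the training process described above, tuned over millions of example photos.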

Once the learning process is complete, every image that’s fed into the system passes through the synapses in a different way, producing a unique fingerprint at the bottom of the nine layers of neurons. For example, one neuron might simply ask “does the face have a heavy brow?” — if yes, one synapse is followed; if no, another route is taken. This is a very simplistic description of DeepFace and deep learning neural networks, but hopefully you get the idea.
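One common way to turn two such fingerprints into a match/no-match answer — not necessarily what DeepFace does internally — is to measure how similar the two vectors are and apply a threshold. The fingerprints and the threshold value below are made up purely for illustration:

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity of two fingerprint vectors: 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_face(fp_a, fp_b, threshold=0.8):
    """Verification as a threshold test; 0.8 is an arbitrary example value."""
    return cosine_similarity(fp_a, fp_b) >= threshold

fp1 = np.array([0.90, 0.10, 0.40, 0.70])  # fingerprint of photo A
fp2 = np.array([0.85, 0.15, 0.38, 0.72])  # photo B, same person (similar)
fp3 = np.array([0.10, 0.90, 0.20, 0.05])  # photo C, different person

print(same_face(fp1, fp2))  # True
print(same_face(fp1, fp3))  # False
```

Tuning that threshold is exactly the trade-off between missing genuine matches and producing false ones.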

Anyway, the complexities of machine learning aside, the proof of the pudding is very much in the eating: DeepFace, when comparing two different photos of the same person’s face, can verify a match with 97.25% accuracy. Humans, performing the same verification test on the same set of photos, scored only slightly higher at 97.53%. DeepFace isn’t thrown off by varied lighting between the two photos, and photos taken from odd angles are automatically transformed (using a 3D model of an “average” forward-looking face) so that all comparisons are made between standardized, forward-looking photos. As for accuracy — one of the most important factors when discussing the usefulness of a machine learning/computer vision algorithm — the research paper reports that DeepFace is “closing the vast majority of [the] performance gap” between machines and humans.
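DeepFace’s frontalization fits a full 3D model of a face, but the simpler 2D version of the same idea — warping a photo so that detected landmarks (here, the eyes) land on fixed target positions before any comparison happens — can be sketched in a few lines of NumPy. All the coordinates below are invented for illustration:

```python
import numpy as np

def alignment_transform(left_eye, right_eye,
                        target_left=(30.0, 40.0), target_right=(70.0, 40.0)):
    """Build a 2x3 affine matrix that rotates, scales, and shifts a photo
    so the detected eye positions land on fixed target coordinates.
    (DeepFace itself fits a 3D average-face model; this is the simpler
    2D cousin of that alignment step.)"""
    src = np.array(right_eye) - np.array(left_eye)      # eye-to-eye vector
    dst = np.array(target_right) - np.array(target_left)
    scale = np.linalg.norm(dst) / np.linalg.norm(src)
    angle = np.arctan2(dst[1], dst[0]) - np.arctan2(src[1], src[0])
    c, s = scale * np.cos(angle), scale * np.sin(angle)
    R = np.array([[c, -s], [s, c]])                     # rotation + scale
    t = np.array(target_left) - R @ np.array(left_eye)  # translation
    return np.hstack([R, t[:, None]])                   # 2x3 affine matrix

# A tilted face with eyes detected at (35, 50) and (72, 43):
M = alignment_transform((35, 50), (72, 43))
# Applying M to each pixel coordinate "straightens" the face; the left
# eye maps onto its target position (30, 40):
p = M @ np.array([35.0, 50.0, 1.0])
print(np.round(p, 6))
```

Once every photo has been normalized this way, the network never has to learn about head tilt at all — it only ever sees forward-looking faces.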

Facebook tries to impress upon us that verification (matching two images of the same face) isn’t the same as recognition (looking at a new photo and connecting it to the name of an existing user)… but that’s a lie. DeepFace could clearly be used to trawl through every photo on the internet and link it back to your Facebook profile (assuming your profile contains photos of your face, anyway). Facebook.com already has a facial recognition algorithm in place that analyzes your uploaded photos and prompts you with tags if a match is made. I don’t know the accuracy of the current system, but in my experience it only really works with forward-facing photos and can produce a lot of false matches. Assuming the DeepFace team can continue to improve accuracy (and there’s no reason they won’t), Facebook may find itself in possession of some very powerful software indeed. [Research paper: “DeepFace: Closing the Gap to Human-Level Performance in Face Verification”]

What it chooses to do with that software, of course, remains a mystery. It will obviously eventually be used to shore up the existing facial recognition system on Facebook.com, ensuring that every photo of you on the social network is connected to your account (even if it doesn’t carry a visible tag). From there, it’s hard to imagine that Zuckerberg and co. will keep DeepFace purely confined to Facebook.com — there’s too much money to be earned by scanning the rest of the public web for matches. Another possibility would be branching out into real-world face tracking — there are obvious applications in security and CCTV, but also in commercial settings, where tracking someone’s real-world shopping habits could be very lucrative. As we’ve discussed before, Facebook (like Google) becomes exponentially more powerful and valuable (both to you and its shareholders) the more it knows about you.