The discussion of iPhone facial recognition tech has largely been framed as: "Apple really wanted to put TouchID under the screen, but couldn’t pull it off in time." So the common conclusion is that they went with facial recognition as a "Plan B" solution, and that’s it…

I would like to suggest the possibility that this was never about TouchID and that the tech is WAY bigger than just a method for user authentication. I believe it may be four things in one:

Facial recognition.

Precise eye tracking.

Focus/awareness tracking.

And emotion tracking.

If true, this would create entirely new ways to interact with your iPhone. First, let’s talk about authentication…

"But TouchID requires intentional action to unlock…"

Many are concerned that facial recognition is weaker than TouchID, because touch requires an intentional action: the user has to place a finger on the home button. If all someone has to do is hold my phone up to my face to unlock it, that seems like a security risk, right?

I suspect simply looking at your device will trigger a visual prompt saying something like "blink twice to unlock" or "swipe up with your eyes to unlock" (where you look at the bottom of the screen and then the top). For more advanced security we may even see the equivalent of Android’s "pattern unlock", but with eye tracking: every time you glance at your locked phone, a grid of nine dots appears, you trace your custom pattern with your eyes, and the phone unlocks without you ever lifting a finger.
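To make that concrete, here’s a minimal sketch of the pattern-matching half of such a feature. Everything in it is hypothetical: it assumes some upstream eye tracker delivers gaze positions normalized to the screen (no such public iOS API exists at the time of writing), and it simply maps each gaze sample onto a 3×3 grid and checks the trace against a stored pattern.

```swift
import CoreGraphics

// Hypothetical sketch of Android-style "pattern unlock" traced with the eyes.
// The gaze input source is assumed; nothing here is a real iOS eye-tracking API.
struct GazePatternMatcher {
    let secretPattern: [Int]   // dot indices 0...8, e.g. [0, 4, 8] for a diagonal
    var traced: [Int] = []

    // Map a gaze point (normalized 0...1 in both axes) to one of nine dots.
    private func dotIndex(for gaze: CGPoint) -> Int {
        let col = min(2, max(0, Int(gaze.x * 3)))
        let row = min(2, max(0, Int(gaze.y * 3)))
        return row * 3 + col
    }

    // Feed each gaze sample; returns true once the full pattern has been traced.
    mutating func process(gaze: CGPoint) -> Bool {
        let dot = dotIndex(for: gaze)
        if traced.last != dot {
            traced.append(dot)
        }
        // If the trace stops being a prefix of the secret, start over.
        if !secretPattern.starts(with: traced) {
            traced = (dot == secretPattern.first) ? [dot] : []
        }
        return traced == secretPattern
    }
}

// Usage: feed samples from the (hypothetical) eye tracker as they arrive.
var matcher = GazePatternMatcher(secretPattern: [0, 4, 8])
_ = matcher.process(gaze: CGPoint(x: 0.15, y: 0.15))              // dot 0
_ = matcher.process(gaze: CGPoint(x: 0.50, y: 0.50))              // dot 4
let unlocked = matcher.process(gaze: CGPoint(x: 0.85, y: 0.85))   // dot 8 → true
```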

New interactions we can’t yet fully imagine…

Eye tracking, attention tracking, and emotion tracking could create so many new possibilities. These features may be limited to Apple’s own apps in iOS 11, but iOS 12 could open them up to developers… so their apps could detect happiness, sadness, and whether you are actively focused on the interface.
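For a sense of what such a developer-facing API could look like, here’s a minimal sketch built on ARKit’s TrueDepth face tracking, which exposes per-expression "blend shape" coefficients. Reading a "happiness" signal out of the two smile coefficients, and the 0.5 threshold, are my own illustrative guesses rather than anything Apple has documented.

```swift
import ARKit

// A minimal sketch, assuming ARKit face tracking (TrueDepth camera) is available.
// Blend shapes are 0.0–1.0 coefficients for individual facial movements; deriving
// a "smile" from them, and the 0.5 threshold, are illustrative assumptions.
final class EmotionTracker: NSObject, ARSessionDelegate {
    let session = ARSession()
    var onSmile: (Float) -> Void = { _ in }

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let left  = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            let right = face.blendShapes[.mouthSmileRight]?.floatValue ?? 0
            let smile = (left + right) / 2
            if smile > 0.5 { onSmile(smile) }   // "happiness" detected
        }
    }
}
```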

We might see social apps that track happiness or sadness to automate "like" or "sad face" reactions to posts, images, or videos. The result could be invisible to the user, simply feeding algorithms that improve the relevance of your news feed (based on consensus emotional reactions and many other factors). Likewise, YouTube could determine whether a video is actually holding your attention or just playing in the background, and customize future recommendations accordingly. Who knows, perhaps comedy clips will one day be judged on their ability to actually make you smile and laugh.
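The attention half of that idea boils down to a simple measurement: what fraction of playback time were the viewer’s eyes actually on the screen? A sketch, assuming the face tracker can report a per-sample looking-at-screen boolean (that signal is assumed, not a shipping API):

```swift
import Foundation

// Sketch: score how well a video "held attention" as the fraction of playback
// time the viewer's gaze was on screen. The looking-at-screen signal is assumed
// to come from the face/eye tracker; the scoring rule is my own illustration.
struct AttentionMeter {
    private(set) var watched: TimeInterval = 0
    private(set) var attentive: TimeInterval = 0

    mutating func record(lookingAtScreen: Bool, dt: TimeInterval) {
        watched += dt
        if lookingAtScreen { attentive += dt }
    }

    // 0.0 = playing in the background, 1.0 = glued to the screen.
    var attentionScore: Double {
        watched > 0 ? attentive / watched : 0
    }
}
```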

Depending on how precise the eye tracking is, we could see a future Kindle app that automatically turns the page when you reach the bottom. We might also see VR apps combine eye tracking with voice control to navigate virtual worlds. And of course we’ve already seen rumours that notifications might be silenced if your phone detects that you are looking at it.
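The page-turning idea is probably just a dwell rule: if the gaze sits in the bottom zone of the page for long enough, turn the page. Here’s a sketch, again assuming a hypothetical gaze source; the bottom-10% zone and 0.6-second dwell are made-up tuning values.

```swift
import Foundation
import CoreGraphics

// Sketch of "turn the page when the reader reaches the bottom". Gaze input is
// assumed to be normalized (y == 1.0 is the bottom edge of the screen).
final class AutoPageTurner {
    var onPageTurn: () -> Void = {}
    private var dwellStart: Date?
    private var turnedThisDwell = false

    func process(gaze: CGPoint, at time: Date = Date()) {
        guard gaze.y > 0.9 else {               // gaze left the bottom 10% of the page
            dwellStart = nil
            turnedThisDwell = false
            return
        }
        guard !turnedThisDwell else { return }  // one turn per visit to the zone
        if let start = dwellStart {
            if time.timeIntervalSince(start) > 0.6 {
                onPageTurn()
                turnedThisDwell = true
            }
        } else {
            dwellStart = time
        }
    }
}
```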

As with so many previous iOS features, I think most of the potential here will be discovered by creative developers who find new and interesting ways to enhance their apps with this tech.

TL;DR - iPhone facial recognition might not just replace TouchID; it might create entirely new ways to interact with apps.