On September 12th, Apple will hold an event in their all-new Steve Jobs Theater located on the grounds of their all-new spaceship campus to unveil some all-new iPhones. Bloomberg’s Mark Gurman has written a number of articles detailing what to expect (and the latest seems pretty spot-on) but in general, here’s what most analysts think will be unveiled:

Two new iPhones with beefed up processors that mostly look the same as the current iPhone 7 and iPhone 7 Plus. Maybe called the 7s and 7s Plus, or maybe the 8 and 8 Plus. Who knows.

A redesigned high-end iPhone with an edge-to-edge, taller OLED screen, no home button, and new front sensors to enable face unlocking to replace Touch ID. Maybe called Premium, Edition, Pro, X, etc.

Much has been said about the huge updates in iOS 11 (huge for iPad, less so for iPhone) but few articles have really dug into what face unlocking would mean, beyond the obvious that you’ll be able to verify your identity and unlock your phone with your face.

I think this will be the flagship feature of the new iPhone, and will let Apple leapfrog competitors with futuristic face-scanning sensors that will have a gigantic impact on the future of augmented reality.

Face Unlock Technology, Apple Style

Some Android phones have had face unlocking since 2011. Essentially, the phone takes a 2D photo of your face using the front-facing camera and uses software to compare it mathematically against a reference photo the user enrolled earlier. Since it launched, Android face unlocking has been both slow and insecure; even Samsung's flagship S8 can be duped into unlocking by holding a photo up to it, so it's clearly not a trustworthy piece of technology.
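To see why a flat photo defeats this scheme, here's a toy sketch (not any vendor's actual algorithm, and the threshold is invented) of purely 2D matching. A printed photo of a face produces nearly the same pixel grid as the face itself, so any comparison that only ever sees 2D data will accept both:

```python
import math

def embed(image):
    """Toy "embedding": flatten a 2D grayscale image into a vector.
    Real systems use learned features, but any purely 2D pipeline
    shares the weakness shown below."""
    return [px for row in image for px in row]

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def unlocks(enrolled, candidate, threshold=10.0):
    # Invented threshold; real systems tune this against false accepts.
    return distance(embed(enrolled), embed(candidate)) < threshold

# The live face and a flat print of it yield (nearly) the same
# 2D pixel grid, so both pass a purely 2D comparison.
enrolled = [[120, 130], [110, 125]]
live_face = [[121, 129], [111, 124]]
photo_spoof = [[122, 131], [109, 126]]  # a printed photo of the same face

print(unlocks(enrolled, live_face))    # True
print(unlocks(enrolled, photo_spoof))  # True: 2D matching can't tell
```

A depth map changes the game: the print is flat, so its "face" has no nose sticking out of it, and a 3D comparison rejects it immediately.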

And there is absolutely no way that Apple would include face unlock on their new top-end iPhone if it could ever be defeated with a photo of your face instead of the real thing.

For the past few months there’s been a lot of chatter about this technology, but it’s usually buried in a larger article about the iPhone’s other rumored new capabilities.

From a Bloomberg article written in July:

Apple is testing an improved security system that allows users to log in, authenticate payments, and launch secure apps by scanning their face, according to people familiar with the product. This is powered by a new 3-D sensor, added the people, who asked not to be identified discussing technology that’s still in development. The company is also testing eye scanning to augment the system, one of the people said.

A brand new 3D depth sensor that can also track eye movements.

What about the speed?

The sensor’s speed and accuracy are focal points of the feature. It can scan a user’s face and unlock the iPhone within a few hundred milliseconds, the person said. It is designed to work even if the device is laying flat on a table, rather than just close up to the face.

Super fast. As fast as or faster than Touch ID. Again: if it were slower, Apple wouldn’t green-light it.

And from a WSJ article on this new sensor from August:

Depth-sensing technology, generally called “structured light,” sprays thousands of tiny infrared dots across a person’s face or any other target. By reading distortions in this field of dots, the camera gathers superaccurate depth information. Since the phone’s camera can see infrared but humans can’t, such a system could allow the phone to unlock in complete darkness.

Infrared dots that work in low light (or no light at all), scanning the user’s face to generate a super-accurate 3D depth map.
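The geometry behind structured light is ordinary triangulation: a dot projected from a known offset beside the camera shifts in the image by an amount inversely proportional to the depth of the surface it lands on. A minimal sketch with made-up numbers (Apple's actual sensor parameters are unknown):

```python
def depth_from_dot(focal_px, baseline_m, disparity_px):
    """Triangulate the depth of one projected infrared dot.
    focal_px: camera focal length in pixels
    baseline_m: projector-to-camera offset in meters
    disparity_px: how far the dot shifted in the image
    All numbers here are illustrative, not Apple's sensor specs."""
    if disparity_px <= 0:
        raise ValueError("no measurable shift; depth unresolvable")
    return focal_px * baseline_m / disparity_px

# A dot that shifts more is closer; repeat over thousands of dots
# and you get a dense depth map of the face.
near = depth_from_dot(focal_px=1000, baseline_m=0.05, disparity_px=100)
far = depth_from_dot(focal_px=1000, baseline_m=0.05, disparity_px=50)
print(near, far)  # 0.5 1.0 (meters)
```

Because the dots are infrared, the camera can see the pattern in a pitch-black bedroom, which is exactly why this beats a visible-light photo comparison.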

Let’s dig deeper into the eye-tracking bit up above. Is there anything to indicate that Apple is doing something really big there? Yup: earlier this year Apple acquired SensoMotoric Instruments, a company that has been building serious eye-tracking technology for over 20 years.

SensoMotoric Instruments, founded in 1991, has developed a range of eye tracking hardware and software for several fields of use, including virtual and augmented reality, in-car systems, clinical research, cognitive training, linguistics, neuroscience, physical training and biomechanics, and psychology.

More about their eye tracking technology:

The company’s Eye Tracking Glasses, for instance, are capable of recording a person’s natural gaze behavior in real-time and in real world situations with a sampling rate up to 120Hz. […] SensoMotoric has also developed eye-tracking technology for virtual reality headsets such as the Oculus Rift, which can analyze the wearer’s gaze and help to reduce motion sickness, a common side effect of VR. The solution can also allow for a person’s gaze to control menus or aim in a game with their gaze.

Real-time 120Hz tracking of eye movements at such a detailed level that users can control software interfaces merely by glancing at visual targets.
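One common way a gaze stream becomes a control input is dwell selection: if the gaze stays inside a target long enough, treat it as a tap. A rough sketch of that idea at the quoted 120Hz sampling rate (the dwell threshold and coordinates are invented for illustration):

```python
def dwell_select(samples, target, dwell_ms=300, rate_hz=120):
    """Return True once the gaze has stayed inside `target` long enough.
    `samples` are (x, y) gaze points arriving at `rate_hz`;
    `target` is a rectangle (x0, y0, x1, y1). Thresholds are illustrative."""
    needed = round(dwell_ms * rate_hz / 1000)  # consecutive samples required
    x0, y0, x1, y1 = target
    run = 0
    for x, y in samples:
        if x0 <= x <= x1 and y0 <= y <= y1:
            run += 1
            if run >= needed:
                return True
        else:
            run = 0
    return False

button = (100, 100, 200, 150)
glance = [(150, 120)] * 20   # ~166 ms on target: too brief, ignored
stare = [(150, 120)] * 40    # ~333 ms on target: counts as a selection
print(dwell_select(glance, button), dwell_select(stare, button))  # False True
```

The same primitive works for aiming in a game or dismissing an interface element, which lines up with what SensoMotoric built for VR headsets.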

But wait! At WWDC this summer, Apple introduced a number of APIs that give developers more advanced facial recognition abilities:

Vision Framework allows you to detect face rectangles and face landmarks (face contour, median line, eyes, brows, nose, lips, pupil positions)
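To illustrate what landmark data like this enables (a hypothetical sketch, not the Vision API itself), a handful of landmark points can be boiled down to scale-invariant ratios that stay stable as the face moves closer to or farther from the camera:

```python
import math

def dist(p, q):
    return math.sqrt((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)

def face_signature(landmarks):
    """Reduce a few landmark points to scale-invariant ratios.
    `landmarks` maps invented names (echoing the kinds of points the
    framework reports: pupils, nose, lips) to (x, y) positions."""
    eye_span = dist(landmarks["left_pupil"], landmarks["right_pupil"])
    return (
        dist(landmarks["nose"], landmarks["lips"]) / eye_span,
        dist(landmarks["left_pupil"], landmarks["nose"]) / eye_span,
    )

a = {"left_pupil": (0, 0), "right_pupil": (4, 0),
     "nose": (2, 3), "lips": (2, 5)}
# The same face seen at half the distance: every point doubles.
b = {k: (2 * x, 2 * y) for k, (x, y) in a.items()}

print(face_signature(a) == face_signature(b))  # True: scale cancels out
```

Real recognition pipelines are far more sophisticated, but the point stands: once software can name the parts of a face, comparing faces becomes geometry.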

Beyond this announced API, it was also uncovered that the new iPhone will know if you’re looking at it and will suppress notification sounds.
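The logic behind that behavior is presumably simple; here's a speculative sketch of the decision (the function and its inputs are invented, not an Apple API):

```python
def alert_style(face_present, gaze_on_screen):
    """Speculative sketch of attention-aware alerts: if the phone
    can already see you looking at it, a sound adds nothing, so
    deliver the notification silently."""
    if face_present and gaze_on_screen:
        return "silent banner"
    return "sound + banner"

print(alert_style(True, True))   # silent banner
print(alert_style(True, False))  # sound + banner
```

What makes this notable isn't the branch itself, it's that the phone can supply the inputs: reliable, always-on knowledge of whether you're looking at the screen.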