Later, on the phone, Souza said, “I think they’ll use it and not really understand it.” But he added that consumers will understand the results and see how “when they go in one direction, everything other than what’s in focus gets less in focus and the other way things get more in focus.”

“I love this decision by the team to honor art of photography and the work that went into characterizing how great lenses work,” said Apple senior vice president of worldwide marketing Phil Schiller when I asked about the decision to include f-stop numbers in the depth editor interface.

Schiller, along with Graham Townsend, Apple’s senior director of camera hardware, and Sebastien Marineau-Mes, Apple’s vice president of software, sat down late on the afternoon of iPhone XS launch day to peel away the veil of secrecy surrounding at least one part of Apple’s iPhone technology matrix: how they design and develop their photo and video capture hardware and software.

The numbers consumers will see on these phones and through the photo editing app are not just an old-school nod to how f-stops and aperture control work on DSLR cameras. Schiller told me Apple engineered an exact model of how a physical lens at those aperture numbers would behave.

Watch the blur.

In a physical camera, a higher-number f-stop represents a smaller aperture opening and a deeper depth of field; a lower number means a wider aperture and a shallower one. So a setting of f/1.4 would hold the front of a subject’s face in focus while the background goes fuzzy, whereas a setting of f/16 puts almost everything, front and back, in focus.
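The relationship follows from thin-lens optics: stopping down increases the hyperfocal distance term and widens the in-focus zone around the subject. A minimal sketch of the standard depth-of-field formulas (illustrative only; the focal length, circle-of-confusion, and subject-distance values are assumptions, not anything Apple published):

```python
def depth_of_field(n_stop, focal_mm=50.0, coc_mm=0.03, subject_mm=2000.0):
    """Thin-lens depth-of-field estimate; returns (near, far) limits in mm."""
    # Hyperfocal distance: focusing here makes everything to infinity acceptably sharp.
    h = focal_mm ** 2 / (n_stop * coc_mm) + focal_mm
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    far = subject_mm * (h - focal_mm) / (h - subject_mm)  # valid while subject < hyperfocal
    return near, far

for n in (1.4, 16):
    near, far = depth_of_field(n)
    print(f"f/{n}: sharp from {near / 1000:.2f} m to {far / 1000:.2f} m")
```

With a 50mm lens focused at 2 m, this gives only about 13 cm of sharpness at f/1.4 but roughly 1.7 m at f/16, which is exactly the blurry-background versus everything-in-focus contrast described above.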

The first expression of this kind of photography on smartphones appeared in 2016 with the iPhone 7 Plus and its portrait mode, which used the two images grabbed by its dual-lens system (and some algorithmic magic) to create a background-blur, or bokeh, effect. This on its own was a radical step for amateur iPhone photographers, transforming mundane portraits into studio-quality images. Even so, it — like virtually all other smartphone-based portrait-mode photography that followed — was a two-plane version of the depth effect: the images held the foreground object in focus and blurred the back plane. Samsung was the first to introduce adjustable blur that could be applied during photography and in post-processing, but Samsung’s Live Focus still sees the image as two planes.

Like a lens

What’s clear from using the new iPhone XS and XS Max is that the depth slider works with almost unlimited planes between the foreground and background. Apple calls this “lens modeling.”

“We turned the model of a lens into math and apply it to that image,” explained Marineau-Mes. “No one else is doing what this does. Others just do ‘blur background.’” And the post-processing works equally well whether you’re taking a selfie with the iPhone XS single, 7-megapixel front-facing camera, a portrait with the dual-lens system on the iPhone XS or XS Max, or a photo with the single 12-megapixel rear camera on the iPhone XR.
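Apple hasn’t published the math behind its lens model, but the core idea — treating depth as continuous rather than as two planes — can be sketched with standard optics. A point at any depth maps to a circle of confusion whose size depends on the aperture and how far that depth sits from the focus plane, so every depth plane gets its own blur amount. The parameters below (focal length, pixel scale, depths) are illustrative assumptions, not Apple’s values:

```python
import numpy as np

def coc_radius_px(depth_mm, focus_mm, focal_mm=4.25, f_number=1.8,
                  px_per_mm=200.0):
    """Per-depth circle-of-confusion radius, in pixels.

    Thin-lens geometry: with the lens focused at focus_mm, a point at
    depth_mm images to a disc whose size scales with the aperture
    diameter (focal / f_number) and the relative depth mismatch.
    """
    aperture = focal_mm / f_number
    coc_mm = (aperture * np.abs(depth_mm - focus_mm) / depth_mm
              * focal_mm / (focus_mm - focal_mm))
    return coc_mm * px_per_mm

# Blur radius at several depths, focused at 1 m, for two simulated f-stops.
depths = np.array([500.0, 1000.0, 2000.0, 4000.0])  # mm
for n in (1.8, 8.0):
    print(f"f/{n}:", np.round(coc_radius_px(depths, focus_mm=1000.0,
                                            f_number=n), 2))
```

A renderer built on this would blur each pixel by the radius its depth dictates, which is why the result varies smoothly across the scene instead of snapping between a sharp foreground and a uniformly blurred background.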

Put simply, Apple is employing three distinct depth-information-capturing technologies to drive the same depth editing result. Townsend described it to me as using three different sources of information: the dot-based depth sensor in the TrueDepth module, the dual-lens stereo imagery of the 12-megapixel cameras on the back of both the XS and XS Max, and an almost entirely algorithmic solution on the XR.