Last month, we took a look at what is new in the iPhone 11 and 11 Pro’s camera hardware. You might have noticed something from Apple’s iPhone announcement event and our blog post: the hardware changes seem fairly modest, with most of the attention directed at this generation’s software-based processing.

It’s true: the great advances in camera quality for these new iPhones are mostly due to advanced (and improved) software processing.

I’ve taken some time to analyze the iPhone 11’s new image capture pipeline, and it looks like one of the greatest changes in iPhone cameras yet.

What is a photo?

That sounds like we’re off to a rather philosophical start, or delivering the punchline of an iPad photography commercial, but to highlight what makes the iPhone 11 camera unique, we have to understand our expectations of photography.

For a while now, you haven’t been the one taking your photos. That’s no slight against you, dear reader: to reduce perceived latency, when your finger touches the shutter button, the iPhone grabs a photo it had already taken before you even touched the screen.

This is done by starting a sort of rolling buffer of shots as soon as you open the Camera app. Once you tap the shutter, the iPhone picks the sharpest shot from that buffer. It saves a shot that you, unaware of this skulduggery, assume you just took. Nope. You merely provided a hint, to help the camera pick from the many shots it had already taken on its own.
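To make the idea concrete, here is a minimal sketch of such a rolling-buffer capture loop in Python. Everything here is an assumption for illustration: the class and method names, the buffer depth, and the `sharpness` score (standing in for whatever focus or contrast metric Apple actually uses) are all hypothetical; Apple’s real pipeline is not public.

```python
from collections import deque

class ZeroShutterLagCamera:
    """Sketch of a zero-shutter-lag pipeline: frames are buffered
    continuously, and a shutter tap selects the sharpest recent frame."""

    def __init__(self, buffer_size=8):
        # Ring buffer: once full, old frames fall off as new ones arrive.
        # The depth of 8 is a made-up value, not Apple's.
        self.buffer = deque(maxlen=buffer_size)

    def on_new_frame(self, frame, sharpness):
        # Called for every frame while the camera app is open,
        # before the user has touched anything.
        self.buffer.append((frame, sharpness))

    def on_shutter_tap(self):
        # The tap is only a hint: return the sharpest frame the
        # camera has already captured, not a brand-new exposure.
        if not self.buffer:
            return None
        frame, _ = max(self.buffer, key=lambda item: item[1])
        return frame
```

The `deque(maxlen=...)` does the bookkeeping for free: appending to a full buffer silently discards the oldest frame, which is exactly the behavior a pre-shutter rolling buffer needs.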

We can argue this is still deliberate photography. Without your action, there would be no photo.