Many of us iPhone photographers have watched other phones like the Pixel and the Huawei P30 pass us by in low light. It feels so good to see the iPhone 11 Pro finally catch up in low-light performance.

In the audience image above, notice the preservation of skin tone warmth and color, even in the super dim blue light being cast from the stage. I was really impressed by Night mode’s overall preservation of color — in some scenarios, the iPhone 11 Pro pulled out colors in low-light scenes I couldn’t even see with my naked eye.

This is easily the most dramatic leap forward we’ve seen since the introduction of panorama mode on the iPhone 5 in 2012. It’s the first time in a long time I’ve looked at an image and said to myself, “wow, I can’t believe I shot this with my phone.”

How Night Mode Works on iPhone 11 Pro

(skip if you don’t care about how geeky things work)

I’ve been caught off guard by the ability to handhold multi-second Night mode shots and maintain sharpness even while in a moving car on a bumpy road or shooting Huang Gaohui on a rocking boat.

If you are a pro familiar with shooting long exposures, you'll immediately realize something is fundamentally different about how the iPhone 11 Pro collects light in Night mode.

From what I understand, Night mode works by capturing a bunch of short exposures and slightly longer exposures, checking them for sharpness, throwing out the blurry ones, and blending the good ones. On a traditional DSLR or mirrorless camera, a 5-second exposure is one single, continuous recording of the light for the duration of the shutter, so any movement (of subject or camera) is recorded.

But with the iPhone 11 Pro the rules are different: it’s not capturing one single continuous frame but blending a whole bunch of shots of variable lengths (shorter exposures to freeze motion, longer ones to expose the shadows). This means the subject can actually move during your exposure and still remain sharp.
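To make the idea concrete, here is a toy sketch of that capture-score-discard-blend loop. This is my own illustration, not Apple’s actual pipeline: the sharpness metric (sum of horizontal gradients), the `keep_ratio` parameter, and the pixel-averaging blend are all simplified stand-ins for whatever Apple really does on-device.

```python
# Hypothetical sketch of multi-frame blending (NOT Apple's actual code):
# score each frame for sharpness, drop the blurriest, average the keepers.

def sharpness(frame):
    """Rough sharpness score: sum of absolute horizontal gradients.
    Blurry frames have softer edges, so their gradients are smaller."""
    return sum(
        abs(row[x + 1] - row[x])
        for row in frame
        for x in range(len(row) - 1)
    )

def blend_night_mode(frames, keep_ratio=0.5):
    """Keep the sharpest `keep_ratio` fraction of frames, then average
    the survivors pixel by pixel. Frames are 2D lists of brightness."""
    ranked = sorted(frames, key=sharpness, reverse=True)
    keep = ranked[: max(1, int(len(ranked) * keep_ratio))]
    height, width = len(keep[0]), len(keep[0][0])
    return [
        [sum(f[y][x] for f in keep) / len(keep) for x in range(width)]
        for y in range(height)
    ]

# Toy example: three one-row "frames"; the middle one is motion-blurred,
# so it scores lower and gets discarded before blending.
sharp_a = [[0, 0, 100, 100, 0, 0]]
blurry  = [[20, 40, 60, 60, 40, 20]]
sharp_b = [[0, 0, 100, 100, 0, 0]]

result = blend_night_mode([sharp_a, blurry, sharp_b], keep_ratio=0.67)
```

Because the blurry frame never makes it into the average, the blended result stays crisp even though one of the captures moved.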

I’m sure some of you are wondering, “well, this is cool for handholding, but what if you want to do light trails?” The iPhone actually detects when it is on a tripod and changes its exposure method so that light trails and movement can still be captured.
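That behavior switch could look something like the sketch below. Again, this is purely my illustration of the concept, not Apple’s implementation: the gyroscope-variance test, the `still_threshold` value, and the frame lengths are all made-up stand-ins for whatever heuristics the phone actually uses.

```python
# Hypothetical sketch (NOT Apple's implementation): use motion-sensor
# variance to choose between a handheld strategy (many short frames to
# keep each one sharp) and a tripod strategy (fewer, longer frames so
# moving lights smear into trails).

def motion_variance(gyro_samples):
    """Variance of gyroscope readings; near zero when the phone is still."""
    mean = sum(gyro_samples) / len(gyro_samples)
    return sum((s - mean) ** 2 for s in gyro_samples) / len(gyro_samples)

def exposure_plan(gyro_samples, total_seconds=5.0, still_threshold=1e-4):
    """Return a list of per-frame exposure lengths for the capture."""
    if motion_variance(gyro_samples) < still_threshold:
        frame_length = 1.0   # tripod: long frames preserve light trails
    else:
        frame_length = 0.1   # handheld: short frames freeze shake
    count = int(total_seconds / frame_length)
    return [frame_length] * count

handheld = exposure_plan([0.01, -0.02, 0.03, -0.01])  # shaky readings
on_tripod = exposure_plan([0.0, 0.0, 0.0, 0.0])       # perfectly still
```

The key point is that the total light-gathering time is the same either way; only how it is sliced up changes, which is why trails survive on a tripod but handheld shots stay sharp.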

It took me a good bit of testing and questioning to figure out what is actually going on, and it is yet another place where the computational side of photography really shines, leveraging powerful software, instead of a big lens with big glass, to capture more light.