Apple will announce new iPhones tomorrow, September 10th, and this year’s event will have all of the usual trappings of an iPhone introduction, I’m sure. Expect a teaser video that shows you up-close shots of different parts of the phone before revealing the whole thing. Expect lots of talk about how powerful the processor is. Expect dunks on how Android never seems to be updated. Expect some kind of whizbang AR demo.

Most of all, expect a lot of talk about the camera. Because the most notable expected change to the basic iPhone X-series shape of the phones is that it will have a big, square camera module with extra lenses — and everybody is already focused (pardon the pun) on it.

Chaim Gartenberg has already written up all the news on what to expect tomorrow. So unless there is some sort of surprise, it feels like the only things we don’t know yet are the quality and capabilities of the iPhone 11 Pro’s camera system.

Smartphones — especially iPhones — have been big, fast, beautiful, and well-made for years now. So in some ways, the camera is the only thing we can definitely expect to improve year over year, at least in a way that’s genuinely important to users.

We all hope for more battery life, too, but the chemistry of batteries seems to be a harder challenge than the physics of light. At least with light, you can apply computation to improve the image.

It might be a little boring to see such relentless focus on the camera, but that’s where Apple’s biggest opportunity for improvement still lies. Maybe it’s too much to hope for, but this year, I would like to see Apple do more than just improve the camera a little bit. I’d like to see something like a generational change, a step up over what exists on any phone today.

The rumors point to a three-lens system on the “pro”-level iPhones: a regular lens, a telephoto, and a wide-angle. This system has been de rigueur for Android phones for roughly the past year. Android phones have also leapt ahead of Apple when it comes to low-light photography and computational photography.

In both low-light and computational photography, it’s been Google’s Pixel that’s led the industry, and both cases are examples of thinking more broadly about what a camera sensor is. Google sees that sensor as a source of information more than a source of light, and it has been more aggressive at finding creative ways to use algorithms to manipulate that information into something pleasing.

Apple is expected to move a little in that direction. Rumors suggest it might use the information from the wide-angle lens to improve (and, in some cases, completely save) photos from the regular lens. It’s also widely assumed that Apple will close the gap with the Pixel by introducing some sort of night mode.

All of that is fine. In fact, if Apple doesn’t at least match what’s widely available on Android, it will be a disappointment. But I am hoping for something more.

I was recently speaking to a photographer who just upgraded from the iPhone 5S to the iPhone XS. I said something about how she must be so happy with the huge leap the cameras have made in those generations. She looked at me with bewilderment and replied, “It’s the same!”

Of course, she knows it’s not. But in a very real sense, she’s not wrong. The very best smartphones take photos that are so good that many people can’t tell they weren’t taken with a decent point-and-shoot camera. But blow them up on a big screen or really zoom in on the pixels or check how they handle difficult lighting situations, and you can almost always tell. You can certainly tell if you ever print them out.

The best smartphone photograph still looks like a smartphone photograph. That’s not to denigrate these photos. Some of them are amazing, worthy of a billboard or an art gallery, but they are amazing smartphone photos.

My colleague Nilay Patel and I have been preaching the same sermon for a few years now: if all you really want are the best photos possible, instead of spending your thousand bucks on a smartphone that takes 10 percent better photos, spend it on a really good compact camera.

It’s another thing to carry around, sure, but it might be worth it. Once you start seeing the difference between smartphone photos and camera photos, it’s difficult to unsee it.

I’d like Apple to make a camera system that shuts that argument down — or at least makes it harder to defend.

Apple has been weaning us off its traditional two-year cadence. It used to be that we’d get a big new redesign one year, then an S model the second year, and then the year after that, we’d repeat the cycle. No more. And you could argue that it hasn’t been that way since the iPhone 7.

This year, we’ll still get phones that clearly belong in the lineage of the iPhone X and iPhone XR — and that’s fine. But if Apple wants us to think of these models as more than just another S-iteration of the same old thing, it needs to justify those big, square camera modules.

I’m hopeful the camera will be a big step change, though not so hopeful that I’m demanding or even expecting one. The physics of light is hard, and squeezing more out of the sensor with the clever use of algorithms isn’t much easier.

I’m sure I’ll be wowed by whatever these cameras can do. But to convince me to upgrade from my iPhone XR, they need to make me regret buying a Sony RX100 standalone camera, at least a little.