Not too long ago, tech giants like Apple and Samsung raved about the number of megapixels they were cramming into smartphone cameras to make photos look sharper. Nowadays, handset makers are shifting their focus to the algorithms, artificial intelligence and special sensors that work together to make our photos look more impressive.

What that means: Our phones are working hard to make photos look good, with minimal effort required from the user.

On Tuesday, Google showed its latest attempt to make cameras smarter. It unveiled the Pixel 4 and Pixel 4 XL, new versions of its popular smartphone in two screen sizes. While the devices include new hardware features — like an extra camera lens and an infrared face scanner to unlock the phone — Google emphasized the phones’ use of so-called computational photography, which automatically processes images to look more professional.

Among the Pixel 4’s new features is a mode for shooting the night sky and capturing images of stars. And with the extra lens, Google augmented a software feature called Super Res Zoom, which lets users zoom in more closely on images without losing much detail.
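One core idea behind computational photography — and night-sky modes in particular — is combining many short, noisy exposures into a single cleaner image: random sensor noise averages away while the real scene does not. The sketch below is a toy illustration of that principle, not Google's actual pipeline; the scene values, noise level, and frame count are all made up for the example.

```python
import random

def stack_frames(frames):
    """Average pixel values across aligned frames.

    Random noise tends to cancel out across exposures, while the true
    signal is preserved — a simplified stand-in for multi-frame
    stacking, not the Pixel's real algorithm.
    """
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

random.seed(0)

# A dim "true" scene (hypothetical pixel brightness values).
true_scene = [10, 50, 200, 120]

# Simulate 64 short exposures, each corrupted by Gaussian sensor noise.
frames = [[p + random.gauss(0, 20) for p in true_scene] for _ in range(64)]

stacked = stack_frames(frames)

# Compare total error of one noisy frame vs. the stacked result.
single_error = sum(abs(a - b) for a, b in zip(frames[0], true_scene))
stacked_error = sum(abs(a - b) for a, b in zip(stacked, true_scene))
print(stacked_error < single_error)
```

Averaging N frames cuts the noise's standard deviation by roughly a factor of the square root of N, which is why stacking dozens of exposures can pull faint stars out of an image that would otherwise be dominated by noise.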