Other than Sony (which owns a movie studio), smartphones and movies don’t have much of an overlap. At least that was the case until the studios made a major push to popularize 3D movies, which led phone makers to attempt to bring 3D to your pocket.

In 2011 both LG and HTC unveiled 3D phones – the Optimus 3D and EVO 3D (Sharp made one too). Not only did they have screens with autostereoscopic 3D (i.e. you don’t need special glasses like you do in the cinema), they also had 3D cameras to shoot your own 3D photos and videos.

That was the first time a smartphone had a dual camera setup. And it was a fad that died off quickly.

It took until 2014 for the next dual camera setup to appear and it was HTC again with the One (M8). The second camera sensor added depth to photos not with 3D but with bokeh – the shallow depth of field previously achieved only with the large sensors and fast lenses of DSLR cameras.

This depth effect camera is one of the more popular flavors even to this day. It’s fairly easy to implement and it doesn’t require a high-res sensor – entry-level phones make do with as little as 2 MP. But as Google showed, you can do bokeh with a single camera (Samsung has an implementation too, though we think the Pixels are more successful).
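The trick behind these depth-effect cameras can be sketched in a few lines: the second camera’s only job is to help build a depth map, and the blur is then applied in software to everything outside the focal plane. Here is a minimal illustration – the `fake_bokeh` name and the ready-made depth map are our assumptions; a real phone derives the depth map from the disparity between its two cameras:

```python
import numpy as np

def fake_bokeh(image, depth, focus_depth, tolerance=0.1):
    """Keep pixels near focus_depth sharp and blur everything else.

    image: (H, W) grayscale array; depth: (H, W) relative depth in [0, 1].
    The depth map is assumed given - on a phone it comes from stereo
    disparity between the two cameras.
    """
    # Crude 3x3 box blur built from shifted copies (np.roll wraps at the
    # edges, which is good enough for a sketch).
    blurred = sum(np.roll(np.roll(image, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    in_focus = np.abs(depth - focus_depth) <= tolerance
    return np.where(in_focus, image, blurred)
```

Real implementations use a much nicer lens-shaped blur kernel and a graded (not binary) focus mask, but the pipeline – depth map in, selective blur out – is the same.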

Things changed in 2016 with the LG G5, which picked a wide-angle lens for its secondary camera. Those are quite rare even though we think they are among the more useful dual camera setups. You could use panorama mode, but that’s for photos only and even then it doesn’t handle moving objects very well.

Later that same year makers went the other way with telephoto lenses, notably Apple with the iPhone 7 Plus. If Apple does it, you can be sure others will jump on board as well. “Portrait mode” in particular caught on quickly and it uses both cameras on the phone.

Huawei and Leica had a different take and added a monochrome camera to the Huawei P9. Basically all digital camera sensors are monochrome – their pixels are color blind. To get around that, most cameras put a Bayer color filter over the sensor and reconstruct full color in software. That reduces light sensitivity and hurts image quality a bit, since the color info has to be interpolated.
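To make that interpolation step concrete, here is a toy demosaic in Python. The `demosaic_nearest` name is our own and real image processors use much smarter, edge-aware interpolation, but the basic idea – spreading each color sample over neighboring pixels that never measured that color – is the same:

```python
import numpy as np

def demosaic_nearest(raw):
    """Nearest-neighbor demosaic of an RGGB Bayer mosaic.

    raw: (H, W) array with H and W even. Layout per 2x2 tile:
        R G
        G B
    Each output pixel borrows the nearest sample of each color channel -
    this borrowing is exactly where detail is lost versus a true
    monochrome sensor, which measures every pixel directly.
    """
    h, w = raw.shape
    rgb = np.empty((h, w, 3), raw.dtype)
    r = raw[0::2, 0::2]   # red samples
    g = raw[0::2, 1::2]   # one of the two green samples per tile
    b = raw[1::2, 1::2]   # blue samples
    # np.repeat spreads each sample over its whole 2x2 tile.
    rgb[..., 0] = np.repeat(np.repeat(r, 2, 0), 2, 1)
    rgb[..., 1] = np.repeat(np.repeat(g, 2, 0), 2, 1)
    rgb[..., 2] = np.repeat(np.repeat(b, 2, 0), 2, 1)
    return rgb
```

A monochrome camera like the P9’s secondary one skips all of this – every pixel records real luminance, which is why it captures more light and finer detail.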

This year Huawei decided that two cameras just won’t do and launched the P20 Pro, the first phone with a triple camera. It has a regular cam, a telephoto cam and a monochrome cam. We touched on Bayer filters and the demosaicing process here.

We think that with improving image sensor technology and ever-brighter apertures, the monochrome secondary camera doesn’t have much of a future. Even so, the bokeh camera will likely be the first to go away. Google could certainly speed things up by opening up its bokeh algorithms.

That leaves zoom, either telephoto or wide-angle. Those are nearly impossible to replicate with clever sensors and software trickery, though Nokia’s PureView phones did show good-quality zoom with just one sensor (a big one).
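The PureView approach boils down to cropping: with far more native pixels than the output needs, “zooming” just means reading a smaller central window of the sensor, so no detail has to be invented by upscaling. A rough sketch of the idea – the `lossless_zoom` helper and its limit check are our simplification:

```python
import numpy as np

def lossless_zoom(sensor, zoom, out_shape):
    """Center-crop an oversampled sensor frame.

    This stays "lossless" only while the crop still covers at least
    out_shape native pixels; past that point you would have to upscale
    and invent detail, which is where single-sensor zoom runs out.
    """
    h, w = sensor.shape
    ch, cw = round(h / zoom), round(w / zoom)
    if ch < out_shape[0] or cw < out_shape[1]:
        raise ValueError("zoom beyond the lossless limit for this sensor")
    top, left = (h - ch) // 2, (w - cw) // 2
    return sensor[top:top + ch, left:left + cw]
```

At zoom = 1 the full frame is downsampled to the output size (oversampling, omitted here); the maximum lossless zoom factor is simply sensor resolution divided by output resolution, which is why only a big sensor like the Lumia 1020’s made this worthwhile.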

We think that in some occasions progress is being impeded by marketing hype. OnePlus, for example, has produced a second phone with a dual camera that matches the field of view of the main camera (so no zoom). And it’s not a monochrome sensor either. For all the talk of portraits and low-light shooting, we don’t think it achieves much that can’t be done with a single camera.

As you can probably tell, we think that zoom cameras are the ones that will win out in the end. Huawei managed to fit a 1/1.7” image sensor in a 7.8 mm thick phone, and the Nokia Lumia 1020 packed an even larger 1/1.5” sensor (in a much chunkier body). So maybe an awesome single-sensor camera is still possible, but two is still better than one.