At an event in London yesterday, LG unveiled the G3, its latest flagship. The handset is powered by a Snapdragon 801 system-on-a-chip and sports a 5.5-inch Quad HD (2,560 x 1,440) IPS LCD display, making it one of the most pixel-dense displays ever put on a smartphone. Besides the display, however, there’s another little piece of tech that probably no other smartphone out there boasts: the new Laser Autofocus system LG has incorporated to power the G3's 13MP OIS+ camera. Yeah, I know “laser” sounds cool and all, but don’t fall for the puffery: the tech, also known as Active Autofocus, isn’t new. Not even in the slightest. It’s just new on a smartphone. LG claims that with the G3’s Laser Autofocus system, the camera takes less than a third of a second to focus. Whether that claim holds water remains to be seen, but I thought I’d explain the tech anyway, as it’s pretty straightforward and I hate to see people swayed by such “bullet-point” marketing ploys.

Passive Autofocus: How Traditional Smartphone Autofocus Works

Traditionally, smartphone cameras feature what is called Passive Autofocus. It’s called Passive because this technique doesn’t require any extra hardware to do its magic. It works by algorithmically analysing the image you see in the viewfinder (in this case, your smartphone’s display): the software examines the image’s contrast patterns to determine whether it is in focus. Based on that analysis, the lens system is moved either forward or backward (with the help of minuscule motors or, more recently, microelectromechanical tech). The image is then analysed again, the lens system moved accordingly, and this goes on until the software is satisfied that the image is completely in focus.

But what’s the logic behind the technique? Here’s a really simple example. Maybe I’ve oversimplified it a bit, but bear with me. Have you seen one of those images that show street-lights out of focus? They’re a very popular kind of wallpaper, so I’m pretty sure you have. In such an image, the sharp, colourful light from the street-lights is brought out of focus, which thoroughly dilutes its intensity and turns what would otherwise be sharp, bright, colourful points into large blur-circles with faded colour. Take any point in this out-of-focus image: there’s little to zero immediate contrast (i.e. little to zero change between that point and its immediate neighbour). Even the overall contrast shows a very, very gradual gradient. This is the calling card of an out-of-focus image. Now imagine the same image in complete focus. Whilst analysing contrast patterns in the in-focus image, the software will eventually hit one of the aforementioned bright, colourful points (exactly what you’d expect from a faraway point-like source of light that is in focus), and there’d be a massive peak in contrast. That peak means the image is in focus. Passive Autofocus analyses the image like this again and again, moving the lens system each time based on what it finds, until it gets a contrast pattern like this.
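The idea above can be sketched in a few lines of code. This is a toy simulation, not anything LG actually ships: a “scene” is a 1-D row of pixel intensities, defocus is faked with a box blur that widens the further the lens is from some (hypothetical) ideal position, and the contrast metric is the sum of squared neighbour differences, which peaks when the image is sharpest.

```python
# Toy contrast-detection autofocus (a sketch; all names are hypothetical).

def box_blur(pixels, radius):
    """Average each pixel with its neighbours to simulate defocus blur."""
    if radius == 0:
        return list(pixels)
    out = []
    for i in range(len(pixels)):
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

def contrast(pixels):
    """Focus metric: sum of squared differences between neighbouring pixels."""
    return sum((b - a) ** 2 for a, b in zip(pixels, pixels[1:]))

def autofocus(scene, ideal_position, lens_range=range(0, 11)):
    """Try each lens position (blur grows with distance from the ideal one)
    and return the position whose image scores the highest contrast."""
    best_pos, best_score = None, -1.0
    for pos in lens_range:
        score = contrast(box_blur(scene, abs(pos - ideal_position)))
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos

# A dark scene with one bright point: a faraway street-light.
scene = [0.0] * 10 + [1.0] + [0.0] * 10
print(autofocus(scene, ideal_position=4))  # prints 4
```

A real system doesn’t brute-force every lens position like this sweep does; it hill-climbs, nudging the lens and re-measuring until the contrast score stops improving, which is exactly where the latency discussed below comes from.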

This kind of trial-and-error technique means there’s some amount of latency you have to put up with. Not only that: the method is strictly algorithmic and can fail miserably in extremely low-light situations, as there’s little to no contrast in such conditions for it to do its magic (although this can be remedied with the inclusion of an additional “AF” lamp).

One more thing I’d like to clarify before we move on to Active Autofocus: analysing contrast patterns is just one of the two popular Passive Autofocus techniques. The other is called Phase-Detection Autofocus, which is found in Samsung’s Galaxy S5, though there it’s just one part of a hybrid autofocus system, the other part being the regular contrast-based autofocus we discussed above.

Active Autofocus: How Laser Autofocus Works

Laser Autofocus is LG’s particular implementation of what is called Active Autofocus. Active Autofocus works in a far more straightforward way than Passive Autofocus. It’s called Active because it works independently of the optical system and hence requires extra hardware: an emitter and a receiver.

To put it as simply as possible, Active Autofocus works in precisely the same way as SONAR/RADAR. A beam of ultrasonic or electromagnetic waves is shot at the subject by the emitter, and the receiver waits for the reflected beam to arrive. The time between emission and reception is measured, and since the velocity of ultrasonic/electromagnetic waves in the given medium (i.e. air) is already known, the distance between the camera system and the subject can easily be calculated from the echo formula (v = 2d/t, i.e. d = vt/2). Knowing the subject’s distance, the system moves the lens accordingly, and with that, you’re done. Pretty neat, innit? Better still, this kind of autofocus works just as well in low-light situations, quite unlike contrast-based Passive Autofocus systems.
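The arithmetic here is a one-liner, and it’s worth seeing the numbers involved. A rough sketch, assuming a light-based (infrared) beam; the function name and the example timing are mine, not LG’s:

```python
# Toy time-of-flight distance calculation (a sketch, not LG's implementation).

SPEED_OF_LIGHT_AIR = 299_702_547.0  # metres per second, approx. for air

def subject_distance(round_trip_seconds, wave_speed=SPEED_OF_LIGHT_AIR):
    """The beam covers the distance twice (there and back) in time t,
    so from v = 2d/t the subject distance is d = v*t/2."""
    return wave_speed * round_trip_seconds / 2.0

# A pulse that returns after roughly 6.67 nanoseconds has hit a subject
# about one metre away:
print(round(subject_distance(6.67e-9), 2))  # prints 1.0
```

Note how tiny the round-trip time is for a light beam at everyday distances (nanoseconds for a subject a metre away), which is why the receiver electronics, not the maths, are the hard part of building such a system.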

LG’s Laser Autofocus system has an emitter that shoots an infrared laser beam. “Laser”, while also serving to add some mega marketing punch, simply means that the beam is collimated and doesn’t disperse. LG also claims that its Laser Autofocus takes less than a third of a second to complete its job, making it one of the fastest autofocus systems in a smartphone. But like I said before, whether that claim holds any water remains to be seen.

But anyway, that’s basically how the LG G3’s Laser Autofocus works.