Lidar imaging has been around for almost as long as the technology it's based on, the laser. But unlike its more famous cousin, radar, lidar has mostly been used for research purposes. The reason scientists know so much about the density of aerosols in the upper atmosphere is largely due to the practice of shooting powerful lasers into the sky and examining the return signal. That sums up the key difference between lidar and radar: lidar operates at a much shorter wavelength, so it can, in principle, detect and (sometimes) image smaller objects, like aerosol particles.

This difference has now been given a spectacular demonstration, with researchers imaging the profile of an air rifle bullet in flight with a resolution of about one micrometer (the bullet itself is about 5mm long). While air rifles have a rather low muzzle velocity, the researchers could have imaged a bullet from a firearm with a very high muzzle velocity and still achieved a resolution of about 10 micrometers.

Two lasers that are not quite twins

Old-fashioned lidar systems (and even many modern ones) work on the tried-and-true principle of time of flight. Basically, you send out a pulse of light and record the time it takes to receive an echo. This is a pretty simple system, provided you don't want very good distance accuracy.

To put it in perspective, a friend of mine built a lidar system for aerosol measurements. His laser emitted pulses that were about five nanoseconds (a nanosecond is a billionth of a second) in duration, which gave him, at best, a distance resolution of about one meter. Considering that he was studying aerosols in the upper atmosphere, a resolution of one meter was just fine.

It is also not so difficult to make light pulses that are considerably shorter. I once worked with a laser that had a 35 femtosecond pulse duration (a femtosecond is a million-billionth of a second, 10⁻¹⁵ s), which would, had I used it for lidar, have given me a distance resolution of about five micrometers. Except it wouldn't have. The problem is that electronic light detectors are not fast enough. At the detector side, the 35fs pulse is recorded as a 0.5ns pulse, some 15,000 times longer. If the pulse bounces off an object like a bullet, I can see the reflection, but the change in pulse shape, which holds information about the bullet's shape, is washed out by the slow electronics.
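The resolution figures in both of these examples come from the same back-of-the-envelope formula: a pulse of duration τ blurs the round-trip distance by roughly c·τ/2. A minimal sketch of that arithmetic, using only the pulse durations quoted above (the 0.5ns case is the detector response time):

```python
# Time-of-flight distance resolution: a pulse of duration tau
# smears the measured distance by roughly c * tau / 2
# (the factor of 2 accounts for the out-and-back trip).

C = 299_792_458.0  # speed of light, m/s

def tof_resolution(pulse_duration_s: float) -> float:
    """Best-case distance resolution for a pulse of the given duration."""
    return C * pulse_duration_s / 2.0

print(tof_resolution(5e-9))    # 5 ns pulse  -> ~0.75 m, i.e. "about one meter"
print(tof_resolution(35e-15))  # 35 fs pulse -> ~5.2 micrometers
print(tof_resolution(0.5e-9))  # 0.5 ns detector response -> ~7.5 cm
```

This is also why the slow detector is fatal: it doesn't matter how short the pulse leaving the laser is if the electronics smear it back out to half a nanosecond.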

We have built good optical systems that can measure changes in distance in the picometer range (10⁻¹² m) using interferometers—if you take gravitational wave detectors as your standard, we can do 10 million times better than a picometer. The difference between these systems and a time-of-flight lidar system is that the interferometers compare light that reflects from the object to light that travels a fixed distance. We measure the difference in the two distances by the interference between the two light beams.

The only potential hangup is that if the object shifts its distance by exactly one wavelength, the signal from the interference looks exactly the same. Over short distances, like in a microscope, this can be dealt with. But for long distances and big objects, the problem of determining distance changes that are greater than the wavelength of the light becomes really tough.
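This ambiguity is easy to see in code: an interferometer effectively reads out a phase that wraps around every wavelength, so two distances separated by exactly one wavelength are indistinguishable. A toy sketch (the 1550nm wavelength is my assumption, a typical telecom-band laser line, not a number from the article):

```python
import math

WAVELENGTH = 1550e-9  # assumed telecom-band wavelength, m (not from the article)

def wrapped_phase(distance_m: float) -> float:
    """Interference phase for a given path length, wrapped to [0, 2*pi)."""
    return (2.0 * math.pi * distance_m / WAVELENGTH) % (2.0 * math.pi)

d1 = 2.0              # one possible distance, m
d2 = d1 + WAVELENGTH  # exactly one wavelength farther away
# The wrapped phases are identical, so the interferometer can't tell d1 from d2.
print(math.isclose(wrapped_phase(d1), wrapped_phase(d2), abs_tol=1e-6))
```

A microscope gets away with this because the object never moves more than a fraction of a wavelength between readings; a bullet in flight offers no such courtesy.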

The solution, it seems, is to combine time-of-flight and interferometry.

Lasers that produce very short pulses of light are very special. In terms of time, they produce a very regular series of short, sharp light pulses. But if you look at the light that the laser emits, it is not a single color. Indeed, if you were willing to sacrifice an eye by looking at the output of such a laser, your eye would (briefly, before going blind) perceive it as something like the white light from a desk lamp.

Appearances would be deceiving. While the laser emits many, many colors, it doesn't produce a smooth spectrum. Instead, each color is a sharply defined pure color, and each is separated from its neighbor by a fixed frequency. It is this property—a light pulse that is made up of a huge number of distinct and precisely separated colors—that provides the key to accurate ranging.

Lights, camera, action

To actually carry out measurements at this precision required a whole new laser system. In fact, the researchers use two copies of the same laser system. I won't go into details here, other than to say that the technique they use generates pulses that are both very short and have a very high pulse repetition frequency—about 100GHz, though the two lasers have slightly different repetition frequencies. They also have a slightly different spacing between the colors.

The measurement procedure is a bit complicated, but here's an outline of it. We have two lasers that I will call the ranging laser and the reference laser. If those two light beams are shone together on a photodetector, the result depends on whether the pulses overlap in time. When the pulses from the two lasers do not overlap, the photodetector puts out two microwave-frequency signals—one corresponding to the gap between colors in the ranging laser and the other to the gap in the reference laser. When the pulses from the two lasers do overlap, two additional radio frequencies appear: these new frequencies correspond to the separation between the colors from the two different lasers.

The lasers have slightly different pulse repetition frequencies, so over time, the output of the photodiode oscillates from producing two signals to producing four and back again. The phase of this oscillation tells us about the overlap between the two pulses.
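This in-and-out-of-overlap cycling is a Vernier effect: two pulse trains with repetition rates f1 and f2 slide past each other at the difference frequency |f1 − f2|. A quick sketch with assumed numbers (the article gives only the ~100GHz scale of the repetition rates; the 100.1GHz value is my invention, chosen so the difference lands at the ~100MHz overlap-oscillation frequency the researchers report):

```python
# Two pulse trains with slightly different repetition rates drift
# in and out of overlap at their difference frequency (a Vernier effect).

f_ranging   = 100.0e9  # assumed repetition rate of the ranging laser, Hz
f_reference = 100.1e9  # assumed repetition rate of the reference laser, Hz

delta_f = abs(f_ranging - f_reference)  # frequency of the overlap oscillation
t_cycle = 1.0 / delta_f                 # time for one full overlap cycle

print(delta_f)  # ~1e8 Hz, i.e. 100 MHz
print(t_cycle)  # ~1e-8 s, i.e. 10 ns per overlap cycle
```

The exact rates don't matter much; what matters is that the difference is small compared to the rates themselves, so the overlap sweeps slowly and its phase can be measured precisely.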

And this is exactly what the researchers use to determine the distance to an object. The ranging laser is reflected off the object, and the return signal is mixed with the reference laser. The phase of the resulting overlap oscillation, compared against a reference signal generated by light that never leaves the system, gives the distance. Depending on how long the researchers average, the distance to an object can be determined to an accuracy of between 250nm and 12nm.

How fast can you update the position of an object? That depends on the speed of the overlap oscillation, which is about 100MHz, meaning an update every 10ns. Averaging increases the accuracy while sacrificing imaging speed. At the highest accuracy, the position is updated every 13 microseconds.
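The trade-off is easy to quantify from the figures quoted above: a 100MHz overlap oscillation yields one raw position every 10ns, so the 13-microsecond highest-accuracy mode folds roughly 1,300 raw measurements into each reported position. A sketch:

```python
delta_f = 100e6        # overlap oscillation frequency, Hz (from the article)
t_raw = 1.0 / delta_f  # fastest possible position update: 10 ns

t_best = 13e-6               # averaging time at highest accuracy (from the article)
n_averaged = t_best / t_raw  # raw measurements per high-accuracy reading

print(t_raw)       # 1e-08 s
print(n_averaged)  # ~1300 raw measurements per reading
```

For the air rifle bullet, even the fully averaged mode is fast: at a typical pellet speed, the bullet moves only a few millimeters in 13 microseconds.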

The lidar system combines the best of a time-of-flight lidar with the best of interferometric sensors. Unfortunately, it also combines their disadvantages. In the current setup, successive pulses from the ranging laser are about 3mm apart in flight, which, counting the out-and-back trip, means the range measurement repeats every 1.5mm: two objects whose distances differ by a multiple of 1.5mm produce exactly the same signal. There are ways to fix that, but those improvements will have to wait for a future iteration.

You’ll need more than a gold card for this

This technique requires some impressive equipment. The lasers themselves are pretty mundane and are based on components widely used in optical fiber-communication systems. The very short pulses are generated by guiding light in a little ring, called a microring resonator, in a glass chip. Although these particular microring resonators are not commercially available, they are made using standard fabrication techniques—a wafer of them is expensive, but it has a lot of rings.

It's the light detector—and the way the researchers detect and process the radio frequencies from the light detector—that takes my breath away. To use this for one of lidar's most common applications, we would be talking about doubling the cost of your dream autonomous vehicle.

The researchers sketch an artist's impression of how a usable system might look. In their picture, the whole system is powered by a couple of diode lasers—slightly more expensive versions of the lasers used in CD players. These are coupled directly to the devices that create the very short pulses—the little chip of glass that contains microring resonators—and from there, the light goes out to the ranging system.

The researchers suggest that all of the expensive detector and signal processing equipment can be replaced with a field programmable gate array (FPGA) and an analog-to-digital converter. In principle, that is correct. But the detector is the reason why their digital sampling scope (nothing more than a few FPGAs, an analog-to-digital converter and a gold-plated price tag) is so expensive. I honestly don't think this is going to be a reasonably priced lidar system any time soon.

The other bit that the researchers mention in passing is that this system cannot currently be powered by laser diodes. In fact, in the research system, they use a reasonably powerful laser to drive it. The researchers think that this issue can be solved by improving the fabrication of the glass rings. I know companies that have been working on that problem for some time, and improvements are slow and come at the cost of a lot of failed ideas.

In short, this work is beautiful, but it might be awhile before you can drive off the lot using it.

Science, 2018, DOI: 10.1126/science.aao3924