THE pill-sized cameras in today’s mobile phones may seem miraculously tiny, given that a decade ago the smallest cameras available for retail sale were the size of a pack of cards. But Ali Hajimiri of the California Institute of Technology is unimpressed. In his opinion even these phone cameras are far too thick (witness the optical bump on the back of most mobile phones), so he and his team plan to replace them with truly minuscule devices that spurn every aspect of current photographic technology. Not only do Dr Hajimiri’s cameras have no moving parts, they also lack lenses and mirrors—in other words, they have no conventional optics. That does away with the focal depth required by today’s cameras, enabling the new devices to be flat. The result, he hopes, will be the future of photography.

Brave words. But, as an inventor, Dr Hajimiri has form to back them up. In 2002 he helped found a firm (now taken over by a bigger one) to build power amplifiers for mobile phones. More than 250m of these have been made. In 2004 he came up with the world’s first radar on a chip, which is now being used in prototype self-driving cars. To round things off, in 2012 he created an all-silicon imaging system that uses the terahertz part of the electromagnetic spectrum (which is slightly higher in frequency than radar) to see through objects opaque to light. This system has found employment in everything from medical-diagnostics equipment to security scanners.

His latest venture moves his focus to frequencies higher still than those of terahertz waves: the frequencies of visible light. The new camera, known as an optical phased-array receiver, or OPA, collects the light from which it forms its image using a grid of devices called grating couplers. The prototype (the blue structure pictured above, attached to a thick mounting block to make it easier to handle) has 64 of them. Grating couplers are optical antennae. They collect light and send it to a device called a waveguide. This carries light around in a way analogous to a wire carrying electricity.

Flash, bang, wallop. What a picture

Each grating coupler is tiny—about five by two microns (millionths of a metre)—and so picks up only a minuscule amount of light. That signal has to be amplified. This is done by heterodyning, a process which combines the light in the coupler with a minute laser beam, strengthening the signal at the desired wavelength.
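The principle of heterodyning can be illustrated with a toy numerical sketch: multiplying a faint signal by a steady reference wave produces a "beat" at the difference between the two frequencies, which is far easier to pick out and amplify than the original. The frequencies below are arbitrary illustrative values, not the actual optical frequencies on Dr Hajimiri's chip.

```python
import cmath
import math

# Toy heterodyne: mix a weak incoming wave with a local reference
# (the role played by the laser beam in the OPA). The product contains
# a strong component at the difference frequency, the "beat".
f_sig, f_ref = 105.0, 100.0   # signal and reference frequencies (Hz, illustrative)
fs, n = 1000.0, 1000          # sample rate and number of samples (one second)

mixed = [math.cos(2 * math.pi * f_sig * i / fs) *
         math.cos(2 * math.pi * f_ref * i / fs)
         for i in range(n)]

def strength_at(x, f):
    """Crude single-bin Fourier measure of how much of frequency f is in x."""
    return abs(sum(xi * cmath.exp(-2j * math.pi * f * i / fs)
                   for i, xi in enumerate(x)))

beat = strength_at(mixed, f_sig - f_ref)   # the 5 Hz difference tone
other = strength_at(mixed, 37.0)           # an arbitrary off-beat frequency
print(beat > 10 * other)                   # the beat dominates the mixture
```

The identity at work is cos(a)·cos(b) = ½[cos(a−b) + cos(a+b)]: mixing shifts the information onto the much lower difference frequency without losing it.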

To mimic the image-making role of the optics in conventional cameras, the OPA manipulates incoming light using electrons. Dr Hajimiri compares the technique to peering through a straw while moving the far end swiftly across what is in front of you and recording how much light is in each strawful. In the OPA this scanning effect is created by manipulating the light collected by the grating couplers electronically, using devices called photodiodes. These place varying densities of electrons into the amplified light’s path through the OPA, either slowing it down or speeding it up as it travels. That shifts the arrival times of the peaks and troughs of the lightwaves. This “phase shifting” results in constructive interference between waves arriving from the desired direction, which amplifies them. Light coming from other directions, by contrast, is cancelled through destructive interference. Change the pattern of electrons and you change the part of the image field the OPA is looking at. Scanning the entire field in this way takes about ten nanoseconds (billionths of a second).
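The phase-shifting trick can be sketched numerically: give each element of a small linear array a phase delay proportional to its position, and waves arriving from one chosen direction add up constructively while those from other directions largely cancel. The eight-element, half-wavelength layout below is a hypothetical illustration, not the geometry of the OPA itself.

```python
import cmath
import math

# Toy one-dimensional phased array. A linear ramp of phase shifts across
# the elements "points" the array at a chosen angle without any moving part;
# changing the ramp re-points it, which is how the OPA scans its image field.
n_elem = 8
d = 0.5                        # element spacing, in wavelengths (assumed)
steer = math.radians(20)       # direction the array is steered towards

def response(theta):
    """Magnitude of the summed field for a plane wave arriving from angle theta,
    with phase shifts chosen to favour the steering angle."""
    total = 0j
    for k in range(n_elem):
        arrival = 2 * math.pi * d * k * math.sin(theta)  # geometric path delay
        shift = -2 * math.pi * d * k * math.sin(steer)   # applied phase shift
        total += cmath.exp(1j * (arrival + shift))
    return abs(total)

on_axis = response(math.radians(20))    # looking where we steered: all 8 waves add
off_axis = response(math.radians(-40))  # elsewhere: the waves mostly cancel
```

Re-running `response` with a different `steer` value is the electronic analogue of swinging Dr Hajimiri's straw across the scene.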

The photodiodes, then, determine where the camera is pointing without any mechanical movement being needed. They also permit the camera to capture different kinds of images, such as close-ups and fish-eye views. To zoom in for a close-up, the device selects a specific part of the image and scans it more thoroughly. To zoom out for a fish-eye, it scans the entire optical field, including light from the edges of that field. To change from zoom to fish-eye takes nanoseconds.

The processed optical signal is then passed down the waveguide to further photodiodes. These convert it into an electrical signal, which is used to create the final photo. Crucially, all this can be achieved in a stack of electronics five microns thick—about a fifteenth of the diameter of a human hair.

The exact size of any production version will depend on the job to be done. The prototype can manage fuzzy images of barcodes, but not much else. To achieve the same resolution as the camera in a modern Apple iPhone, Dr Hajimiri reckons an array of about 1m grating couplers will be needed. Allowing for the space between these, the result would, at the moment, have an area of 1cm². This is similar to the area of an iPhone's camera, but that camera is 1,000 times thicker. Dr Hajimiri thinks, moreover, that a production version of the new device would be smaller.
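The arithmetic behind that estimate can be checked on the back of an envelope, assuming the million couplers sit on a square grid. The ten-micron pitch (coupler plus surrounding space) is inferred from the stated area, not a figure given by Dr Hajimiri.

```python
# Sanity check of the sizing: a million couplers on a 1,000 x 1,000 grid,
# each cell ten microns across (an assumed pitch), gives the quoted 1 cm2.
couplers = 1_000_000
side = int(couplers ** 0.5)              # 1,000 couplers along each edge
pitch_um = 10                            # microns per coupler cell (assumed)
edge_cm = side * pitch_um / 10_000       # 10,000 microns to the centimetre
area_cm2 = edge_cm ** 2
print(area_cm2)                          # → 1.0
```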

He concedes that there are challenges: improving the optical performance of the elements; suppressing spillover effects between different signals in the device; and honing the algorithms that calibrate the camera’s performance. But all these matters, he believes, can be dealt with and he envisages his lensless cameras being commercially available within five years.

Such tiny cameras would have uses far beyond eliminating the optical bumps from mobile phones. They might be deployed, Fantastic Voyage-like, to take pictures inside blood vessels. Conversely, they could be combined into massive arrays to create lightweight but extremely large-aperture telescopes able to resolve images from the deepest parts of the universe. They might even be strewn to the winds, photographic dust particles scavenging the energy they need from stray radio signals, and broadcasting what they see. Or they could be attached, almost invisibly, to walls, to act as spies.

In the “Ringworld” series of science-fiction novels, the books’ author, Larry Niven, envisages spray-on devices called “webeyes” that can be applied to any surface, and used for such espionage. Cameras of the sort Dr Hajimiri is developing are scarily close to making that idea real.