According to Walter Isaacson's authorized biography of former Apple CEO Steve Jobs, one of the things Jobs wanted to "revolutionize" was photography. Jobs believed the iPhone was the vehicle for doing so, but current imaging technology limits the photographic abilities of smartphones. As detailed in Inside Apple, a new book by Fortune's Adam Lashinsky, Jobs may have found the solution he was looking for in a radical imaging technology from Lytro. To that end, Jobs apparently met with Lytro CEO Ren Ng in June 2011 to discuss how Apple might integrate Lytro's light field technology into its products.

One aspect of the iPhone that has received constant improvements over the years is its included camera. The original iPhone had a fixed-focus lens and a 2MP sensor, while the iPhone 3G was upgraded with autofocus capabilities and a 3MP sensor. The iPhone 4 moved up to 5MP and added an LED flash and 720p video capture. The iPhone 4S went even further, moving up to 8MP, improving low-light performance, and stepping up to full 1080p HD video.

Those changes were all evolutionary, but they have made the iPhone one of the most popular cameras among users of photo-sharing sites like Flickr.

Jobs apparently wanted to improve the camera in a way that would change users' expectations of photography, and he believed Lytro's light field capture could do just that. At Jobs' behest, Ng flew out to Palo Alto to meet with Jobs and discuss cameras. Ng then agreed to send Jobs an e-mail detailing multiple ways Lytro could work with Apple on future products.

Lytro's technology relies on capturing far more information about a scene than a fixed grid of colored pixels. Using a high-resolution sensor combined with a specially designed microlens array, a Lytro camera captures the intensity, color, and direction of the light rays entering through its lens. That data can then be processed into the kind of flat, two-dimensional image that many of us are accustomed to.

However, that data can be mathematically manipulated to change various aspects of the image, including focus point, focal length, depth of field, and even perspective shift. All these details can be recalculated after the image is captured, removing the need to think about them while shooting.
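To make the refocusing idea concrete, here is a rough sketch in Python of the classic "shift-and-add" approach to synthetic refocusing (a simplified illustration of the general light field technique, not Lytro's actual proprietary algorithm). It treats the light field as a grid of sub-aperture images indexed by lens position (u, v); refocusing to a different depth amounts to shifting each sub-aperture image in proportion to its offset from the aperture center, then averaging:

```python
import numpy as np

def refocus(light_field, alpha):
    """Synthetic refocusing by shift-and-add (illustrative only).

    light_field: 4D array of shape (U, V, S, T), where (u, v) indexes
    the position on the lens aperture and (s, t) indexes image pixels.
    alpha: refocus parameter; larger magnitudes shift the synthetic
    focal plane further from the plane the camera was focused on.
    """
    U, V, S, T = light_field.shape
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            # Shift each sub-aperture image in proportion to its
            # offset from the aperture center (rounded to whole pixels
            # here for simplicity; real pipelines interpolate).
            du = int(round(alpha * (u - (U - 1) / 2)))
            dv = int(round(alpha * (v - (V - 1) / 2)))
            out += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
    # Averaging the shifted views brings one depth plane into focus
    # while blurring the others, emulating a wider physical aperture.
    return out / (U * V)
```

With alpha set to zero the views are simply averaged, which reproduces the ordinary photograph; sweeping alpha moves the focal plane, which is why focus can be chosen after the shot is taken.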

While the details of that e-mail weren't included in Lashinsky's book, building Lytro's light field capture technology into an iPhone would be a revolutionary move if it ever came to pass. The iPhone's relatively bulky autofocus lens mechanism could be replaced with a sharper, more compact, and more damage-resistant fixed-focus lens. With no need to wait for the lens to focus, images could be captured almost instantly, seizing what Cartier-Bresson called the "decisive moment."

At the same time, users would not have to give up the benefits of selective focus. The iPhone's camera software already has a UI for this: tap a point on the screen, and the software could change the focus point as needed, even after capture. Essentially, if users can get their subject into the frame, it's possible to "perfect" the image later: swipe to adjust perspective, tap to change focus, move a slider to increase or decrease depth of field. It's even possible to generate true 3D images from a single exposure using software alone.

We recently had a chance to use Lytro's upcoming standalone camera first-hand. If a future iPhone includes a similar light field capture sensor, we believe it could revolutionize the way many people take and experience photos on the go.