The Nexus 5 camera was a huge disappointment, especially after comments from high-ranking Googler Vic Gundotra stating that "we are committed to making Nexus phones insanely great cameras. Just you wait and see."

That was nine months ago. We waited and saw, and what showed up on the Nexus 5 wasn't very good. There may be an explanation for this, though. According to commits in the public Android source code, which were first spotted by Josh Brown on Google+, Google is working on a new camera API for Android. Work on the new API started in December 2012, which would make it seem targeted for KitKat, but about a month before the new OS's release, the API was pulled from Android's framework code. The commit that removed the API from the release Android code is here, with the comment saying:

DO NOT MERGE: Hide new camera API. Not yet ready. Bug: 11141002

This commit was pushed on October 11, about a month before the release of KitKat. A month before release is typically "feature freeze" time, when work on new features stops and everyone focuses on fixing bugs before launch. The camera revamp didn't make the cut, so KitKat shipped with the original camera API instead.

The really good stuff is in the initial commit, which contains tons of documentation about the new camera setup. There's a new API class called "android.hardware.photography" (the current camera functionality lives under "android.hardware.camera"), and with the fancier name comes fancier capabilities:

Full-capability devices allow for per-frame control of capture hardware and post-processing parameters at high frame rates. They also provide output data at high resolution in uncompressed formats, in addition to compressed JPEG output.

The new camera API has a backward-compatibility mode for older devices, but "full-capability" devices now have access to a few new picture formats. The only new image format listed that isn't present in Jelly Bean seems to be support for camera RAW:

General RAW camera sensor image format, usually representing a single-channel Bayer-mosaic image. Each pixel color sample is stored with 16 bits of precision. The layout of the color mosaic, the maximum and minimum encoding values of the RAW pixel data, the color space of the image, and all other needed information to interpret a RAW sensor image must be queried from the {@link android.hardware.photography.CameraDevice} which produced the image.
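To make the quoted format description concrete, here's a toy sketch (not the actual Android API) of what "single-channel Bayer-mosaic image" with 16-bit samples means in practice: one flat plane of 16-bit values, where the color each sample represents depends on its position in the mosaic. The RGGB layout below is a hypothetical example; as the docs say, the real layout must be queried from the CameraDevice.

```java
// Toy illustration of a Bayer-mosaic RAW plane. The RGGB layout here is an
// assumption for demonstration; real devices report their own mosaic layout.
public class BayerDemo {
    // Which color filter sits over the sensor pixel at (x, y) in an RGGB mosaic:
    // even rows alternate R,G; odd rows alternate G,B.
    static char colorAt(int x, int y) {
        if (y % 2 == 0) return (x % 2 == 0) ? 'R' : 'G';
        return (x % 2 == 0) ? 'G' : 'B';
    }

    public static void main(String[] args) {
        int width = 4, height = 2;
        short[] raw = new short[width * height]; // one 16-bit sample per pixel, single channel
        StringBuilder sb = new StringBuilder();
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) sb.append(colorAt(x, y));
            sb.append('\n');
        }
        System.out.print(sb); // prints RGRG / GBGB
    }
}
```

Each position holds only one color sample, which is why an app (or editor) has to know the mosaic layout before it can reconstruct a full-color image.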

Smartphone cameras normally output JPEG files, which are compressed, mostly finalized images. A RAW file, by contrast, is uncompressed (or only losslessly compressed) and unprocessed, so shooting in RAW gives the photographer much more flexibility after the picture is taken. Programs like Photoshop can do much more with a RAW file than with a JPEG.
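A toy model can show why that flexibility matters. This sketch (our own illustration, not any Android API) bakes a white-balance gain and highlight clipping into a "JPEG" value; once the value has clipped, no amount of editing can recover the original tone, while the RAW sample survives untouched:

```java
// Toy model of baked-in JPEG processing vs. an untouched RAW sample.
public class RawVsJpeg {
    // The camera's JPEG pipeline: apply a white-balance gain, then clip to [0, 1].
    static double bakeJpeg(double sensor, double wbGain) {
        return Math.min(sensor * wbGain, 1.0);
    }

    public static void main(String[] args) {
        double sensor = 0.8;   // what the sensor actually measured
        double badGain = 2.0;  // white-balance gain the camera guessed (too high)

        // JPEG route: the gain is applied and the highlight clips...
        double jpeg = bakeJpeg(sensor, badGain);   // 1.0, clipped
        // ...so dividing the gain back out can't restore the true value.
        double fromJpeg = jpeg / badGain;          // 0.5, detail is gone

        // RAW route: the original sample survives, so an editor can apply
        // whatever processing it wants later.
        double fromRaw = sensor;                   // still 0.8

        System.out.println("recovered from JPEG: " + fromJpeg);
        System.out.println("recovered from RAW:  " + fromRaw);
    }
}
```

The same logic applies to sharpening, noise reduction, and tone curves: a JPEG has them baked in, while a RAW file defers all of those decisions to the editor.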

Camera RAW is not totally unheard of on a mobile phone; Nokia's upcoming Lumia 1520 will be able to shoot RAW, for instance. Besides making Photoshoppers very happy, the RAW file could be passed to an even more powerful on-board photo editor, which Google seems very keen on building into Android and Google+.

The new API also supports face detection. This feature includes bounding boxes around faces and center coordinates for the eyes and mouth. In addition to the face-focus capabilities, the system can assign unique IDs to each face (provided they stay on screen) so developers could do things like assign silly hats to multiple faces in a video feed. While you may have seen face detection on some Android devices, those were all solutions built by Android OEMs.
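The "unique ID per face" idea can be sketched without any camera hardware. The snippet below (our own illustration, not the new API's actual implementation) assigns a stable ID to each detected face by matching it to the nearest face center from the previous frame, which is the basic trick behind keeping a silly hat glued to the same person:

```java
import java.util.*;

// Sketch of stable per-face IDs: match each detected face center to the
// nearest unclaimed face from the previous frame; unmatched faces get a
// fresh ID. The 50px threshold is a made-up tuning value.
public class FaceTracker {
    private final Map<Integer, double[]> lastCenters = new HashMap<>();
    private int nextId = 0;
    private static final double MAX_JUMP = 50.0; // hypothetical max per-frame movement, in pixels

    /** Returns one stable ID per detected face center {x, y}. */
    public int[] track(double[][] centers) {
        Map<Integer, double[]> current = new HashMap<>();
        int[] ids = new int[centers.length];
        for (int i = 0; i < centers.length; i++) {
            int best = -1;
            double bestDist = MAX_JUMP;
            for (Map.Entry<Integer, double[]> e : lastCenters.entrySet()) {
                if (current.containsKey(e.getKey())) continue; // ID already claimed this frame
                double d = Math.hypot(centers[i][0] - e.getValue()[0],
                                      centers[i][1] - e.getValue()[1]);
                if (d < bestDist) { bestDist = d; best = e.getKey(); }
            }
            ids[i] = (best >= 0) ? best : nextId++; // reuse old ID or mint a new one
            current.put(ids[i], centers[i]);
        }
        lastCenters.clear();
        lastCenters.putAll(current);
        return ids;
    }

    public static void main(String[] args) {
        FaceTracker t = new FaceTracker();
        // Frame 1: two faces appear and get IDs 0 and 1.
        System.out.println(Arrays.toString(t.track(new double[][]{{100, 100}, {300, 120}})));
        // Frame 2: both faces drift slightly but keep their IDs.
        System.out.println(Arrays.toString(t.track(new double[][]{{110, 105}, {295, 125}})));
    }
}
```

A face that leaves the frame and comes back gets a new ID, which matches the "provided they stay on screen" caveat above.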

There's support for burst mode, too—another feature that you would swear was already included in Android but isn't. On Nexus devices, the only "burst mode" involves the user pressing the shutter button really fast.

The camera device is removable and has been disconnected from the Android device, or the camera service has shut down the connection due to a higher-priority access request for the camera device.

The strangest new feature is probably support for a removable camera. We can't recall a single Android device of any kind that has had a removable camera, so feel free to leave your suggestions in the comments.

The most important possible improvement is one that wouldn't be visible in the source code: image quality. Android cameras arguably lag behind the iPhone in quality, so this new API may be Google's solution to that problem. Android's subpar image quality seems to be an across-the-board problem, so maybe the issue really is as low-level as the camera API. There's no way to be sure, though, until we get finished software and devices in our hands. With documentation using phrases like "substantially improved capabilities" and "fine-grain control," it certainly sounds like Google is out to fix Android's digital-imaging woes.