There are three ugly things and two beautiful things about this picture. The ugly things: (1) I have wrecked a lot of the detail visible in the original image, saturating some of the sunlit surfaces so that they are completely white. This isn't a reversible change; if I darkened the image again, all of those saturated pixels would still share the same value, so I couldn't get the lost detail back. (2) I have amplified the blemishes, the hot pixels and cosmic ray hits that speckle the photo. (3) I have amplified the rhythmic horizontal banding that appears at a low level in all Cassini photos, an artifact caused by interference from the spacecraft's electronics. The beautiful things: (1) You can see the geysers. (2) You can see Enceladus' full globe just barely picked out against the background, Enceladus' night side appearing darker than whatever is behind it.

It's kind of amazing that either of the two beautiful things is visible in the data. We can thank the 16-bit format. So now it's time for me to explain what that means. Why 16 bits? When you look at black-and-white images on the Internet, you are almost always looking at an 8-bit photo. That means that each pixel's brightness is encoded with an 8-digit binary number, so the value of the pixel can range from 0 (black) to 2^8-1, or 255 (white). That's more gray values than the human eye can hope to distinguish all at once, but it's not enough to represent the full possible range of brightness and darkness in our experience. Our eyes can see well both in bright daylight and in dimly lit buildings, where the light is a factor of 1000 or more weaker. But we can't see in both kinds of lighting at once. Our eyes adjust to bright or to dim light, but when we're outdoors in bright sunlight it's hard for us to see what's going on through a window into a dimly lit building.

Many cameras can handle a wider contrast range at once than our eyes can. But that means they need more than 8 bits to record the different brightnesses and darknesses in a scene. In order to record things that are 1000 times dimmer than other things, you need to digitize the view on a scale that has more than 1000 divisions. It's common for spacecraft cameras these days to measure the universe using 12 or more data bits, meaning they can handle more than 4000 different gray levels (2^12 is 4096). The operating systems on the computers that we use back on Earth like to deal in groups of 8 bits at a time, so if you want to deal with 12-bit data, you instead use a 16-bit image format.
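The arithmetic above is easy to check for yourself. This little sketch (using numpy purely as a convenient way to show the byte-aligned containers; the sample values are made up, not Cassini data) shows why 12-bit readings end up in a 16-bit format:

```python
import numpy as np

# Gray levels available at each bit depth.
print(2**8)    # 8-bit image: 256 levels (0-255)
print(2**12)   # 12-bit camera: 4096 levels (0-4095)

# A 12-bit sensor value doesn't fit in an 8-bit container,
# so it goes into the next byte-aligned size: 16 bits.
raw_12bit = np.array([0, 1000, 4095])     # hypothetical sensor readings
stored = raw_12bit.astype(np.uint16)      # what a 16-bit file holds
print(stored.dtype, stored.max())         # uint16, well under 2**16 - 1
```

The top four bits of each 16-bit value simply go unused, which is why 16-bit space images often look almost black until you stretch them.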

So the information is there; but how to represent it in a way that the more limited human eye can discern? A lot of the time it's possible to stretch the contrast in a nonlinear fashion so that you can bring out details at both ends of the brightness range. But that just wasn't working for me here; the geysers were just too dim. Any attempt to bring them out washed out the detail visible on the sunlit crescent and rings, and also made that horizontal banding too obvious. I was going to have to treat the moon and the plumes separately.
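To make the problem concrete, here is a minimal sketch of the kind of nonlinear stretch I mean, using a simple gamma curve. The function name, the choice of gamma, and the sample pixel values are all my own illustration, not the exact adjustment I applied:

```python
import numpy as np

def gamma_stretch(img16, gamma=0.5):
    """Nonlinear contrast stretch: map 16-bit data to 8-bit display.

    A gamma below 1 lifts the dim end of the range (faint features
    like plumes) much more than the bright end.
    """
    norm = img16.astype(np.float64) / 65535.0   # scale to 0..1
    stretched = norm ** gamma                   # nonlinear remap
    return (stretched * 255).round().astype(np.uint8)

# A dim pixel (~1% of full scale) gains far more than a bright one (~90%):
img = np.array([[655, 58981]], dtype=np.uint16)
print(gamma_stretch(img))
```

The trouble is exactly what the paragraph describes: pushing gamma low enough to reveal the plumes drags the mid-gray banding up with them and crushes the bright crescent toward white.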

First of all, I needed to clean up the low-contrast data. I had to do something to reduce the effects of the horizontal banding and the cosmic ray hits. Here is a really neat trick to remove the horizontal banding. I'm using Photoshop, but if you don't have that, you can use the same technique in GIMP. The banding is visible in the space behind Enceladus. If I can make a layer that has just the bands, but no moons or anything else, I can subtract the values of those pixels from the original photo, canceling out the banding.
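For readers who'd rather script it than click through Photoshop or GIMP, the same subtraction trick can be sketched in a few lines. The function name, the simple row-mean model of the bands, and the background-strip parameter are my assumptions, not the exact layer operations described here:

```python
import numpy as np

def remove_horizontal_banding(img, bg_cols):
    """Subtract horizontal banding, estimated from a clean background strip.

    bg_cols is a (start, stop) column range containing only empty
    background (no moon, no blemishes). Averaging each row across that
    strip estimates that row's band level; subtracting each row's
    offset from the mean cancels the banding across the whole image.
    """
    img = img.astype(np.float64)
    c0, c1 = bg_cols
    band = img[:, c0:c1].mean(axis=1, keepdims=True)  # one level per row
    cleaned = img - (band - band.mean())              # remove row offsets
    return np.clip(cleaned, 0, 65535).astype(np.uint16)
```

Averaging many background columns is what suppresses the random noise and cosmic ray hits in the estimate, so only the repeating row-to-row pattern gets subtracted.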

First I have to find a region in the background that doesn't have any major blemishes. I've outlined such a region below.