The first ever images taken from the surface of the far side of the moon have been released following the China National Space Administration’s (CNSA) successful landing there. The lander Chang'e 4 and rover Yutu 2 follow on from Chang'e 3 and the original Yutu rover, which were deployed on the moon’s near side in 2013.

But if you’ve been looking closely at the pictures, you could be forgiven for thinking that the far side of the moon is red. That’s how it looks on the unprocessed pictures – and it’s different from other pictures of the moon, in which it appears grey. So what is going on?

Cameras on spacecraft often don’t see colours in the same way as the human eye. For example, the red, green and blue components are usually recorded separately. This was the case with the latest images, and no colour correction has been applied to take account of the different sensitivities of the camera’s red, green and blue detectors.
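To picture what “recorded separately” means, here is a minimal sketch of how three single-channel exposures can be stacked into one RGB image. The frames and pixel values are invented for illustration; a real spacecraft camera’s data pipeline would be far more involved.

```python
import numpy as np

# Hypothetical 4x4 grayscale exposures, one per colour filter,
# as a camera that records each channel separately might return them.
# The pixel values are made up for this example.
red   = np.full((4, 4), 200, dtype=np.uint8)
green = np.full((4, 4), 120, dtype=np.uint8)
blue  = np.full((4, 4), 110, dtype=np.uint8)

# Stack the three single-channel frames into one 3-channel RGB image.
rgb = np.dstack([red, green, blue])
print(rgb.shape)
```

Without any correction step between recording and stacking, whichever channel the detectors favour dominates the combined image.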

The first picture below is an example of one such “raw” image, and I’ve accompanied it with histograms of the red, green and blue channels to show how brightness is distributed in each. In the raw version, the lunar surface looks red because the detectors used were more sensitive to red light than to green or blue. Although the surface is in truth almost equally bright in all three colours, the green and blue detectors register less of the light falling on them. This is why the green and blue histograms do not extend to the bright end of their scale.
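The kind of correction described above can be sketched with a simple “grey world” balance: if the surface really is almost equally bright in all three colours, each channel can be rescaled so its mean matches the image-wide mean, cancelling the unequal detector sensitivities. This is one standard white-balance heuristic, not the specific processing CNSA would apply, and the synthetic image below is invented for illustration.

```python
import numpy as np

def gray_world_balance(img):
    """Scale each channel so its mean matches the image-wide mean.

    A simple 'grey world' correction: if the scene is on average
    neutral grey, unequal channel means reflect unequal detector
    sensitivities, so per-channel gains remove the colour cast.
    """
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    balanced = img * gains  # broadcast one gain per channel
    return np.clip(balanced, 0, 255).astype(np.uint8)

# Synthetic 'raw' frame with a red cast: the red channel records
# roughly twice the brightness of green and blue (made-up values).
raw = np.dstack([
    np.full((2, 2), 200.0),  # red
    np.full((2, 2), 100.0),  # green
    np.full((2, 2), 100.0),  # blue
])
print(gray_world_balance(raw)[0, 0])  # equal values: a grey pixel
```

After the gains are applied, all three channels sit at the same level, which is why the corrected lunar images look grey rather than red.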