Gamut is one of the stranger words in the English language. Phil Rhodes sheds some light on this often misunderstood term

Recently we've talked about colour quite a bit, in terms of the characteristics of the lighting that illuminates objects, and the abilities of cameras and displays to record and reproduce those colours. This is particularly relevant right now, because the newly-announced Sony F55, like the F65, has a sensor designed to include a somewhat wider range of colours than previous electronic systems.

Film has a wide range of available colours

This is important for cinema applications because, quite apart from whatever else is wrong with traditional video cameras, filmmakers accustomed to photochemical emulsions are used to having a very wide range of colours available. Given an appropriately-tinted piece of film, a film projector is theoretically capable of projecting any colour which is emitted by its light source. A typical xenon lamp comfortably covers the entire range of human vision and some way beyond, and the film is free to filter that light to any desired colour. Practical constraints of film manufacturing restrict the range that's actually available, but it remains true that the colour performance of a film camera and projection system is somewhat user-definable.

Conversely, the display you're staring at as you read this – which is almost certainly a TFT liquid-crystal or possibly OLED – has colour performance which is very much set at manufacture. The backlight of a TFT display may have reasonable coverage of the entire human visual range, but it is tinted by filters which are silk-screened onto the back of the display in the factory. Once you've turned the blue pixels on, for instance, and the red and green pixels completely off, that's as blue as that display can ever get. The only way to achieve deeper colour is to put a filter over the front of the display, and you do not have the flexibility that film can give you.

Colour Gamut

What we're discussing here is properly referred to as colour gamut, and it's widely misunderstood. RGB and YUV signals, for instance, are two ways of encoding colour which exist independently of gamut; gamut instead depends on which red, which green and which blue (or which blue-yellow U channel and which red-cyan V channel) is used. You cannot have a greener green than the green dye in a TFT, and you can never achieve a redder red than the red phosphor in a CRT – although redder reds and greener greens might be entirely within the human visual range.

A colour gamut is usually depicted as a region on the CIE 1931 chromaticity diagram, which sounds complicated but is in fact nothing more than a diagram showing all of the colours that a human eye can see, with a shaded area representing the colours your system is capable of displaying. Of course, since you're looking at this article on a display that is most assuredly not capable of displaying all of the colours your eye can see, it's impossible to represent it accurately, but it looks something like this:

Notice that the CIE 1931 diagram does not deal with brightness; it deals solely with colour, from white somewhere near the middle, via increasing saturation, to fully-saturated primary colours at the outside. The curved edge of the horseshoe-shaped area represents strictly monochromatic colours – light of exactly one wavelength. The applications of this are enormous, but the basic concept is easy to understand. For a start, you can pick any two points on it, and all of the colours on a line between those two points can be created by mixing the colours at the two points. By defining three points (such as a particular red, green and blue), we can mix those three to achieve any of the colours inside the resulting triangle. Recently, display manufacturers have begun to discuss displays using more than three primaries, and similarly, plotting shapes with four or five corners allows us to create any of the colours inside the shape; those colours are in gamut. Anything outside the shape is out of gamut, and will be displayed as the nearest in-gamut equivalent.
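For the programmatically inclined, the triangle idea above can be sketched in a few lines: a chromaticity is in gamut if it falls inside the triangle spanned by the three primaries. This is a minimal sketch, not a full colour-management routine; the coordinates used are the published Rec. 709 primaries, but any three points would work the same way.

```python
# Sketch: test whether a chromaticity (x, y) lies inside the triangle
# spanned by three primaries on the CIE 1931 diagram.
# The coordinates below are the Rec. 709 (HDTV) primaries.

RED   = (0.640, 0.330)
GREEN = (0.300, 0.600)
BLUE  = (0.150, 0.060)

def cross(o, a, b):
    """Z-component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_gamut(p, r=RED, g=GREEN, b=BLUE):
    """True if chromaticity p falls inside (or on) the triangle r-g-b."""
    signs = [cross(r, g, p), cross(g, b, p), cross(b, r, p)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

print(in_gamut((0.3127, 0.3290)))  # D65 white: True
print(in_gamut((0.08, 0.82)))      # a deep spectral green: False
```

A real pipeline would also need the "displayed as the nearest in-gamut equivalent" step – projecting out-of-gamut points back onto the triangle – but the membership test alone captures what "in gamut" means.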

Given all this, it's pretty obvious what happens if we plot, say, the red, green and blue filters of a film stock or electronic display on the CIE diagram. The greener the green, the redder the red and the bluer the blue, the larger the triangle is, and the more comprehensive our coverage is of what the eye can see. The actual primary colours used for high-definition TV work are defined by the International Telecommunication Union's Radiocommunication Sector in Recommendation BT.709 – the commonly-encountered "Rec. 709" standard.
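The "bigger triangle, better coverage" point is easy to put numbers on. As a sketch, the shoelace formula gives the area of each gamut triangle on the diagram; the coordinates here are the published Rec. 709 and Rec. 2020 (UHDTV wide-gamut) primaries, the latter standing in for the kind of wider gamut cinema-oriented sensors aim at.

```python
# Sketch: "greener green, redder red" means a bigger triangle on the
# CIE 1931 diagram. Compare the areas of two standard gamuts.

def area(p1, p2, p3):
    """Area of the triangle spanned by three chromaticity points."""
    return abs((p2[0] - p1[0]) * (p3[1] - p1[1])
               - (p3[0] - p1[0]) * (p2[1] - p1[1])) / 2

rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # HDTV
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]   # UHDTV

print(f"Rec. 709 gamut area:  {area(*rec709):.4f}")
print(f"Rec. 2020 gamut area: {area(*rec2020):.4f}")  # roughly 1.9x larger
```

Raw xy area is a crude yardstick (the CIE 1931 diagram is not perceptually uniform), but it makes the point: pushing the primaries outward grows the triangle considerably.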

Saturation

You can choose to display less saturated colours by bleeding (for instance) some red and green light into your blue, and for that reason you'd have thought that the best technical approach would be to pick particularly deep, saturated red, green and blue filters for a TFT display, or organic materials that produced particularly deep and saturated colours for the active components of your OLED panel. Unfortunately, this hasn't turned out to be entirely the case: the primaries in Rec. 709 were largely defined in the days of phosphor-based cathode ray tubes, which in turn were limited by the ability of scientists to produce phosphors that glowed with the desired colours. Phosphor reds, for instance, tend to be orangeish, whereas greens can be yellowish. Old-style CRT video projectors sometimes used magenta-tinted filters on the red tube and cyan ones on the green to remove these contaminating hues and improve colour reproduction. As a model for modern displays, phosphor-based colours leave a lot to be desired, meaning that common computer displays fail to reproduce the deep greens and green-blues of nature. The deep turquoise of sunlit water around a tropical reef is a common example of something many modern displays just can't achieve.
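The "bleeding light in" idea at the start of that paragraph is just linear mixing, and can be sketched in a couple of lines. This is an illustrative toy working in linear (not gamma-encoded) RGB; the function name and the choice of white are assumptions for the example, not anything from a real standard.

```python
# Sketch: desaturating a colour by bleeding the other channels in.
# Mixing white into pure blue slides its chromaticity along the line
# toward the white point; you can make blue paler, but never bluer.
# Values are linear RGB in the range 0.0-1.0.

def desaturate(rgb, amount):
    """Mix `amount` (0..1) of full white into an RGB triple."""
    return tuple((1 - amount) * c + amount * 1.0 for c in rgb)

pure_blue = (0.0, 0.0, 1.0)
print(desaturate(pure_blue, 0.25))  # (0.25, 0.25, 1.0): a paler blue
```

The asymmetry is the whole point of the article: mixing only moves colours inward, toward white, so the saturation of the primaries themselves sets a hard ceiling.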

It's not just displays; it's cameras as well

All of this applies to cameras just as well as to displays. The filters of a colour sensor can limit the ability of a camera to record a colour just as much as the display's filters limit its ability to display it, and that's where things like Sony's F65 and F55 come in. Both use sensors (not the same sensor) which have deeper primary colours than Rec. 709 and suffer far less from these problems. This is great for digital cinema, which uses a colour system differing entirely from common RGB representations, and it also helps with selectivity and range in colour grading.

But there is a problem. Much as it's great to have access to extra colour information throughout post, and in specialist display situations such as digital cinema, the majority of material is viewed at home, on the world's millions of televisions and computer monitors, which have, at best, an approximation of the Rec. 709 colour primaries. Replacing cameras is easy, and happens regularly, but upgrading the lounge TVs of the world to a better experience of cinematic colour is, shall we say, going to be a bit of a project.