Have you ever looked at the static that appears on a TV screen when it isn't tuned in to anything? If so, you might have noticed that it's in black and white. That fact always used to puzzle me - the patterns are random, so surely every colour the TV can display should be equally likely, right?

It wasn't until fairly recently that I learned why this is. A colour TV signal is a little different from a black-and-white one - a certain frequency band within the signal is used to transmit the colour information. That band corresponds to high-frequency horizontal detail (patterns about 1/200th of the width of the screen). In a colour signal, those fine details are elided and the bandwidth is used to carry hue and saturation information instead.
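As a rough sanity check on that "1/200th" figure, we can work out how many colour subcarrier cycles fit across one visible scan line. This sketch assumes PAL numbers (the subcarrier frequency and the ~52 µs active line duration are standard PAL values; NTSC's differ slightly but give a similar result):

```python
# Roughly how fine is the horizontal detail that the colour band occupies?
# Assuming PAL timings: 4.43361875 MHz subcarrier, ~52 us of visible line.
SUBCARRIER_HZ = 4.43361875e6   # PAL colour subcarrier frequency
ACTIVE_LINE_S = 52e-6          # approximate visible portion of one scan line

cycles_per_line = SUBCARRIER_HZ * ACTIVE_LINE_S
print(f"subcarrier cycles per visible line: {cycles_per_line:.0f}")
print(f"one cycle spans about 1/{cycles_per_line:.0f} of the screen width")
```

This comes out at around 230 cycles per line, i.e. each subcarrier cycle spans roughly 1/230 of the screen width - the same order as the 1/200th mentioned above.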

However, if you're watching a black-and-white programme you can get a sharper picture by using those frequencies for horizontal detail instead. So colour TV sets were designed with two "modes" - colour mode and black-and-white mode. A "colour burst" signal is broadcast in an otherwise unused part of the signal; it serves the dual purposes of signalling that colour information is available and of calibrating the correct hue phase offset (the colour burst, if it were on screen and within gamut, would be a very dark olive green). This signal has to be present for about half a field before the TV will switch to colour mode. That's an imperceptibly short time, but it stops the TV flickering in and out of colour mode if the signal is marginal.

A signal of the correct frequency, at the correct time, sustained for that long is extremely unlikely to occur by chance (and even if it did, it would disappear again before you had a chance to notice it). So when the TV is showing static, it behaves as if it were showing an old black-and-white film and switches off the colour-decoding circuitry, leading to black-and-white static.