Two decades ago, old VCRs were in disproportionately high demand: newer ones could not copy movies, because a special signal distorted the recordings. Hollywood is now fighting to carry this war on equipment owners over to the general-purpose computer. Will they succeed?

When we copied movies two decades ago, the very oldest video cassette recorders (VCRs) were in high demand.

If you’re wondering just how old these VCRs were, think “coal-powered”. The reason was a curious mechanism preventing the copying of movies, and to understand it, we need to look a little bit into how the technology worked in that era.

Images on a TV were displayed using a single electron beam racing very fast across a phosphor-coated screen that glowed when hit by the beam. It raced from left to right in horizontal lines, from top to bottom, painting an image on the fluorescent coating. Then the beam returned to the top-left and started painting the next image in the movie. The beam painted about 50 such images a second, and the act of returning to the top-left is called a “vertical retrace”. You still see that term in some games, as game screen updates took place during the vertical retrace to avoid flickering.

When VCRs recorded movies to tape, they would adapt the recording level of the magnetic signal to the strength of the incoming electric signal, an automatic gain control much like most microphones have today. Some Hollywood engineers discovered that if they threw insanely high signal levels into the short timespan where the vertical retraces happened in the movie, TVs would not care, but VCRs would be unable to record anything useful, thrown completely off by the disruptive decoy signals. This scheme was called “Macrovision”; whether that was the company, the name of the product, or something else is not important.
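The mechanics of the trick can be illustrated with a toy sketch. This is not the real analog circuitry; the signal levels, line counts, and the naive “scale to the strongest peak” gain rule are all made-up illustrations of the principle that the TV ignores the blanking interval while the gain control does not:

```python
# Toy model of the Macrovision trick: confuse a VCR's automatic gain
# control (AGC) with huge pulses hidden in the vertical blanking
# interval. All names and numbers are illustrative, not real hardware.

PICTURE_LEVEL = 0.7      # normal amplitude of visible scanlines
DECOY_LEVEL = 4.0        # absurdly strong pulse hidden in the retrace
VISIBLE_LINES = 10       # a tiny "frame": 10 visible lines...
BLANKING_LINES = 2       # ...plus 2 lines of vertical blanking

def frame(with_decoy):
    """One video frame as a flat list of per-line peak amplitudes."""
    blank = DECOY_LEVEL if with_decoy else 0.0
    return [PICTURE_LEVEL] * VISIBLE_LINES + [blank] * BLANKING_LINES

def tv_display(signal):
    """A TV just paints the visible lines; blanking is never shown."""
    return signal[:VISIBLE_LINES]

def vcr_record(signal, target=1.0):
    """A naive AGC: scale everything so the strongest peak hits `target`.
    The decoy pulse becomes the strongest peak, crushing the picture."""
    gain = target / max(signal)
    return [round(level * gain, 3) for level in signal[:VISIBLE_LINES]]

clean = frame(with_decoy=False)
protected = frame(with_decoy=True)

print(tv_display(protected) == tv_display(clean))  # True: TV unaffected
print(vcr_record(clean))       # picture lines recorded at full level
print(vcr_record(protected))   # picture crushed to a fraction of target
```

In this sketch the “protected” and clean frames look identical on the TV, but the recorder, slaved to the strongest peak it sees, turns its gain way down and records a crushed picture.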

Thus, Hollywood had created a technical ecosystem where you could play their media, but not re-record it. In those days, very old VCRs – those from before the adaptive input arrived – were in high demand, as they didn’t care about the decoy signals but recorded everything received verbatim, just like the TV displayed everything verbatim. An old VCR was required to copy movies.

It is obvious that Hollywood and their ilk are trying to repeat this trick on the general-purpose computer: attempting to put it under their control through ever more complex Digital Restriction Mechanisms. But unlike the VCR, where everything was fixed in hardware, the owner of a general-purpose computer gets to choose how signals are interpreted on their own computer, and can instruct it to disregard anything they don’t like. Therefore, today’s version of Macrovision is a joke.

Enter so-called “trusted computing”, which is Orwellspeak for “untrustable computer owners”. There has been a gradual push for motherboards that refuse to bootstrap any operating system other than pre-approved ones, a chain of locks shutting the owner of the equipment out of the ability to run any code they like. So far, this has always been hogwash and its “security” as brittle as nail clippings in an industrial shredder, showing the incompetence of Hollywood in a world alien to them – but still, the movement is there.

The most worrying push to date is Microsoft requiring computer motherboards to have this kind of lockdown in order to get Windows 8 certification. They go even further on ARM-based boards: on such devices, Microsoft requires that you can’t even change which operating systems are allowed to run.

Let me say that once again: hardware is now being sold that doesn’t allow the owner to run any code they like on it. We’ve had software trying that kind of trick for a long time (and it’s ridiculously easy to circumvent in most cases), but hardware disallowing that is a new trick from the copyright industry.

I run Ubuntu on all my systems, which is a flavor of GNU/Linux. I like that. I like the principle that property rights extend to me running any code I want on my own hardware. It feels basic and natural, not to say obvious and unquestionable.

It used to be that the question “Does it run Linux?” referred to ability, as in, asking whether the hardware was physically capable of running GNU/Linux.

We may come to enter an era where the question instead refers to Digital Restriction Mechanisms, as in, is the hardware locked out from running operating systems such as GNU/Linux that actually honor property rights?

Or do we have to go the same way as we did in the Macrovision era, where old hardware had a premium value for innovation and use because it wasn’t bogged down by copyright industry bullshit?