Perhaps you once took a university course in Operating Systems. Or you think you did.

In reality, the course catalog ought to have read: "Dark Age Software Archaeology: A UNIX Case Study."

But do we still call it archaeology if people are still building pyramids? And if architects continue to push the pyramid as the pinnacle of architecture?

Before the advent of structural steel, the pyramid was the only form of building that could exceed a height of around five stories. Now imagine that the building industry had simply ignored all advances in metallurgy. This is precisely what happened in computing: CPU architectures with built-in array bounds and type checking would obsolete the entire computer security field as it now exists, in just the same way modern medicine obsoletes bloodletting, which is one reason why we are denied them.

Likewise, my dear readers, some of you may recall attending lectures entitled "Human-Computer Interaction," or "User Interfaces." In fact, the course should have been called "WIMP: Twentieth-Century Computing and the Cult of the Novice." [1] [2]

The list could go on: I'm probably not the only person who took a Theory of Computation class which taught the Turing Machine yet breathed not a word about the Lambda Calculus.
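To give a taste of what such a course omits: the Lambda Calculus can express arithmetic using nothing but single-argument functions. A minimal sketch of Church numerals in Python (the names here are mine, not standard library ones):

```python
# Church numerals: the number n is the function that applies f to x n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Convert a Church numeral to a native integer by counting applications."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # prints 5
```

Five lines of pure functions, and the whole of arithmetic follows; the Turing Machine is not the only road to computation.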

[1] Erik Naggum: "The Novice has been the focus of an alarming amount of attention in the computer field. It is not just that the preferred user is unskilled, it is that the whole field in its application rewards novices and punishes experts. What you learn today will be useless a few years hence, so why bother to study and know anything well?"

[2] Think there's been substantial progress in the GUI since 1981? Think again.