Enrolments in computer science degrees (in the US and Western Europe) have fallen drastically over the past few years, although I believe there has now been an increase in applicants from the low point in 2008. Part of this was a reaction to the 2001 dot-com crash, but I believe another factor is that many, if not most, computer science degrees are increasingly irrelevant to real-world computing.

There are two major areas of real-world computing that, IMHO, don't get enough attention in computer science degrees. One is embedded, real-time systems. We are surrounded by such systems – I counted 9 in my home office alone, apart from the computers (router, scanner, printer, 3 phones, 1 iPod, 2 cameras) – yet some CS degrees simply don't cover them at all, and the majority probably offer only a single optional module. The other area is my own interest: large-scale systems engineering. Current large-scale systems are not built from scratch using waterfall processes but are constructed by reusing software at different granularities, from program libraries to entire systems. Requirements are negotiated, and issues such as throughput, dependability and security are critical. Most courses do include some software engineering, which is an essential starting point for large-scale systems engineering, but then they stop. They don't include courses on topics such as dependability and enterprise systems architectures.

Why have we got ourselves into this state, where CS degrees are about programming rather than systems? The argument I hear from my colleagues is that programming is fundamental (true) and that learning to program in Java (or perhaps a functional language) and to analyse programs for properties such as complexity is all you need. This, I think, is simply a weak justification of an untenable position: that systems are just scaled-up programs. We have arrived at this state because of two basic problems with our university system:

1. It's very hard for people in universities to keep up with changes in the world. People who were appointed 20 years ago have spent their time climbing the academic career ladder, and many of them have had little or no contact with industry since. Essentially, they have no idea what real-world computing is like.

2. People's careers in universities depend on their research. As researchers are clever people, they have devised lots of ways to publish research and so support career development. The community itself assesses the value of the research, and relevance to industry is rarely a factor in this assessment. In fact, it's often harder to publish work done in industry, even in software engineering, because it doesn't tick all the academic boxes of what constitutes a good paper. Therefore, people do 'excellent' research on irrelevant things. Often, they are committed teachers and their research informs their teaching – but they don't want sordid reality to intrude, nor do they want to have to teach something useful.

Of course, both of the above reasons are (slightly) exaggerated presentations of reality, but the blunt truth is that the reason universities don't teach real-world computing is that many of the faculty simply couldn't do it. There may be intellectual value in a programming-oriented CS degree (as there is in a degree in Classics or Philosophy), but I believe that we must also design our degrees so that students are better equipped to make a contribution to computing as it is, rather than computing as it was.