Watch various contemporaries, from shining CS greats to obscure Usenet personae, alive and dead, strike the evil in programming languages, operating systems, user interfaces, computing in general, opinions held about these matters, and elsewhere.

[Authors or copyright holders who want their piece of intellectual property removed from here may please contact the webmaster.]

The basis of computer languages' merit lies in their mathematical properties. It is this metric that we should use as a guide for direction. As an analogy, we measure the quality of a hammer by scientific principles: ergonomics, material (weight, hardness, …), construction, statistical analysis of accidents/productivity/etc., not by vogue or lore. If we go by feelings and preferences, the hammer's future will deviate and perhaps become dildos or maces. Xah Lee in comp.lang.lisp, July 2000

*

Since then we have witnessed the proliferation of baroque, ill-defined and, therefore, unstable software systems. Instead of working with a formal tool, which their task requires, many programmers now live in a limbo of folklore, in a vague and slippery world, in which they are never quite sure what the system will do to their programs. Under such regretful circumstances the whole notion of a correct program —let alone a program that has been proved correct— becomes void. What the proliferation of such systems has done to the morale of the computing community is more than I can describe. […] It will certainly leave all those dissatisfied who identify the difficulty of programming with the difficulty of cunning exploitation of the elaborate and baroque tools known as “higher level programming languages” or —worse!— “programming systems”. When they feel cheated because I just ignore all those bells and whistles, I can only answer: “Are you quite sure that all those bells and whistles, all those wonderful facilities of your so-called `powerful' programming languages belong to the solution set rather than to the problem set?” […] There exist, regretfully enough, machines in which the continuous check that the simulation of the behaviour of the UM (Unbounded Machine) is not beyond their capacity is so time-consuming, that this check is suppressed for the supposed sake of efficiency: whenever the capacity would be exceeded by a correct execution, they just continue —for the supposed sake of convenience— incorrectly. It is very difficult to use such a machine as a reliable tool, for the justification of our belief in the correctness of our answers produced requires in addition to the proof of the program's correctness a proof that the computation is not beyond the capacity of the machine, and, compared to the first one, this second proof is a rather formidable obligation. Edsger W. Dijkstra: A discipline of programming. Prentice Hall, Englewood Cliffs NJ, 1976.

*

I absolutely fail to see how we can keep our growing programs firmly within our intellectual grip when by its sheer baroqueness the programming language –our basic tool, mind you!– already escapes our intellectual control. […] We all know that the only mental tool by means of which a very finite piece of reasoning can cover a myriad cases is called “abstraction”; as a result the effective exploitation of his powers of abstraction must be regarded as one of the most vital activities of a competent programmer. In this connection it might be worth-while to point out that the purpose of abstracting is not to be vague, but to create a new semantic level in which one can be absolutely precise. […] The analysis of the influence that programming languages have on the thinking habits of its users, and the recognition that, by now, brainpower is by far our scarcest resource, they together give us a new collection of yardsticks for comparing the relative merits of various programming languages. The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. […] Another lesson we should have learned from the recent past is that the development of “richer” or “more powerful” programming languages was a mistake in the sense that these baroque monstrosities, these conglomerates of idiosyncrasies, are really unmanageable, both mechanically and mentally. I see a great future for very systematic and very modest programming languages. (...) Finally, in one respect one hopes that tomorrow's programming languages will differ greatly from what we are used to now: to a much greater extent than hitherto they should invite us to reflect in the structure of what we write down all abstractions needed to cope conceptually with the complexity of what we are designing. […]
LISP has jokingly been described as “the most intelligent way to misuse a computer”. I think that description a great compliment because it transmits the full flavour of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts. […] I remember from a symposium on higher level programming language a lecture given in defense of PL/1 by a man who described himself as one of its devoted users. But within a one-hour lecture in praise of PL/1 he managed to ask for the addition of about fifty new “features”, little supposing that the main source of his problems could very well be that it contained already far too many “features”. The speaker displayed all the depressing symptoms of addiction, reduced as he was to the state of mental stagnation in which he could only ask for more, more, more... Edsger W. Dijkstra: The humble programmer (Turing award lecture, EWD 340)

*

Programming languages should be designed not by piling feature on top of feature, but by removing the weaknesses and restrictions that make the additional features appear necessary. R5RS

*

The machine itself, the more it is perfected, the more it effaces itself behind its role. It seems that all of man's industrial effort, all his calculations, all his nights spent bent over the drawings, issue, as their only visible sign, in simplicity alone; as if the experience of several generations were needed to release, little by little, the curve of a column, of a hull, or of an aircraft fuselage, until it attains the elementary purity of the curve of a breast or a shoulder. It seems that the work of the engineers, the draughtsmen, the calculators of the design office is thus, in appearance, only to polish and to efface, to lighten this joint, to balance this wing, until the wing is no longer noticed; until there is no longer a wing attached to a fuselage, but a form fully unfolded, freed at last from its matrix, a kind of spontaneous whole, mysteriously bound together, and of the same quality as a poem. It seems that perfection is attained not when there is nothing more to add, but when there is nothing more to take away. At the end of its evolution, the machine conceals itself. The perfection of invention thus borders on the absence of invention. And just as, in the instrument, all visible mechanism has gradually effaced itself, and we are handed an object as natural as a pebble polished by the sea, it is equally admirable that, in its very use, the machine little by little makes itself forgotten. Antoine de Saint-Exupéry : Terre des hommes › L'avion

*

I will contend that conceptual integrity is the most important consideration in system design. It is better to have a system omit certain anomalous features and improvements, but to reflect one set of design ideas, than to have one that contains many good but independent and uncoordinated ideas. […] For a given level of function, however, that system is best in which one can specify things with the most simplicity and straightforwardness. Simplicity is not enough. Mooers's TRAC language and Algol 68 achieve simplicity as measured by the number of distinct elementary concepts. They are not, however, straightforward. The expression of the things one wants to do often requires involuted and unexpected combinations of the basic facilities. It is not enough to learn the elements and rules of combination; one must also learn the idiomatic usage, a whole lore of how the elements are combined in practice. Simplicity and straightforwardness proceed from conceptual integrity. Every part must reflect the same philosophies and the same balancing of desiderata. Every part must even use the same techniques in syntax and analogous notions in semantics. Ease of use, then, dictates unity of design, conceptual integrity. Frederick P. Brooks, Jr.: The Mythical Man-Month. Addison-Wesley, Reading MA, 1995 (anniversary ed.)

*

Simplicity of the language is not what matters, but simplicity of use. Richard A. O'Keefe in squeak-dev mailing list, April 2003

*

For some reason, «committee product» is a bad thing in the eyes of many software people, who flock to the products of single-minded single minds, instead. For some reason, following a single leader and adopting a uniform set of ideas and ideals from one person is preferable to adopting multifarious ideas and ideals from a group of people who cannot seem to agree on anything, but who actually agree on so much that the things they disagree about are important. Erik Naggum in comp.lang.lisp, January 2004

*

Software development methodologies evolved under this regime [figure it out, code it up, compile it, run it, throw it away] along with a mythical belief in master planning. Such beliefs were rooted in an elementary-school-level fiction that great masterpieces were planned, or arose as a by-product of physicists shovelling menial and rote coding tasks to their inferiors in the computing department. Master planning feeds off the desire for order, a desire born of our fear of failure, our fear of death. Richard P. Gabriel and Ron Goldman: Mob Software: The Erotic Life of Code

*

To me, development consists of two processes that feed each other. First, you figure out what you want the computer to do. Then, you instruct the computer to do it. Trying to write those instructions inevitably changes what you want the computer to do and so it goes. In this model, coding isn't the poor handmaiden of design or analysis. Coding is where your fuzzy, comfortable ideas awaken in the harsh domain of reality. It is where you learn what your computer can do. If you stop coding, you stop learning. We aren't always good at guessing where responsibilities should go. Coding is where our design guesses are tested. Being prepared to be flexible about making design changes during coding results in programs that get better and better over time. Insisting that early design ideas be carried through is shortsighted. Kent Beck: Smalltalk Best Practice Patterns. Prentice Hall, NJ 1997

*

Planning is a necessary evil. It is a response to risk: the more dangerous an undertaking, the more important it is to plan ahead. Powerful tools decrease risk, and so decrease the need for planning. The design of your program can then benefit from what is probably the most useful source of information available: the experience of implementing it. […] The spirit of Lisp hacking can be expressed in two sentences. Programming should be fun. Programs should be beautiful. Paul Graham: ANSI Common Lisp

*

If I may intrude a personal element here, one of the things which distinguishes imperative programming in C, Pascal, Fortran or whatever from declarative programming in Prolog, Scheme, ML or whatever for me is a big difference in feeling. When I code in C, I feel I'm on a knife-edge of “state” — I focus on statements and what they do. I'm worried about the behaviour of a machine. But when I'm writing Prolog, the predicates feel like geometric objects and the data flow between goals feels like lines of tension holding the goals together into an integrated whole, as if the program fragment I was working on were a large Rubik's cube that I could handle and move from one configuration to another without destroying it. When I fix mistakes in a Prolog program, I look for flaws in the “spatial” configuration of the program; a mistake feels like a snapped thread in a cobweb, and I feel regret for wounding the form. When I'm coding C, I worry about `register' declarations and pointer arithmetic. When I'm coding Prolog, I worry about getting the interface of each predicate just right so that it means something and has the visible perfection of a new leaf. Richard A. O'Keefe: The Craft of Prolog. MIT Press, Cambridge MA, 1990

*

First, we want to establish the idea that a computer language is not just a way of getting a computer to perform operations but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute. Hal Abelson and Gerald Jay Sussman with Julie Sussman prefacing SICP

*

A language that doesn't affect the way you think about programming, is not worth knowing. […] A programming language is low level when its programs require attention to the irrelevant. […] The string is a stark data structure and everywhere it is passed there is much duplication of process. It is a perfect vehicle for hiding information. […] Syntactic sugar causes cancer of the semicolon. Alan J. Perlis: Epigrams in Programming

*

languages shape the way we think, or don't. Erik Naggum in comp.lang.lisp, January 2000

*

I often feel that the American programmer would profit more from learning, say, Latin than from learning yet another programming language. Edsger W. Dijkstra: On the fact that the Atlantic Ocean has two sides (EWD 611)

*

Besides a mathematical inclination, an exceptionally good mastery of one's native tongue is the most vital asset of a competent programmer. Edsger W. Dijkstra: How do we tell truths that might hurt? (EWD 498)

*

Strachey's first law of programming: Decide what you want to say before you worry about how you are going to say it. Dana S. Scott: foreword to Joseph E. Stoy's Denotational Semantics: The Scott-Strachey Approach to Programming Language Theory. MIT Press, Cambridge MA, 1977

*

By trying to turn our explanations and theories into designs for working systems, we soon discover their poverty. The computer, unlike academic colleagues, is not convinced by fine prose, impressive looking diagrams or jargon, or even mathematical equations. If your theory doesn't work then the behaviour of the system you have designed will soon reveal the need for improvement. Often errors in your design will prevent it behaving at all. Books don't behave. We have long needed a medium for expressing theories about behaving systems. Now we have one, and a few years of programming explorations can resolve or clarify some issues which have survived centuries of disputation. Aaron Sloman: The Computer Revolution in Philosophy

*

It's the nature of programming languages to make most people satisfied with whatever they currently use. Computer hardware changes so much faster than personal habits that programming practice is usually ten to twenty years behind the processor. At places like MIT they were writing programs in high-level languages in the early 1960s, but many companies continued to write code in machine language well into the 1980s. I bet a lot of people continued to write machine language until the processor, like a bartender eager to close up and go home, finally kicked them out by switching to a RISC instruction set. […] Ordinarily technology changes fast. But programming languages are different: programming languages are not just technology, but what programmers think in. They're half technology and half religion. And so the median language, meaning whatever language the median programmer uses, moves as slow as an iceberg. […] comparisons of programming languages either take the form of religious wars or undergraduate textbooks so determinedly neutral that they're really works of anthropology. People who value their peace, or want tenure, avoid the topic. But the question is only half a religious one; there is something there worth studying, especially if you want to design new languages. Paul Graham: Beating the Averages

*

languages are ecologies. Language features are not a priori good or bad. Rather, language features are good or bad in context, based on how well they interact with other language features. […] What do I consider is most important for an abstract language to support efficiently? My time. Time is the only true, non-renewable commodity. I eschew languages like C because they often waste enormous amounts of my time trying to develop and debug programs, and justify it on the basis of micro-differences in speed that have just never ended up mattering to me. I regard C as appropriate for use as an assembly language, but it doesn't provide enough high-level services for me. When I'm old and grey and look back on my life, I want to have done a lot of interesting things, not just have done a few interesting things but “boy were they fast”. […] I want my ideas to lead my technology and my tools, not to have my technology and tools leading my ideas. […] I also view the process of programming as a series of “times” at which decisions can be made: “coding time,” “parsing time” (Lisp calls this “read time”), “macro expansion time,” “compilation time,” “load time,” and “execution time.” Lisp gives me a great deal more control for each piece of code as to when it runs, so that it can run at the appropriate time when the data it depends on is known. Other languages, especially statically typed ones, often make me specify information too soon, before it is really known, which usually means “making up” answers instead of really knowing the answers. Sometimes that makes programs run faster. Sometimes it just makes them run wrong. Kent Pitman – Answers on Lisp and Much More, Slashdot 2001

*

remember that language design and language implementation are different tasks. When you are designing a new language feature first think like a user and then validate it as an implementor. Allen Wirfs-Brock in squeak-dev mailing list, January 2003

*

Each and every level of "containership" in SGML has its own syntax, optimized for the task. Each and every level has a different syntax for "the writing on the box" as opposed to "the contents of the box". This follows from a very simple, yet amazingly elusive principle in its design: Meta-data is conceptually incompatible with data. This is in fact wrong. Meta-data is only data viewed from a different angle, and vice versa. SGML forces you to remain loyal to your chosen angle of view. […] My goal is to get rid of the idea that there is a distinction that can be made once and for all, and prematurely at that, that some information is meta-data and some information is data. The core philosophical mistake in SGML is that you can specify these things before you know them. SGML is great for after-the-fact description of structures you already know how to deal with perfectly. It absolutely sucks for structures that are in any way yet to be defined. This is because it is impossible to define what is considered meta-information and what is considered information before you actually have a full-blown software application that is hard to change your mind about. SGML was supposedly designed to free data from the vagaries of software, but when it adopted the attribute-content dichotomy, it dove right into dependency on the software design process instead of the information design process. […] Just like Plato and Aristotle agreed that ideas and concepts were somehow "inherent" in the things we saw and not a property of the person who observed and organized them in his own mind, SGML embodies the false premise that structuring has some inherent qualities and processing that structure should reflect its inherent qualities. The result is that the processing defines the structure. If there is a mismatch between the two, the result is a very painful and elaborate processing. Erik Naggum in comp.lang.lisp, August 2001

*

It's like driving a tank: if you drift off the road, any telephone poles you might knock down are only proof positive of your mighty tank's ability to get where you want to go, and the important psychological “corrector” that hindrances should have been is purposefully ignored because you are too powerful. Erik Naggum in comp.lang.lisp, June 1999

*

The computers are never large enough or fast enough. Each breakthrough in hardware technology leads to more massive programming enterprises, new organizational principles, and an enrichment of abstract models. Every reader should ask himself periodically “Toward what end, toward what end?” — but do not ask it too often lest you pass up the fun of programming for the constipation of bittersweet philosophy. Alan J. Perlis forewording SICP

*

1. I have done my share of semantics.

2. I am doing my share of systems building.

3. And I really wish that I could say 1 and 2 are related. Matthias Felleisen in types mailing-list, November 2002

*

the computer, by virtue of its fantastic speed, seems to be the first [technology] to provide us with an environment where highly hierarchical artefacts are both possible and necessary. This challenge, viz. the confrontation with the programming task, is so unique that this novel experience can teach us a lot about ourselves. It should deepen our understanding of the processes of design and creation, it should give us better control over the task of organizing our thoughts. If it did not do so, to my taste we should not deserve the computer at all! Edsger W. Dijkstra: The humble programmer (Turing award lecture, EWD 340)

*

Computing's core challenge is how not to make a mess of it. […] prevention is better than cure, in particular if the illness is unmastered complexity, for which no cure exists. […] It is time to unmask the computing community as a Secret Society for the Creation and Preservation of Artificial Complexity. Edsger W. Dijkstra: The next forty years (EWD 1051)

*

in the practice of computing, where we have so much latitude for making a mess of it, mathematical elegance is not a dispensable luxury, but a matter of life and death. Edsger W. Dijkstra: My Hopes of Computing Science (EWD 709)

*

In the interest of clarity it seemed to me unavoidable to repeat myself frequently, without paying the slightest attention to the elegance of the presentation; I conscientiously followed the precept of the brilliant theorist L. Boltzmann, to leave elegance to tailors and cobblers. Albert Einstein: Über die spezielle und die allgemeine Relativitätstheorie

*

Software quality has almost nothing to do with algorithmic elegance, compactness, or speed — in fact, those attributes do more harm to quality than good. […] The objective is to make things as clear as possible to the designer and to yourself, and excessive formality can destroy clarity just as easily as modest formality can enhance it. […] [The Spec] Be literal in your interpretation and smile when the designer accuses you of semantic nit-picking. […] In programming, it's often the buts in the specification that kill you. […] Don't squeeze the code. Don't squeeze the code. DON'T SQUEEZE THE CODE. […] Like so much in testing, the act of getting the information on which to base tests can be more effective at catching and exterminating bugs than the tests that result from that information. Insisting on getting transaction flows or the equivalent is sometimes a gentle way of convincing inept design groups that they don't know what they're doing. These are harsh words, but let's face it: superb code and unit testing will be useless if the overall design is poor. And how can there be a rational, effective design if no one on the design team can walk you through the more important transactions, step by step and alternative by alternative? I'm sure that mine is a biased sample, but every system I've ever seen that was in serious trouble had no transaction flows documented, nor had the designers provided anything that approximated that kind of functional representation; however, it's certainly possible to have a bad design even with transaction flows. Boris Beizer: Software Testing Techniques 2E. Van Nostrand Reinhold, New York 1990

*

the designer may attempt to cover him or herself by specifying a more complicated, and more general solution to certain problems, secure in the knowledge that others will bear the burden of constructing these artifacts. When such predictions about where complexity is needed are correct, they can indeed be a source of power and satisfaction. This is part of the allure of Venustas. However, sometimes the anticipated contingencies never arise, and the designer and implementers wind up having wasted effort solving a problem that no one has ever actually had. Other times, not only is the anticipated problem never encountered, its solution introduces complexity in a part of the system that turns out to need to evolve in another direction. In such cases, speculative complexity can be an unnecessary obstacle to subsequent adaptation. It is ironic that the impulse towards elegance can be an unintended source of complexity and clutter instead. Brian Foote and Joseph Yoder: Big Ball of Mud

*

Paradox: by not considering the future of your code, you make your code much more likely to be adaptable in the future. Kent Beck: Test-Driven Development. Addison-Wesley, Boston 2002

*

Not getting lost in the complexities of our own making and preferably reaching that goal by learning how to avoid the introduction of those complexities in the first place, that is the key challenge computing science has to meet. […] Nowadays machines are so fast and stores are so huge that in a very true sense the computations we can evoke defy our imagination. Machine capacities now give us room galore for making a mess of it. Opportunities unlimited for fouling things up! Developing the austere intellectual discipline of keeping things sufficiently simple is in this environment a formidable challenge, both technically and educationally. […] The sore truth is that complexity sells better. (It is not only the computer industry that has discovered that.) And it is even more diabolical in that we even use the complexity of our own constructs to impress ourselves. […] It is a genuine sacrifice to part from one's ingenuities, no matter how contorted. Also, many a programmer derives a major part of his professional excitement from not quite understanding what he is doing, from the daring risks he takes and from the struggle to find the bugs he should not have introduced in the first place. Edsger W. Dijkstra: The threats to computer science (EWD 898)

*

The complexity of software is an essential property, not an accidental one. Hence descriptions of a software entity that abstract away its complexity often abstract away its essence. Frederick P. Brooks, Jr.: No Silver Bullet — Essence and Accident in Software Engineering. In: The Mythical Man-Month (anniversary ed.) Addison-Wesley, Reading MA, 1995

*

Let's face it: the average programmer is a QWERTY programmer. He is stuck with old notations, like FORTRAN and COBOL. More importantly he has been thinking with two fingers, using the same mental tools that were used at the beginnings of computer science, in the 1940s and 1950s. True, “structured programming” has helped, but even that, by itself, is not enough. To put it simply, the mental tools available to programmers have been inadequate. David Gries: The science of programming. Springer-Verlag, New York, 1981.

*

The programmer should not ask how applicable the techniques of sound programming are, he should create a world in which they are applicable: it is his only way of delivering a high-quality design. Edsger W. Dijkstra: Answers to questions from students of Software Engineering (EWD 1305)

*

Being a better programmer means being able to design more effective and trustworthy programs and knowing how to do that efficiently. It is about not wasting storage cells or machine cycles and about avoiding those complexities that increase the number of reasoning steps needed to keep the design under strict intellectual control. What is needed to achieve this goal, I can only describe as improving one's mathematical skills, where I use mathematics in the sense of “the art and science of effective reasoning”. Edsger W. Dijkstra: Why American Computing Science seems incurable (EWD 1209)

*

OO is like the Bible in that which scripture is to be interpreted metaphorically, and which is to be interpreted literally, is entirely a function of the religious agenda of the commentator. My own advice is to keep in mind that the stuff of computer programs is nothing but metaphor. Thant Tessman in comp.object, August 2003

*

The object-oriented model makes it easy to build up programs by accretion. What this often means, in practice, is that it provides a structured way to write spaghetti code. This is not necessarily bad, but it is not entirely good either. A lot of the code in the real world is spaghetti code, and this is probably not going to change soon. For programs that would have ended up as spaghetti anyway, the object-oriented model is good: they will at least be structured spaghetti. But for programs that might otherwise have avoided this fate, object-oriented abstractions could be more dangerous than useful. Paul Graham: ANSI Common Lisp

*

Bricks are just too limited, and the circumstances where they make sense are too constrained to serve as a model for building something as diverse and unpredictable as a city. And further, the city itself is not the end goal, because the city must also –in the best case– be a humane structure for human activity, which requires a second set of levels of complexity and concerns. Using this metaphor to talk about future computing systems, it's fair to say that OO addresses concerns at the level of bricks. […] Despite the early clear understanding of the nature of software development by OO pioneers, the current caretakers of the ideas have reverted to the incumbent philosophy of perfect planning, grand design, and omniscience inherited from Babbage's theology. Richard P. Gabriel: Objects have failed. OOPSLA 2002

*

C++ is like teenage sex: It's on everyone's mind all the time.

Everyone talks about it all the time.

Everyone thinks everyone else is doing it.

Almost no one is really doing it.

The few who are doing it are doing it poorly; are sure it will be better next time; and are not practicing it safely.

allegedly a toilet graffito at the Technion CS department in Haifa, Israel, 1993-11-08

*

When your hammer is C++, everything begins to look like a thumb. Steve Haflich in alt.lang.design, December 1994

*

Being really good at C++ is like being really good at using rocks to sharpen sticks. Thant Tessman in comp.lang.scheme, December 1996

*

(Of course SML does have its weaknesses, but by comparison, a discussion of C++'s strengths and flaws always sounds like an argument about whether one should face north or east when one is sacrificing one's goat to the rain god.) Thant Tessman in comp.lang.scheme, April 1997

*

the effort that has gone into extending and 'refining' C++ (and to its creation in the first place) must be largely attributed to the unwillingness of its advocates to give up the effort they've already put into learning and using it (and C before it). Strangely, this is simultaneously despite and because of the fact that C++ is not one of the easier languages to learn. Thant Tessman in comp.lang.c++.moderated, July 2000

*

As for C++ – well, it reminds me of the Soviet-era labor joke: “They pretend to pay us, and we pretend to work.” C++ pretends to provide an object-oriented data model, C++ programmers pretend to respect it, and everyone pretends that the code will work. The actual data model of C++ is exactly that of C, a single two-dimensional array of bits, eight by four billion, and all the syntactic sugar of C++ fundamentally cannot mask the gaping holes in its object model left by the cast operator and unconstrained address arithmetic. Guy L. Steele: Objects have not failed. OOPSLA 2002

*

My projection is that all language features, particularly the most powerful and expressive, cleanest and most beautiful, will eventually find their expression in C++ in the most broken, horrid, ugly, bastardized, barely recognizable form conceivable. Maybe I'm alone in my hope that C++ would meet its bitter end sooner rather than later, but the worrying trend seems to be that smart and well-meaning people keep falling into its seductive traps and perpetuating its farcical existence. Ben L. Titzer in Lambda the Ultimate, 2012-02-21

*

The C language (invented by Bell Labs — the people who were supposed to be building products with five 9's of reliability – 99.999%) then taught two entire generations of programmers to ignore buffer overflows, and nearly every other exceptional condition, as well. A famous paper in the Communications of the ACM found that nearly every Unix command (all written in C) could be made to fail (sometimes in spectacular ways) if given random characters (“line noise”) as input. And this after Unix became the de facto standard for workstations and had been in extensive commercial use for at least 10 years. The lauded “Microsoft programming tests” of the 1980's were designed to weed out anyone who was careful enough to check for buffer overflows, because they obviously didn't understand and appreciate the intricacies of the C language. I'm sorry to be politically incorrect, but for the ACM to then laud “C” and its inventors as a major advance in computer science has to rank right up there with Chamberlain's appeasement of Hitler. Henry Baker: “Buffer Overflow” security problems

*

Now I want to argue that worse-is-better is better. C is a programming language designed for writing Unix, and it was designed using the New Jersey approach. C is therefore a language for which it is easy to write a decent compiler, and it requires the programmer to write text that is easy for the compiler to interpret. Some have called C a fancy assembly language. Both early Unix and C compilers had simple structures, are easy to port, require few machine resources to run, and provide about 50%–80% of what you want from an operating system and programming language. Half the computers that exist at any point are worse than median (smaller or slower). Unix and C work fine on them. The worse-is-better philosophy means that implementation simplicity has highest priority, which means Unix and C are easy to port on such machines. Therefore, one expects that if the 50% functionality Unix and C support is satisfactory, they will start to appear everywhere. And they have, haven't they? Unix and C are the ultimate computer viruses. Richard P. Gabriel: Lisp - Good News, Bad News, How to Win Big

*

The most irritating part of Unix IMHO is not the design of the kernel (yeah, yeah it's a monolithic spaghetti ball) or the functionality of system calls (yeah, yeah, no PCLSRing) or the unrecoverability of kernel panics, or whatever else is associated with the kernel and driver implementations. What's really irritating about the Unix design is all the institutionalized crufty software still floating around after thirty years of development, redesign, and redevelopment. Unix hackers have long spent time hacking on the hardware support, improving process scheduling, memory management, and the like, but they still live with an interface that feels just like 2.9BSD on a PDP-11/40, with some frills. It's disgusting. Everything from the init process on upwards is institutionalized, designed just like it was on the good old minicomputers. (I'm not degrading the Unix (or UN*X as it were) of that era, nor the machines it ran on, many of which I'm enamored of and wish I could own. I'm criticizing the stubbornness of an operating system that dates from that era and appears to be little changed from it.) James A. Crippen in comp.lang.scheme, April 2000

*

Unix was not designed to support a serious artificial intelligence. It was designed to be an “operating system”, on the assumption that an “operating system” need not try to be an artificial life form. So Unix doesn't have the capabilities of one. It lacks a soul; it lacks reproductive objects from which it can make bootable upgrades. It fakes having orgasms, which works, but has to be slow. Unix is ok for writing open systems, but when you push it beyond that, it becomes Solaris. Stallard Richman: Why you should not use Unix.

*

“The wonderful thing about Unix standards is that there are so many to choose from.” You may be totally bewildered about the multitude of various standards that exist. Rest assured that nowhere in this manual will you encounter an attempt to spell it all out for you; you could not read and internalise such a twisted account without bleeding from the nose and ears. Olin Shivers (supplementing Andrew S. Tanenbaum): Scsh Reference Manual

*

Yes, dear?

No, I wasn't complaining....

Yes, I understand unix is the waveofthefuture....

Yes, unix is the equivalent of a programmer/sysadmin Full Employment Act....

No, lusers will never understand unix, so I'll always have a job if I learn it....

Yes, unix is free (and worth every cent)....

No, no two unix installations are alike, so there's plenty of job security for everyone....

No, nothing ever gets fixed in unix so there's more job security.... Joe Bednorz: Gawd I miss VMS in e.mail to Richard Levitte, March 1996

*

“bash awk grep perl sed df du, du-du du-du, vi troff su fsck rm * halt LART LART LART!” — the Swedish BOFH SD's signature, list.unix-haters in November 1998

*

Shell programming terrifies me. There is something about writing a simple shell script that is just much, much more unpleasant than writing a simple C program, or a simple COMMON LISP program, or a simple Mips assembler program. Is it trying to remember what the rules are for all the different quotes? Is it having to look up the multi-phased interaction between filename expansion, shell variables, quotation, backslashes and alias expansion? Maybe it's having to subsequently look up which of the twenty or thirty flags I need for my grep, sed, and awk invocations. Maybe it just gets on my nerves that I have to run two complete programs simply to count the number of files in a directory ( ls | wc -l ), which seems like several orders of magnitude more cycles than was really needed. Whatever it is, it's an object lesson in angst. Olin Shivers: A Scheme Shell

*

If the designers of X-Windows built cars, there would be no fewer than five steering wheels hidden about the cockpit, none of which followed the same principles — but you'd be able to shift gears with your car stereo. Useful feature, that. Marcus J. Ranum: motto chosen by Don Hopkins

*

It's criminally negligent to ship a product that is incapable of keeping the input focus up to date with the cursor position, when you have the technology to do so. Your xtrek has paged the window manager out of core, and the console beeps and you suddenly need to move the cursor into the terminal emulator and type the command to keep the reactor from melting down, but the input focus stays in the xtrek for three seconds while the window manager pages in, but you keep on typing, and the keys slip right through to xtrek, and you accidentally fire off your last photon torpedo and beam twelve red shirt engineers into deep space! Don Hopkins: Window Manager Flames

Perl's gluing ability goes beyond computation, to people. To the poor and have-nots. It unites people in the computing field who are not endowed with fancy engaging brains. It is the sanctuary of dunces. The expressions of those thoughtless. The godsend for brainless coders. The means and banner of sys admins. The lingua franca of trial-and-error hackers. The song and dance of stultified engineers. I'm also a Perler. Share a secret with me: When you are cornered by mathematicians or the like, who are about to speak lamba or something we don't understand, what do you do? Of course, flip out the little Swiss Army Knives in our pockets, and splutter #%$@ syntaxes that is equally abstruse, and we feel safe and secure. Fuck geniuses in this world. Leave Perlers along. Larry Wall for President. The three principal virtues of . . . I think comrade glauber is incorrect. First of all, he got our mantra wrong. It is: The three characteristics of Perl programers: mundaneness, sloppiness, and fatuousness. Secondly, our language is not evolved to support no fucking real life no shit. Our language, is designed to be a fuckup from the very beginning. Designed, to fuck up those computer scientists. Fuck up their teachings. Fuck up their students. Fuck up their language. Fuck up their correctness. Fuck up their fucking theoretical theories. Remember, P is for Practical. Xah Lee in comp.lang.lisp, December 2000

*

Perl did some things well: It transcended implementation differences by staring them in the eye and fighting it out, not by giving up, whining that something isn't standard and portable, etc. It gave the bad standards and their nonsensical implementation differences the finger and wrote its own standard. For the kinds of tasks Perl does well, it is downright impressively portable. You need the ever expanding Perl book, but not the tens or hundreds of shelf-feet of manuals that you used to have to deal with. Perl has created its own operating system interface precisely by being an incredibly ugly implementation, and I'll give Larry Wall this, but not much else: He did fully understand the value of uniform external behavior of a tool and he was willing to pay the price to get it. There is no excuse for the language he created in order to do this, however. Erik Naggum in comp.lang.lisp, November 2000

*

the perl programmer who veers off the road into the forest will get out of his car and cut down each and every tree that blocks his progress, then drive a few meters and repeat the whole process. whether he gets where he wanted to go or not is immaterial — a perl programmer will happily keep moving forward and look busy. … it's not that perl programmers are idiots, it's that the language rewards idiotic behavior in a way that no other language or tool has ever done. Erik Naggum in comp.lang.lisp, March 2000

*

What really pisses me off with Perl is that people work so hard doing so terribly little while they think they have worked very little doing something really nifty. Fools! Erik Naggum in comp.lang.lisp, October 2000

*

Software Engineering is Programming when you can't. … We must give industry not what it wants, but what it needs. Edsger W. Dijkstra quoted in the program (10M .pdf) of his birthday symposium, Austin TX, 2000

*

Another series of [philosopher's] stones in the form of “programming tools” is produced under the banner of “software engineering”, which, as time went by, has sought to replace intellectual discipline by management discipline to the extent that it has now accepted as its charter “How to program if you cannot”. Edsger W. Dijkstra: The threats to computer science ( EWD 898)

*

In the mean time, software engineering has become an almost empty term, as was nicely demonstrated by Data General who overnight promoted all its programmers to the exalted rank of “software engineer”! But for the managing community it was a godsend which now covers a brew of management, budgeting, sales, advertising and other forms of applied psychology. Ours is the task to remember (and to remind) that in what is now called “software engineer”, not a single sound engineering principle is involved. (On the contrary, its spokesmen take the trouble of arguing the irrelevance of the engineering principles known.) Software Engineering as it is today is just humbug; from an academic –i.e. scientific and educational– point of view it is a sham, a fraud. Edsger W. Dijkstra: There is still a war going on ( EWD 1165)

*

The required techniques of effective reasoning are pretty formal, but as long as programming is done by people that don't master them, the software crisis will remain with us and will be considered an incurable disease. And you know what diseases do: they invite the quacks and charlatans in, who in this case take the form of Software Engineering Gurus. … In the software business there are many enterprises for which it is not clear that science can help them; that science should try is not clear either. Edsger W. Dijkstra: Answers to questions from students of Software Engineering ( EWD 1305)

*

In a cruel twist of history, however, American society has chosen precisely the 20th Century to become more and more a-mathematical (...), and we have reached the paradoxical state that, of all so-called developed nations, the USA is the most dependent on programmed computers and intellectually the worst equipped to be so. The suggestion that the programming problem could be amenable to mathematical treatment is, if heard at all, instantaneously rejected as being totally unrealistic. As a result, Program Design is prevented from becoming a subdiscipline of Computing Science. (...) And in the mean time, programming methodology –renamed “software engineering”– has become the happy hunting-ground for the gurus and the quacks. Edsger W. Dijkstra: Why American Computing Science seems incurable ( EWD 1209)

*

The problems of business administration in general and data base management in particular are much too difficult for people that think in IBMerese, compounded with sloppy English. Edsger W. Dijkstra: How do we tell truths that might hurt? ( EWD 498)

*

With the caveat that there's no reason anybody should care about the opinions of a computer scientist/mathematician like me regarding software development, let me just say that almost everything I've ever heard associated with the term “extreme programming” sounds like exactly the wrong way to go … with one exception. The exception is the idea of working in teams and reading each other's code. That idea is crucial, and it might even mask out all the terrible aspects of extreme programming that alarm me. I also must confess to a strong bias against the fashion for reusable code. To me, “re-editable code” is much, much better than an untouchable black box or toolkit. I could go on and on about this. If you're totally convinced that reusable code is wonderful, I probably won't be able to sway you anyway, but you'll never convince me that reusable code isn't mostly a menace. Donald E. Knuth interviewed by Andrew Binstock; 2008-04-25

*

Robert and I both knew Lisp well, and we couldn't see any reason not to trust our instincts and go with Lisp. We knew that everyone else was writing their software in C++ or Perl. But we also knew that that didn't mean anything. If you chose technology that way, you'd be running Windows. … During the years we worked on Viaweb I read a lot of job descriptions. A new competitor seemed to emerge out of the woodwork every month or so. The first thing I would do, after checking to see if they had a live online demo, was look at their job listings. After a couple years of this I could tell which companies to worry about and which not to. The more of an IT flavor the job descriptions had, the less dangerous the company was. The safest kind were the ones that wanted Oracle experience. You never had to worry about those. You were also safe if they said they wanted C++ or Java developers. If they wanted Perl or Python programmers, that would be a bit frightening — that's starting to sound like a company where the technical side, at least, is run by real hackers. If I had ever seen a job posting looking for Lisp hackers, I would have been really worried. Paul Graham: Beating the Averages

*

While we were doing Viaweb, we took a good deal of heat from pseudo-technical people like VCs and industry analysts for not using a database — and for using cheap Intel boxes as servers, running FreeBSD. But when we were getting bought by Yahoo, we found that they also just stored everything in files — and all their servers were also cheap Intel boxes running FreeBSD. (During the Bubble, Oracle used to run ads saying that Yahoo used Oracle software. I found this hard to believe, so I asked around. It turned out the Yahoo accounting department used Oracle.) Paul Graham on database-backed web applications

*

A company that made programmers wear suits would have something deeply wrong with it. Paul Graham: What the Bubble Got Right

*

Technology is part of the answer, not part of the question. Don't make choices only to then try to figure out how to twist the problem in such a way so as to fit your choice. This will often result in your solution being more convoluted than my previous sentence. Curtis Poe: Finding Technology Solutions, Perl Monks 2001-07-23

*

Those who use databases usually do so because they master no data structures of their own. … A few years from now, programming will have been revolutionized and lots and lots of work will be done by software that writes itself. This will require massive talent and intelligence and thinking outside the box and what have you, but for the time being, programming is a “consumer” job, “assembly line” coding is the norm, and what little exciting stuff is being performed is not going to make it compared to the mass-marketed crap sold by those who think they can surf on the previous half-century's worth of inventions forever. This will change, however, and those who know Common Lisp will be relieved of reinventing it, like the rest are doing, even in the “tools” world, badly. Erik Naggum in comp.lang.lisp, June 2001

*

<soapbox> I'm wary of managers who want to “comprehend easily” any arbitrarily complicated topic. take operating systems. managers choose Microsoft because everybody else does. take programming languages. managers choose C++ because everybody else does. take design methodologies. managers choose object-orientation because everybody else does. thusly chosen, they will lead to disaster and managers will learn that they have made a mistake and go on to choose another mistake exactly the same way. if they succeed, it's pure accident, and never to their credit, except for hiring technically sound people who are willing to work for people who will take the credit for their work. this is the software crisis, if you ask me. </soapbox> Erik Naggum in comp.text.sgml, December 1994

*

there is no aspect of database management that is not characterized by lack of foundation knowledge and riddled with confusion (...) As long as the industry neither requires, nor rewards knowledge of fundamentals, why should we expect anything else? Fabian Pascal Responds, On Denormalization and Repeating Groups

*

[on the enterprise service bus (ESB)] ESB-oriented architecture is inherently flawed in that it builds connectivity no one might ever want to use. The business does not derive additional value until systems connect to each other and are working together. Until then, the ESB is just cost with no benefit. It might make the IT department feel good because they've built something, but it won't make the business feel any better, because the business isn't accomplishing anything it couldn't have already accomplished without the ESB. The ESB becomes the equivalent of a human appendix for the IT department, a vestigial organ within the topology of deployed applications. ESB-oriented architecture: The wrong approach to adopting SOA, by Bobby Woolf

*

there's nothing wrong with XML and web services you couldn't fix by removing the XML and web services parts. Re: Simplifying Email With Web Services, by sammybaby @ The Daily WTF, 2006-03-08

*

People assume that computer technology moves forward at a rapid clip, yet no eyebrows were raised when Apple said that its big step forward was going to be licensing the NeXT OS. This is Steve Jobs's late 1980s facelift of Carnegie-Mellon University's early 1980s rewrite (Mach) of Bell Labs' early 1970s Unix operating system. Maybe it is better than Windows NT but, if so, that only makes it a more damning condemnation of the software industry. Shortly before Apple acquired NeXT, I'd had a foot operation. I disclosed in the pre-op interview that I'd been taking aspirin. The hospital wanted to make sure that my blood would clot adequately so they brought in a phlebotomist who applied what looked like a self-inking rubber stamp to my forearm. Blood soon began to flow. I'd been a regular blood donor and had never fainted or thrown up, but somehow the sight of my blood just oozing out onto my arm was more sickening than collecting in a bag. I managed to control my nausea for the first five minutes but then the phlebotomist got bored and asked me what I did for a living. “Oh, you're in computers? Do you know that really smart guy?”

I thought for a moment. “Do you mean Bill Gates?”

“Yes, that's the one. What did he invent?” Philip Greenspun: Envisioning a Site That Won't Be Featured In suck.com

*

there is only one solution: do not use Microsoft products. do not expose yourself to anything they do. the day will come when it is much more important in presidential campaigns and to careers in general that you have not used Microsoft than that you have not taken certain drugs. the day will come when we figure out which planet Bill Gates was thrown out of and then we can go blow it up. in the meantime, resistance is not futile. only the weak of mind will be assimilated, and they are no loss, anyway. “Microsoft hired his head. It's in a jar in Redmond.” — Dilbert Erik Naggum in comp.lang.lisp, October 1998

*

if you believe Microsoft's propaganda and you wind up ripped off and naked, do you become an emperor? Erik Naggum in comp.lang.lisp, December 1998

*

The best approach is to get a suitable hand-gun and use your Microsoft and C++ books for target practice. That gives you control, again. Then start anew with a real language and tools intended for people to use, not whatever semi-evolved simian the commercials work on. Erik Naggum in comp.lang.lisp, October 2000

*

Robbery is not just another way of making a living, rape is not just another way of satisfying basic human needs, torture is not just another way of interrogation. And XML is not just another way of writing S-exps. There are some things in life that you do not do if you want to be a moral being and feel proud of what you have accomplished. … If GML was an infant, SGML is the bright youngster who far exceeds expectations and makes its parents too proud, but XML is the drug-addicted gang member who had committed his first murder before he had sex, which was rape. … The question of what we humans need to read and write no longer has any bearing on what the computers need to work with. One of the most heinous crimes against computing machinery is therefore to force them to parse XML when all they want is the binary data. As an example, think of the Internet Protocol and Transmission Control Protocol in XML terms. Implementors of SNMP regularly complained that parsing the ASN.1 encodings took a disproportionate amount of processing time, but they also acknowledged that properly done, it mapped directly to the values they needed to exchange. Now, think of what would have happened had it not been a Simple, but instead some moronic excuse for an eXtensible Network Management Protocol. Erik Naggum in comp.lang.lisp, December 2002

*

VMS is like a Soviet railroad train. It's basically industrial-strength, but when you look at it closely, everything's a little more shabby than you might like. It gets the job done, but there's no grace to it. The Mac operating system is like the monorail at Disney World. It's kind of spectacular and fun, but it doesn't go much of anywhere. Still, the kids like it. Unix is like the maritime transit system in an impoverished country. The ferryboats are dangerous as hell, offer no protection from the weather and leak like sieves. Every monsoon season a couple of them capsize and drown all the passengers, but people still line up for them and crowd aboard. MS-DOS is like the US rail system. It's there, but people just ignore it and find other ways of getting where they want to go. Posted by Paul A. Vixie to rec.humor in March 1991

*

UNIX: you think it won't work, but if you find the right wizard, you can make it work. Macintosh: you think it will work, but it won't. PC/Windows: you think it won't work and it won't. Philip Greenspun's personal viewpoint

*

Greenspun's Tenth Rule of Programming: “Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.” Philip Greenspun—but where?

*

You're posting to a Scheme group. Around here, arguing that Java is better than C++ is like arguing that grasshoppers taste better than tree bark. Thant Tessman in comp.lang.scheme, June 2000

*

Also, Scheme is more fun to program in because Scheme tends to let you 'feel' your way to a working program, whereas SML forces you to think your way there. But we have to remember that they're both so far beyond C++ that it's silly to argue about them. Thant Tessman in comp.lang.scheme, July 1998

*

The ML camp is too much into the “we'll dictate the semantic framework” mindset, and the Scheme camp is too much into the “We won't dictate anything at all—heck we won't even provide anything—you're on your own” mindset. Paul R. Wilson in comp.lang.scheme, July 1998

*

I always tell my students that Scheme is my second favorite programming language. After they recover from this statement, they naturally always ask what my favorite language is, to which I respond with “I am still working on it.” That is, really experienced programmers always try to improve on their major mode of thought. Matthias Felleisen in comp.lang.scheme, October 2003

*

Sun's efforts with Java showed how amateurish and underfinanced every previous effort to establish a (more or less) general-purpose language had been. The U.S. Department of Defense's effort to establish Ada as a dominant language was a sharp contrast, as were the unfinanced efforts by me and my friends to establish C++.

I can't say that I approve of some of the Java tactics, such as selling top-down to nonprogramming executives, but it shows what can be done. Bjarne Stroustrup: C++ [.pdf 506 KB]; chapter in Biancuzzi / Warden: Masterminds of Programming. Conversations with the Creators …

*

The clearest moral of the above is never to introduce terms (like “disposable”) on the basis of “you know what I mean, don't you?”. To hell with the “meaningful identifiers”! Edsger W. Dijkstra: To hell with “meaningful identifiers”! ( EWD 1044)

*

[On the recommendation to “initialize every variable as soon as it comes into scope”]

This is one of the totally mindless rules which in practice leads to bad code. I am sick of seeing C code which initialises variables to values that never get used; that is lying to the reader, and lying to the reader is never a good idea. Variables should only ever be initialised when you have a value that you intend to use that you can initialise them with. Richard A. O'Keefe in squeak-dev mailing list, September 2003
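
O'Keefe's complaint fits in a few lines of C (a hedged sketch; the function names are invented). In the first version the initializer on the index is a dead store, unconditionally overwritten by the loop before any read, so the `= 0` only suggests a significance it does not have; the second version declares the variable where its first meaningful value appears.

```c
/* Rule-following version: every variable initialized at declaration,
 * whether or not a meaningful value exists yet. */
int sum_dead_init(const int *xs, int n) {
    int total = 0;   /* fine: 0 is genuinely the intended start value */
    int i = 0;       /* dead store: the for loop assigns i before any
                        read, so this 0 lies to the reader */
    for (i = 0; i < n; i++)
        total += xs[i];
    return total;
}

/* Initialize only at the point where the intended value exists. */
int sum(const int *xs, int n) {
    int total = 0;
    for (int i = 0; i < n; i++)
        total += xs[i];
    return total;
}
```

Both compute the same sum; compilers such as GCC will flag the first form with dead-store warnings, which is the mechanical version of O'Keefe's objection.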

*

If someone didn't understand their code and its likely uses well enough to write brief useful comments, why should I imagine that they understood it well enough to write code that works? … It's a really dreadful name! Compatibility is the only excuse I'd accept for putting up with such a misleading and unhelpful name. But it's a good reason, within reason. Richard A. O'Keefe in squeak-dev mailing list, June 2003

*

Realistically, the practice of putting untested code into systems is common, and so are system failures. The excuse I've most often heard for putting in untested code is that there wasn't enough time or money left to do the testing. If there wasn't enough time and money to test the routine, then there wasn't enough time and money to create it in the first place. What you think is code, before it has been properly tested, is not code, but the mere promise of code — not a program, but a perverse parody of a program. If you put such junk into a system, its bugs will show, and because there hasn't been a rigorous unit test, you'll have a difficult time finding the bugs. As Hannah Cowley said, “Vanity, like murder, will out.” For it's vanity to think that untested code has no bugs, and murder to put such code in. It is better to leave out untested code altogether than to put it in. Code that doesn't exist can't corrupt good code. A function that hasn't been implemented is known not to work. An untested function may or may not work itself (probably not), but it can make other things fail that would otherwise work. In case I haven't made myself clear, leaving untested code in a system is stupid, shortsighted, and irresponsible. … GIGO (“Garbage-in equals garbage-out”) is no explanation for anything except our failure to test the system's tolerance for bad data. Garbage shouldn't get in — not in the first place or in the last place. Every system must contend with a bewildering array of internal and external garbage, and if you don't think the world is hostile, how do you plan to cope with alpha particles? … But to be really diabolical takes organization, structure, discipline, and method. Taking random potshots and waiting for inspiration with which to victimize the programmer won't do the job. Syntax testing is a primary tool of dirty testing, and method beats sadism every time. … A good threat is worth a thousand tests. Boris Beizer: Software Testing Techniques 2E. Van Nostrand Reinhold, New York 1990

*

Tests are the Programmer's Stone, transmuting fear into boredom. Kent Beck: Test-Driven Development. Addison-Wesley, Boston 2002

*

Finally, the idioms of a language are useful as a sociological exercise (“How do the natives of this linguistic terrain cook up a Web script?”), but it's dangerous to glean too much from them. Idioms are fundamentally human, therefore bearing all the perils of faulty, incomplete and sometimes even outlandish human understanding. Shriram Krishnamurthi: Programming Languages: Application and Interpretation

*

If you're focused on making it look “lispy”, IMO, you're off track. The real lisp programs I have any respect for look like their “application domain”, not like other programs in the language—unless those programs happen to be about the same domain. Kent Pitman in comp.lang.lisp, December 2001

*

disturb lisp as little as possible. In spirit, a program ought to be as much as possible like a modification of the language, rather than a separate application written in it. Making programs harmonize with Lisp makes them more robust, like a machine whose parts fit together well. It also saves effort; sometimes you can make Lisp do a surprising amount of work for you. Paul Graham: On Lisp

*

[ Q: who's doing the better job – academia or industry? A: ] Depends on what you want. If you want a great deal of hot air, some papers nobody will read, and software that's theoretically interesting but impractical, produced at moderate expense before being abandoned, academia is the place to be. On the other hand, if you want a poorly designed piece of junk that solves a trivial problem badly (if it works at all), produced very slowly at breathtakingly great expense, go for industry. As a third option, if you need software that solves a difficult problem well, but with a bafflingly arcane interface and no documentation, produced at minimal expense, try open-source software. It's not a matter of better or worse, really; each approach has its own uniquely valuable way of failing. Casey McCann in λ the ultimate, June 2010

*

I hope very much that computing science at large will become more mature, as I am annoyed by two phenomena that both strike me as symptoms of immaturity. The one is the widespread sensitivity to fads and fashions, and the wholesale adoption of buzzwords and even buzznotions. Write a paper promising salvation, make it a “structured” something or a “virtual” something, or “abstract”, “distributed” or “higher-order” or “applicative” and you can almost be certain of having started a new cult. The other one is the sensitivity to the market place, the unchallenged assumption that industrial products, just because they are there, become by their mere existence a topic worthy of scientific attention, no matter how grave the mistakes they embody. Edsger W. Dijkstra: My Hopes of Computing Science ( EWD 709)

*

I pray daily that more of my fellow-programmers may find the means of freeing themselves from the curse of compatibility. Edsger W. Dijkstra: The humble programmer (Turing award lecture, EWD 340)

*

(Personally, I think the world could benefit from an International League for the Derision of User-Friendliness.) Edsger W. Dijkstra: The threats to computer science (EWD 898)

*

A computer “user” isn't a real person of flesh and blood, with passions and brains. No, he is a mythical figure, and not a very pleasant one either. A kind of mongrel with money but without taste, an ugly caricature that is very uninspiring to work for. He is, as a matter of fact, such an uninspiring idiot that his stupidity alone is sufficient explanation for the ugliness of most computer systems. And oh! Is he uneducated! That is perhaps his most depressing characteristic. He is equally education-resistant as another equally mythical bore, the “average programmer”, whose solid stupidity is the greatest barrier to progress in programming. It is a sad thought that large sections of computing science are effectively paralyzed by the narrow-mindedness and other grotesque limitations with which a poor literature has endowed these influential mythical figures. Edsger W. Dijkstra: On Webster, Users, Bugs, and Aristotle (EWD 618)

*

User-friendliness is a word that never should have been invented. (...) The computer user, as functioning in the development of computer products, is not a real person of flesh and blood but a literary figure, the creation of literature, rather poor literature. (...) Now, if you start to analyze the many character traits of that literary figure, you discover that he is most uninspiring. He is stupid, education resistant if not education proof, and he hates any form of intellectual demand made on him; he cannot be delighted by something beautiful, because he lacks the education to appreciate beauty. Large sections of computer science are paralyzed by accepting this moron as their typical customer. (...) What is most needed is, at a number of good universities, a few strong departments of unfashionable computer science. Edsger W. Dijkstra interviewed by Rogier F. van Vlissingen, 1985

*

Java seems to have been designed to strike a careful balance between making the type system as obstructive as possible while minimizing any actual guarantees of correctness. Casey McCann in λ the ultimate, May 2010

*

Yes, as a name, xnor generalises well to the n-ary case: I'm confused completely independent of the number of arguments passed to the function. Olin Shivers in the SRFI-33 discussion archive

*

The other month I was told of a great invention called “the register window”. My spokesman was young but in my ears it sounded very familiar because I remembered the Burroughs B5000 of 30 years ago. So, if you have a bright and sound idea now, you can expect it to be hailed as novelty around the year 2015. Edsger W. Dijkstra: The next forty years (EWD 1051)

*

The main idea here is that in the end the programmers want to program in a very high level language, and the machine should be as configurable as possible towards helping the best conceived environment run as fast as possible. A secondary idea is that it is hard to design when you have your optimization hat on, and thus, if you want to make progress with interactive language design, you want to be able to start using your latest and greatest ideas with as little special optimization as possible. These are not goals that Intel and Motorola understood, anymore than they understood anything important about SW in general. The current caching schemes are rudimentary to say the least. The more interesting architectures today are the graphics accelerators — they don't do anything particularly new, but they at least have some notion of what they are supposed to do (and also what they don't have to do when Moore's Law makes it easy to have multiple processors). Alan Kay in squeak-dev mailing list, March 2003

*

In fact, flow charting is more preached than practiced. I have never seen an experienced programmer who routinely made detailed flow charts before beginning to write programs. Where organization standards require flow charts, these are almost invariably done after the fact. Many shops proudly use machine programs to generate this “indispensable design tool” from the completed code. I think this universal experience is not an embarrassing and deplorable departure from good practice, to be acknowledged only with a nervous laugh. Instead it is the application of good judgment, and it teaches us something about the utility of flow charts. Frederick P. Brooks, Jr.: The Mythical Man-Month. Addison-Wesley, Reading MA, 1995 (anniversary ed.)

*

A favorite subject for Ph.D. dissertations in software engineering is graphical, or visual, programming, the application of computer graphics to software design. Sometimes the promise of such an approach is postulated from the analogy with VLSI chip design, where computer graphics plays so fruitful a role. Sometimes the approach is justified by considering flow charts as the ideal programming design medium, and providing powerful facilities for constructing them. Nothing even convincing, much less exciting, has yet emerged from such efforts. I am persuaded that nothing will. In the first place, as I have argued elsewhere, the flow chart is a very poor abstraction of software structure. Indeed, it is best viewed as Burks, von Neumann, and Goldstine's attempt to provide a desperately needed high-level control language for their proposed computer. In the pitiful, multipage, connection-boxed form to which the flow chart has today been elaborated, it has proved to be essentially useless as a design tool — programmers draw flow charts after, not before, writing the programs they describe. (...) Whereas the difference between poor conceptual designs and great ones may lie in the soundness of design method, the difference between good designs and great ones surely does not. Great designs come from great designers. Software construction is a creative process. Sound methodology can empower and liberate the creative mind; it cannot enflame or inspire the drudge. Frederick P. Brooks, Jr.: No Silver Bullet — Essence and Accident in Software Engineering. In: The Mythical Man-Month (anniversary ed.) Addison-Wesley, Reading MA, 1995

*

Flowcharts have been falling out of favor for over a decade, and before another decade passes they'll be regarded as curious, archaic relics of a bygone programming era. Indeed, flowcharts are rarely used today; they're created mainly to satisfy obsolete documentation specifications. (...) I think that if you were to gather statistics over one big project or many small projects (that used the same paper), then the weight of the listings would correlate as well to the bug rate and test efforts as do lines of code. Yet, “lines of code” sounds reasonable and scientific, and “listing weight” seems to be an outrageous put-on. Who's putting whom on? The fact is, that it makes exactly as much sense (or nonsense) to say “This is a 230-gram program” as it does to say “This is a 500-line program.” (...) “Logic” is one of the most often used words in programmers' vocabularies but one of their least used techniques. (...) The barrier to reusable software has never been technical — it's been financial and managerial. Reusable software doesn't get built if no one is willing to pay the price. New languages and programming techniques won't do it — they never have. It takes bucks and guts. Boris Beizer: Software Testing Techniques 2E. Van Nostrand Reinhold, New York 1990

*

I have been told that one of the reasons for the longevity of the Roman bridges is that their designers had to stand under them when they were first used. It may be time to put a similar discipline into the software field. Henry Baker: “Buffer Overflow” security problems

*

We have gotten OK at building cathedrals — only we are building the same dozen or so over and over again. (...) The effect of ownership imperatives has caused there to be no body of software as literature. It is as if all writers had their own private companies and only people in the Melville company could read Moby-Dick and only those in Hemingway's could read The Sun Also Rises. Richard P. Gabriel and Ron Goldman: Mob Software: The Erotic Life of Code

*

$20 / paper for any research that doesn't have a business case and a deep-pocketed backer is completely untenable, and speculative or historic research that might require reading dozens of papers to shed some light on longstanding questions is basically impossible. There might have been a time when this was OK and everyone who had access to or cared about computers was already an IEEE/ACM member, but right now the IEEE – both as a knowledge repository and a social network – is a single point of a lot of silent failure. “$20 for a forty-year-old research paper” is functionally indistinguishable from “gone”. Mike Hoye on why old research papers should start at $0

*

(I have found that the worst possible thing you could do wrong in this world is to give people something that is more powerful than they are prepared to understand. Back when I believed that SGML was a brilliant idea, I did not understand that the people who were the intended users were completely unable to understand it, and that only those who were stupid enough not to realize it in time, would continue to work with it, and so they sat there with their excellent document production system with a clever markup system and thought it had to be useful for something grander, and now we have XML, a non-solution to a non-problem so brilliant that m4 no longer seems like a prank. We really need Gary Larson-style cartoons on the history of computer science.) Erik Naggum in comp.lang.lisp, January 2004

*

HTML represents the worst of two worlds. We could have taken a formatting language and added hypertext anchors so that users had beautifully designed documents on their desktops. We could have developed a powerful document structure language so that browsers could automatically do intelligent things with Web documents. What we have got with HTML is ugly documents without formatting or structural information. Philip Greenspun: We Have Chosen Shame and Will Get War

*

Some of the patterns books I have read stress the language-independence of the patterns. This turns out to mean “hey, we need it for Java too!” Richard A. O'Keefe in squeak-dev mailing list, November 2002

*

It's hard to make things fool-proof because fools are so ingenious. I remember someone telling me about a statistics package they had written which would only let you do meaningful regressions. Instead of being glad to have errors caught before it was too late, their users had discovered that they could save data to a file and load it back in as if it was completely new data, thereby tricking the program into accepting calculations which made no sense. Richard A. O'Keefe in squeak-dev mailing list, December 2002

*

Aspiring computer scientists should know that they will not be glorified toaster repairmen, but part of a grand intellectual tradition stretching back to Euclid, Leibniz, and Gauss. They should know that, in Dijkstra's words, “computer science is no more about computers than astronomy is about telescopes”—that the quarry we're after is not a slightly-faster C compiler but a deeper understanding of life, mind, mathematics, and the physical world. Scott Aaronson: Teaching Statement [.pdf]

*

You are crazy, but that's not important. The only thing that matters is whether or not you do anything. Do anything, and you matter. Olin Shivers in comp.lang.scheme, April 2000

*

If people are sane, if the world keeps moving forward, if we intellectuals keep fulfilling our duty of education, then good things will spread, and happiness will rain on earth. Xah Lee in comp.lang.lisp, July 2000

Edsger W. Dijkstra was the intransigent champion of calculational methods and academic independence, who spurned operational and analogical thinking in programming, and real-world concerns in research and education. His classic Discipline of Programming proposes a model programming language and a treatment of semantics tailored to support the derivation of both (small) programs and proofs of their correctness, the proof guiding the programming choices. (See Cohen's Programming in the 1990s for a cute textbook introducing the “calculation of programs”.) He entertained an unshakable belief in formalization, in calculi applied as tools rather than investigated as subject matter of science. Apparently he didn't want to understand the semantic side of mathematical logic (model theory, initiated in the 1930s and 40s by Tarski and Carnap):

I get completely confused as soon as logicians drag “models” and “interpretations” into the picture. I thought that the whole purpose of the creation of a formal system in which one can calculate was to create something independent of any model or interpretation. (EWD 1227)

(Actually, he tends to blur, in his Discipline at least, the distinction between formulae defining relations and their interpretation as subsets of the state space, and when the author seems to present manipulations of formulae we may more straightforwardly read operations on sets of states.)
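The “calculation of programs” that the Discipline advocates is easy to illustrate with the assignment axiom, wp("x := E", R) = R with E substituted for x. The following worked instance is ours, offered as a minimal sketch rather than a quotation from the book:

```latex
% Assignment axiom: wp(x := E,\, R) \equiv R[x := E]
% Instance, for integer-valued x:
wp(\texttt{x := x + 1},\; x > 0)
  \;\equiv\; (x + 1 > 0)
  \;\equiv\; (x \ge 0)
```

Read predicatively, x ≥ 0 is just a formula; read set-theoretically, it is the set of all states from which the assignment is guaranteed to establish x > 0. For so simple a calculation the two readings coincide harmlessly.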

His infatuation with “calculating” may have had a whimsical twist to it, and some of his aversions are rather dated: the operational semantics he execrated has seen a beautiful —syntax-directed— rebirth in the last decades, simple calculi exhibiting core features of various programming languages. Nonetheless, the late Edsger W. Dijkstra is the patron saint of ratiocination in computing:

I mean, if 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself “Dijkstra would not have liked this”, well, that would be enough immortality for me. (EWD 1213)

*

If I lived back in the wild west days, instead of carrying a six-gun in my holster, I'd carry a soldering iron. That way, if some smart-aleck cowboy said something like “Hey, look. He's carrying a soldering iron!” and started laughing, and everybody else started laughing, I could just say, “That's right, it's a soldering iron. The soldering iron of justice.” Then everybody would get real quiet and ashamed, because they had made fun of the soldering iron of justice, and I could probably hit them up for a free drink. Deep Thoughts by Jack Handey

*

Well, I am not competent at all to give good answers to such questions, but such worries, some other doubts raised here about monads, etc. remind me a little one conference on geometrical methods in technical sciences, where one Top Fellow kept asking several times the same question: “How much money was produced last year by fibre bundles or differential forms?” Jerzy Karczmarczuk: comp.lang.functional, April 1999

*

The only paradigm worth following in programming is that of money. That is the only reason the vast majority of programmers got into it for. Well, that and all of the hot chicks...... SmashAndGrab @ The Daily WTF, 2007-02-07