Larry O'Brien talks to Grady Booch about the 15th anniversary of Design Patterns, the wicked problems of developing in the multicore era, what programming languages he's using now, and the best coffee.



Larry O'Brien: The year is 1994. Forrest Gump rules the box office, the Republicans take control of the legislature, and Britney Spears is just another Mouseketeer… What is happening in the software development world? What are people talking about and struggling with?

Grady: Well, in 1994, the Web was just in its infancy (see http://blogoscoped.com/archive/2007-07-11-n32.html for example). There was none of the dynamic content we take for granted today, no meaningful search mechanisms, and very few companies had any idea how to use this newfangled thing. Netscape and Yahoo had just been founded, OS/2 was released, Microsoft had just released Windows 3.11, PHP had its start, the Intel 486 was still very much in force, and the object-oriented method wars were just coming to a head (we hired Jim Rumbaugh in late 1994, and Ivar [Jacobson] came in the next year). People were struggling with adopting object-oriented languages, most notably C++ (though Ada was still in play), and there was a noticeable knee in the curve of software complexity. Waterfall vs. iterative methods was very much a strong topic of debate. My personal computer at the time, by the way, was a PowerMac 8100 with a whopping 16 MB of main memory.

In all, it was a really fun time to be in software, for so much was in motion and there was so much possibility.

Not unlike, I should add, today. Although we are in a period of economic scarcity, this is still very much a time of software abundance, with still so much in motion and still so much possibility.

Larry: Your book Object-Oriented Design with Applications had already gone through two editions, Ivar Jacobson’s use-case book had come out, Rumbaugh’s book had been out: A lot of people were talking about analysis and design. What did Design Patterns bring to the table that caused it to be received so well?

Grady: I have always described the history of software development as one of growing levels of abstraction, which we see manifest in our languages, our methods, our processes, our platforms. Design patterns were the next (now obvious) steps along that path. Kent Beck and I had sponsored a retreat in the summer of 1993 to bring together folks interested in this space (this was the genesis of the Hillside Group) and I knew then that being able to name societies of classes that collaborated was the Right Thing.

Larry: You initially became well known from your work with Ada, a language that people don’t discuss much today. Are there aspects of that language or programming system that you wish had a higher profile today?

Grady: You may be surprised to know that Ada is still alive; I'm actually engaged on a satellite project that's producing about a half million lines of new Ada (which is not uncommon for such satellite systems). In retrospect, Ada was ahead of its time, with packaging, generics, exception handling, and its tasking mechanisms—elements that continue to show up in contemporary practices. Given that the frequency scaling wars are over, we see the shift to multicore...and therein I think that Ada's tasking mechanisms are still very germane.
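[Editor's note: Ada's rendezvous-style tasking, which Grady mentions above, has loose analogues in mainstream languages. The following is an illustrative sketch in Java, not anything from the interview: a `SynchronousQueue` handoff resembles an Ada rendezvous in that the calling task and the accepting task both block until the exchange actually happens. The class and method names here are invented for the example.]

```java
import java.util.concurrent.SynchronousQueue;

// Illustrative sketch only: a SynchronousQueue handoff loosely resembles
// an Ada rendezvous, where the calling task and the accepting task both
// block until the exchange takes place.
public class RendezvousSketch {
    static String rendezvous() throws InterruptedException {
        SynchronousQueue<String> entry = new SynchronousQueue<>();
        StringBuilder received = new StringBuilder();

        Thread task = new Thread(() -> {
            try {
                // The "accept" side: blocks until a caller arrives.
                received.append(entry.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        task.start();

        // The "entry call" side: blocks until the task accepts.
        entry.put("hello");
        task.join();
        return received.toString();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("task received: " + rendezvous());
    }
}
```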

Larry: When comparing OODwA or Design Patterns/GoF with today’s design discussions, one thing that jumps out is that today there’s little discussion of class and component structures and relationships, while there’s a great emphasis on dynamic structures and behaviors. Is this a change in practice (e.g., OOP is dying out and people are moving towards functional approaches) or just a change in dialogue (e.g., OOP and patterns are so ingrained now that they needn’t be emphasized)?

Grady: I think the change you see is because the notion of a class as a fundamental abstraction is so fully a part of the DNA of contemporary development that it's just taken for granted. Remember, when those works came out, the idea of an object as an abstraction was a profoundly disturbing and startling idea to many. However, this is not to say that these kinds of abstractions are any easier today, for virtually all the dominant industrial programming languages have classes at their core (including scripting languages such as Ruby, PHP, and JavaScript). The fundaments of building crisp abstractions with a clear separation of concerns and a balanced distribution of responsibilities are as much an issue today as they were then.

Larry: You began advocating a “4+1” view of architecture before the Web became the dominant issue in software development that it is today. For those developing for the Web, who know that they’re going to be dealing with JavaScript, AJAX, and browser compatibility and who know that they will have a relatively slow transport layer, a server-side domain component, a back-end database, etc., aren’t the architectural elements of such a system established?

Grady: The technology elements of their platform are established, but not their architecture. This is not unlike saying that an artist who works in clay has their domain fully architected just as much as an artist who works in oils. One of the myths that Philippe Kruchten notes (and btw, it was he who coined the 4+1 model view, not me) is that “my technology x” is my architecture. What you say is mostly true about developing in the domain of the Web, but there's a whole lot of architecture yet to be done.... Those things you mention are just the context within which one architects. In fact, this is a good thing. An artist or a writer faced with a completely blank slate is often less innovative than one who is somehow constrained. It just so happens that the Web has come together so serendipitously that we have a veritable rich primordial soup of stuff from which new life forms are still appearing.

Larry: So even when you're using a framework like Spring or ASP.NET MVC or Rails, it's still important to address architecture? What sorts of architectural questions do these "opinionated" frameworks leave open?

Grady: Those things are just part of the plumbing...one still has architectural decisions to make about things such as what the solution domain model is, the texture of the business rules that change the state of that model, and cross-cutting concerns such as security and parallelism.

Larry: One view holds that diagrams should be very semantically meaningful: that one ought to be able to look at, say, whether a diamond is filled and know something very precise about the code. Another view holds that diagrams ought to be low-fidelity and disposable (boxes and arrows on a napkin), because it’s the textual codebase that has the final say. What’s your view?

Grady: I just spoke of this very notion at the Models 2009 conference. When Jim, Ivar, and I began our journey that became manifest in the UML, we never intended it to become a programming language. I think that there's a fairly narrow domain for which model-driven development makes sense (and Ericsson is the classic example of value, for they use the UML deeply in the creation of all their cell base station equipment) but that we should return to the roots of the UML, which was to be a language for visualizing, specifying, constructing, and documenting the artifacts of a software-intensive system—in short, a graphical language to help reason about the design of a system as it unfolds. Most diagrams should be thrown away, but there are a few that should be preserved, and in all, one should only use a graphical notation for those things that cannot easily be reasoned about in code.

As I've also often said, the code is the truth, but it is not the whole truth, and there are things such as rationale, cross-cutting concerns, and patterns that cannot easily be recovered or seen from code.... These are the things for which a graphical notation adds value, and any such notation should be used only if it has predictive power or reasoning power (meaning, you can ask questions about it).

Larry: The idea of a repository that can express a programmatic idea in a diagram or in compilable text is an old idea: Rational was working on it more than a decade ago. This year has seen some public showings of Charles Simonyi’s Intentional Workbench, which comes at the idea from a different angle. Have you seen that product and have a reaction to it?

Grady: Yes (I have seen the product) and yes (I have a reaction to it).

:-)

Actually, I've interacted with Charles since he formed Intentional (and as an aside, conducted an oral history of Charles for the Computer History Museum). I think he's got some innovative ideas.

Larry: Joel Spolsky said:

“Sometimes smart thinkers just don't know when to stop, and they create these absurd, all-encompassing, high-level pictures of the universe that are all good and fine, but don't actually mean anything at all. These are the people I call Architecture Astronauts. It's very hard to get them to write code or design programs, because they won't stop thinking about Architecture.”

He also said:

“Sometimes, you’re on a team, and you’re busy banging out the code, and somebody comes up to your desk, coffee mug in hand, and starts rattling on…And your eyes are swimming, and you have no friggin’ idea what this frigtard is talking about…and it’s going to crash like crazy and you’re going to get paged at night to come in and try to figure it out because he’ll be at some goddamn “Design Patterns” meetup.”

Spolsky seems to represent a real constituency that is not just dismissive but outright hostile to software development approaches that are not code-centric. What do you say to people who are skeptical about the value of work products that don’t compile?

Grady: You may be surprised to hear that I'm firmly in Joel's camp. The most important artifact any development team produces is raw, running, naked code. Everything else is secondary or tertiary. However, that is not to say that these other things are inconsequential. Rather, our models, our processes, our design patterns help one to build the right thing at the right time for the right stakeholders.

Yet, while code is king, one must realize that it is also a servant, for it in the end must serve some constituency, deliver some measurable value. Just as I loathe architecture astronauts—people who have no skin in the game, people who are so divorced from the reality of executables that they melt at the sight of a line of code—I also loathe code bigots who are so blinded by their own prowess and tools that they lose sight of why or for whom they are toiling. Design for design's sake is meaningless; code for code's sake may be fun but it is also meaningless.

Recognize also that there are very real tensions between doing the right thing in the short term and doing the right thing for the long term. Code centricity tends to draw you to the former; architectural centricity tends to draw you to the latter, and honestly, neither pole is correct; rather, it is the dance between the two for which a particular team with a specific culture working in a given domain must find balance.

Larry: How big a deal for software development is the manycore era? Will it change the way we approach architecture and design?

Grady: I've often said this before as well: the average developer is not well prepared to develop concurrent, distributed, secure software. These are all really wicked problems. Our languages have few really good primitives for dealing with the intimate concurrency that multicore processors demand, and thus we've got a bit of a conundrum. My take is that we need advances in languages, in compilers, in patterns (such as Intel's concurrency patterns), and in platforms (such as Apple's Grand Central Dispatch) to raise the level of abstraction.

Larry: What programming languages and technologies are you enjoying right now?

Grady: Yes, I still program, and I use mainly Java and PHP. Eclipse is my development platform of choice.

Larry: You don’t really think that Maui coffee comes anywhere near approaching the subtle splendor of 100% Kona coffee, do you?

Grady: It's not even close; I am addicted to Kona coffee.