The Agile Manifesto is an immune response on the part of programmers to bad management. The document is an expression of trauma, and its intellectual descendants continue to carry this baggage. While the Agile era has brought about remarkable advancements in project management techniques and development tools, it remains a tactical, technical, and ultimately reactionary movement. As long as Agile remains in this position it will be liable to backfire, vulnerable to the very depredations of bad management it initially evolved to counter.

In order to begin to heal, it is important to place Agile in a wider context. Many of the substantive ideas associated with Agile predate it by 20 to 30 years. This is not an accusation of plagiarism; rather, it is an assertion that there are idiosyncrasies of software development that remain invariant even as technique and technology improve, and so you are bound to recapitulate these patterns eventually. The patterns include, but are not necessarily limited to:

Incremental development

Software is created in modules: just as books are composed of chapters, which are composed of sections, paragraphs, and sentences, which themselves—in code as in text—are laid out word by word. Basic structures cannot be composed into larger structures unless they themselves are internally consistent, and so the creation of software, just like any artifact of language, is naturally incremental.

Iterative development

Not to be confused with incremental development. Software is a very verbose, very precise incantation of how an information system ought to behave. The entire process reduces to gaining information about the information system and gelling it into formal language—a process that will never naturally terminate as long as you continue to provide it with resources: you can always go back and refine what you have written, and the need for some revisitation over time is the rule.
Cross-functional teams

Because software development reduces to gathering and concentrating information, it is obvious that this information will come from various sources, and that the people responsible for said gathering and concentrating will have different expertise. Also, programming itself is a quasi-solipsistic activity: a programmer requires, strictly speaking, no more collaboration than does a novelist or painter. Because software development is naturally incremental, people can work on different parts of a system in parallel. This naturally lends itself to variegated expertise among programmers themselves.

Fixed time, variable scope

Just like content development in any other medium, software development reduces to the elimination of entropy. Unlike other media, however, there is very little else to count. The problem is that you and/or your team can only eliminate a fixed-ish number of bits of entropy per unit time, and at the outset you don't know how many bits there are in the problem to eliminate. So you say: we're gonna work this much, and whatever comes out the other end of that process is whatever you get. In the current theory, and sometimes in practice, this is how sprints are supposed to work, and the material that doesn't fit is accumulated into some backlog or other.

User involvement

Since the purpose of software is putatively to serve users, it should not be surprising that users constitute an essential source of information, up to and including their interactions with prototypes and as paying customers of what has come to be called a minimum viable product.

To the extent that trauma is by definition something you don't get over, and only eventually come to understand, I submit that the Agile-o-sphere continually relitigates these issues at the expense of solving other, potentially more fruitful problems.
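The fixed-time, variable-scope trade reduces to simple arithmetic. A minimal sketch, with wholly hypothetical numbers for team capacity and problem size:

```python
# Toy model of fixed time, variable scope: you commit to a number of
# time boxes; scope is whatever fits, and the rest lands in the backlog.
def time_boxed(sprints: int, bits_per_sprint: float, problem_bits: float):
    """Return (bits of entropy eliminated, bits left over in the backlog)."""
    done = min(sprints * bits_per_sprint, problem_bits)
    return done, problem_bits - done

# Three sprints at roughly 30 bits each, against a 100-bit problem:
done, backlog = time_boxed(3, 30.0, 100.0)
print(done, backlog)  # 90.0 10.0 -- the deadline held; the scope gave
```

The point of the model is which variable absorbs the uncertainty: the time box is fixed in advance, so the leftover scope is whatever it turns out to be.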

What We Hear Suspiciously Much About

Collaboration!

So much effort goes into writing and talking about collaboration, and into creating tools to facilitate collaboration and telecollaboration, with the tacit assumption that more collaboration is always better. To quote Frederick Brooks, the more collaboration the better is far from a self-evident proposition, and certainly not universally true. True indeed, to the extent that collaboration divides labour, but questionable as a fraction of one's activity. Since communication overhead increases in proportion to the square of the number of people on the team—a fact illuminated by Brooks in the 1970s—what you actually want is as little collaboration as you can get away with.

One way all this lavish attention to collaboration makes sense is that your team dynamic is something you can always affect, even if you have no meaningful influence over the major strategic decisions of an organization.
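Brooks's quadratic can be checked on the back of an envelope: a team of n people has n(n-1)/2 potential pairwise communication channels. A quick illustration:

```python
# Pairwise communication channels in a team of n people: n(n-1)/2.
def channels(n: int) -> int:
    return n * (n - 1) // 2

for n in (2, 5, 10, 20):
    print(n, channels(n))
# 2 -> 1, 5 -> 10, 10 -> 45, 20 -> 190: doubling the team from 10
# to 20 people more than quadruples the coordination overhead.
```

Hence the advice to minimize collaboration: head count grows linearly, but the channels you have to keep warm grow quadratically.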

What We Don't Hear A Lot About

I'm sure these things exist in some form or other; the point is they don't get nearly as much attention.

Anything higher up the chain than project management

In 20 years, Agile has given us standup meetings, sprints, pair programming, user stories, DevOps, and continuous integration. This is all tactical stuff. Contracting gets a little attention, but nothing like the attention heaped onto project management on down. Where are the people thinking and talking about Agile procurement, for example? Or Agile enterprise resource planning? This is important, because failures to implement Agile principles are often easily diagnosed as side effects of the constraints of antiquated procurement and contracting practices.

A specificity gradient

Software is unprecedented in its low cost of development—when compared to hardware. Code, however, is arguably the most expensive medium for expressing ideas. The popular meme of maturing a product from skateboard, to bike, to motorcycle, to car is a cute story, but the way software tends to actually be made is more like going from engine, to drivetrain, to monocoque, to interior. Except software isn't like a car at all: if anything it's more like a university campus, where different buildings are complete artifacts in their own right but couple together loosely to form a unified service. It is perfectly reasonable for some parts to be under construction while others are still being planned. Taken as a whole at any given moment, some parts of the system will have more detail and others will have less. Our notions of iteration and incrementality therefore have to make room for media other than code.

Conceptual friggin' integrity

The one idea from the 1970s most conspicuously absent from Agile discourse is conceptual integrity. This—another contribution from Brooks—is roughly the state of having a unified mental model of both the project and the user, shared among all members of the team.
Conceptual integrity makes the product both easier to develop and easier to use, because this integrity is communicated, through the product, to both the development team and the user. Without conceptual integrity, Brooks said, there will be as many mental models as there are people on the team. This state of affairs requires somebody to have the final say on strategic decisions. It furthermore requires that person to have diverse enough expertise to mentally circumscribe—and thus have a vision for—the entire project in every way that matters, even if not precisely down to the last line of code.

Agile vs. Waterfall is Kayfabe

Any Agile adherent can tell you that Waterfall is bad, and that if you aren't doing Agile, you must necessarily be doing Waterfall. It is clear, however, that comparatively few have actually read the 1970 Waterfall paper. If they had, they would learn that what they had experienced—or more likely, heard about—was what the paper's author asserted was the wrong way to manage a software project. One may even be led to interrogate just why it is that such a known-bad practice is what developers ended up using for so many years.

There is the story that so-called big design up front—another Agile shibboleth—was more of a necessity in the past because:

available computing resources were a minuscule fraction of what they are today, and thus required more planning,

programming was just plain slower, and

distribution logistics—shipping on physical tapes, floppy disks, ROM cartridges, CDs—put a cap on the release cycle.

There is furthermore the argument that the product in question must get to market as quickly as possible to secure the first-mover advantage, or perhaps, in its gentler form, that the idea has to be tested in the market right away to make sure it's something people want.

None of these statements is false, but they aren't the whole truth either. In particular, not all software is a Web app. Indeed, a sizable fraction of software is not a consumer product, or not even a product at all. These arguments discount the volume of, for instance, embedded systems and one-off infrastructure. The former have a release cycle pinned to their host hardware; the latter you can update whenever it's sufficiently convenient. Both categories of software existed 50 years ago as they do today, and will continue to exist into the foreseeable future: in 2020 there are plenty of systems that are effectively un-updateable; in 1970 there were plenty of systems where you could inject new code straight into a running program.

The story that things were slow and now they're fast—technology and market forces alike—is not sufficient to explain the pervasive adoption of what has come to be called Waterfall. The story is especially ill-fitting considering that preeminent figures in software development had been advocating incremental and iterative development with feedback from users all along, up to and including in the original 1970 Waterfall paper itself. I want you to consider instead the possibility that Waterfall came to exist, and continues to exist, for the convenience of managers: people whose methods are inherited from military and civil engineering, and who, more than anything else, need you to promise them something specific, and then deliver exactly what you promised, when you promised you'd deliver it.
There exists many a corner office whose occupant, if forced to choose, will take an absence of surprises over a substantive outcome. This alternative theory would further explain the RFPs that go out insisting on Agile processes while also demanding a formal specification up front, with a detailed prescription of feature milestones—or as they like to call them, phases—and their concomitant deadlines.

My final remark on this subject concerns the rhetorical framing of design as something inherently belonging to Waterfall, and thus bad: an impediment to shipping product. This is little more than a goad, perpetrated not so much by professional managers as by anxious startup founders worried about their money running out before they start earning any. Software development, again, is about answering thousands of questions. Some of those questions can only be answered with code; others can never be. There is nothing preventing these operations from being carried out in parallel, except when one genuinely depends on the completion of the other.

Goodhart's Curse

A feature is a unit of programmer output that corresponds roughly to a subroutine. Features work as a management control because they manifest in the final product, and their relationship to programmer-hours is about as close to linear as you're going to get. Features also work for marketing. Since nobody can deny a feature exists, a convenient tactic is just to count them: Over 100 new features in our latest release! Likewise, any time the messaging reads something like our product lets you…, they're talking about features. Indeed, marketing departments everywhere would be adrift without the venerable feature to fall back on.

Features don't work, however, in the sense that they can be easily gamed. A brittle and perfunctory implementation, done quickly, is going to score more intramural brownie points than a robust and complete one. If the question is does product A have feature X?, then the answer is yes either way. This also makes features a spurious basis for comparing competing products, because you need to examine them seriously to determine the extent to which they are any good. We use the term feature factory as a pejorative for companies addicted to adding features while accumulating incalculable so-called technical debt. This situation is driven by management for the convenience of marketing, and I am skeptical that a more faithful application of Agile principles will correct it. Indeed, I suspect Agile processes are constitutionally vulnerable to this kind of compromise.

There is, however, another objectively countable phenomenon associated with software development, and that is behaviour. The question does the product do X, optionally under condition Y? is also an empirical question with a yes-or-no answer. Programmers should be familiar with this pattern; it is exactly how test suites are written.
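That pattern translates directly into code. A minimal sketch, using a hypothetical `checkout` function as the product under test; the names and the discount rule are invented for illustration:

```python
# Hypothetical product code: a checkout that applies a discount
# once a member's cart total crosses a threshold.
def checkout(total: float, member: bool = False) -> float:
    """Return the amount charged for a cart totalling `total`."""
    if member and total >= 100.0:
        return round(total * 0.9, 2)  # members get 10% off large orders
    return total

# Each test asserts a behaviour: "does the product do X under condition Y?"
def test_member_discount_applies_over_threshold():
    assert checkout(120.0, member=True) == 108.0

def test_no_discount_below_threshold():
    assert checkout(80.0, member=True) == 80.0

def test_no_discount_for_non_members():
    assert checkout(120.0, member=False) == 120.0
```

Note that nothing here counts features; every assertion is a yes-or-no question about what the running program does under a stated condition.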
Behaviour has an advantage over features in that you can describe any feature in terms of behaviour, but you can't describe behaviour in terms of features. This is because features are visible while the software is sitting still, whereas behaviour is only visible while the software is running. Moreover, the presence of a feature can only indicate to a user that a goal is possible; behaviour determines how painful the goal will be to achieve.

The final advantage of behaviour that I will discuss here is that it blurs the line between fixing bugs and building features, and coalesces the two into a unitary process of sculpting behaviour. The feature count may not go up as quickly, but the product's behaviour will increase steadily in subtlety and fit. My bet is also that behaviour is less amenable to gaming, because any attempt is going to look really obvious and silly. The marketing department can fend for themselves, mining the corpus for identifiable features; they pretty much do that already anyway.