The poop has hit the propeller for Microsoft over the past few days regarding its Guidelines for Test-Driven Development article published on its Microsoft Developer Network (MSDN) portal.

Test-Driven Development (TDD) is a well-defined software design method that focuses on facilitating testability through loose coupling. The loose coupling arrived at by TDD fosters better class factoring and subsequently enables vastly improved opportunities for reuse harvesting. In addition, Test-Driven Development teaches software developers who are relatively weak in object-oriented (OO) programming and design how to use OO techniques and provides clear guidance for the employment of design patterns. I can’t imagine another method that offers as much to software developers as TDD does in terms of effective design, extraordinary increases in software quality, and OO learning.

On Test-Driven Development and Agile, Microsoft seems to have lost its way, lost its courage, and lost its ability to innovate. By missing the Agile boat, Microsoft is missing a key opportunity to transition millions of legacy developers to the .NET platform through the enhanced OO learning offered to developers who engage in Agile development practices. Microsoft is missing even more opportunities to rapidly adapt its software development tools to emerging requirements in contemporary software development because Microsoft isn’t yet harnessing the power of the adaptive design and delivery enjoyed by those developers and shops that engage in Agile software development.

Faced with a critical decision of innovation over stagnation, Microsoft has cowered into the old familiar waterfall approaches that ensure that its customers are capable only of repeatedly investing money hand over fist into efforts that produce utterly disposable software. The practice guidance for Visual Studio 2005 reeks of outdated software development practices, and the Guidelines for TDD article is a shining example of how smelly things can really get.

A clear distinction needs to be drawn between mere test-first programming and Test-Driven Development. This is a point that is largely lost on most developers new to TDD – and by “developers new to TDD” I mean those for whom the TDD lights still haven’t come on. For me this period of darkness lasted for over a year after I had read Kent Beck’s book on the subject. For over a year, I had assumed that I knew what TDD is and I had assumed that I was practicing it. Imagine my surprise when I realized that there was more – much more – to TDD than I had assumed.

When I read the main Test-Driven Development forum on Yahoo Groups, I found a lot of talk about a lot of esoteric software stuff that was way over my head. Originally, I thought that all this esoteric stuff applied only to Java developers, since at the time most of the TDD people were Java developers, along with a smattering of people using C++. Over time, it became apparent to me that these esoteric issues in software design were essential aspects of object-oriented design and are equally applicable to .NET, and it was through TDD that I learned that I had totally overestimated my knowledge of OO.

It was also through TDD practice that I learned one heck of a lot about OO, design patterns, component design, and application design and architecture. TDD practice puts learning in high gear and keeps it there. Of course, if you don’t think that learning is part of your job as a software developer, then TDD is probably not for you. And that may be exactly the message Microsoft is sending, aimed at the kind of software people that far too many .NET developers have devolved into.

Rather than shape up Visual Studio to be amenable to Test-Driven Development, and in its ignorance of the nature of Test-Driven Development, Microsoft’s marketing people have simply tried to pull a snow job on the developer community, getting it to swallow a wholly different definition of Test-Driven Development. And the very text outlining Microsoft’s understanding of Test-Driven Development betrays its almost complete ignorance of why Test-Driven Development is effective and why it is practiced in the way that it is.

So, this is for the folks at MSDN. Listen up… you’ve already embarrassed yourself once, and you’ve embarrassed your insider and MVP communities by association this time as well. So here’s more of the skinny on TDD:

TDD is a software design methodology. It’s not a software testing methodology. As such, a TDD guidelines document doesn’t belong in the MSDN content tree for VSTS for Testers – it’s a topic for developers. TDD arrives at its design values by considering client code first, before considering server code. It uses tests to act as clients because it’s vastly cheaper and faster to code a test method than to build an entire visual application to incrementally implement and validate a design intention. Because tests are used as client code in this process of client-driven design, we arrive at the end of any given development cycle with tests in hand.

And in using tests as clients, we learn a lot more about the demands of testability, and about how to factor our class libraries so that internals that would traditionally be hidden way down in the dark, buried branches of code are surfaced through design patterns and exposed directly to tests. This in turn leads to simpler test classes and thus to much easier maintenance. The smaller class factoring and the structural decoupling of these smaller classes via interfaces (and sometimes abstract classes) allow these assets to be more easily harvested for reuse. And the emergence of the justification for OO practices like polymorphism and adaptation demonstrates to even the most novice OO developer where and why to use design patterns, and which design patterns to use.

Oh yeah, and because we base our tests on well-known test frameworks like those in the xUnit family (including the one in VSTS), we also get to use our design scaffolding as a regression test suite, further ensuring the sustainability of our codebases.
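The test-as-client idea is easier to see in code. Here’s a minimal sketch – in Python with the standard unittest framework rather than C# and NUnit, purely for brevity, and with every name invented for illustration. The test, acting as the first client, forces the calculator’s rate lookup out of the buried branches of the code and up to the surface, behind a seam the test can reach directly:

```python
import unittest

class FixedRateSource:
    """Hand-rolled fake standing in for a production rate lookup.

    It exists because the test, as a client, demanded a seam: the
    calculator can no longer reach into a database buried deep in
    the call tree.
    """
    def __init__(self, rate):
        self.rate = rate

    def get_rate(self, region):
        return self.rate

class TariffCalculator:
    def __init__(self, rate_source):
        # The dependency is surfaced and injected, not hidden inside.
        self.rate_source = rate_source

    def charge(self, region, units):
        return self.rate_source.get_rate(region) * units

class TariffCalculatorTests(unittest.TestCase):
    def test_charge_is_rate_times_units(self):
        calc = TariffCalculator(FixedRateSource(rate=3))
        self.assertEqual(calc.charge("TX", units=10), 30)
```

The same shape falls out in C# with NUnit: the fake and the injected dependency exist only because the test demanded them, and that seam is exactly the kind of decoupled factoring that makes the class harvestable for reuse.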

Testing is a side effect. Design is the goal of TDD; design for testability flows naturally from there, followed by decoupling and reusability. TDD is a king among contemporary software design practices. It delivers on so many measures of software development goodness.

You simply cannot achieve the aims of Test-Driven Development if you start with sweeping assumptions of the structure of classes before you set down the kind of code that incrementally validates your assumptions.

If a developer leaps to conclusions about the design of a class or family of classes, he will have missed the optimal design of the class. As the implementation of a class deviates from its inherent optimal design, implementation and maintenance costs for that class increase exponentially as the effort to cajole misshapen classes into a system of misshapen classes grows. This increase in cost and effort is compounded by the number of classes in a system. Most codebases can’t sustain this pressure for long, and they are often disposed of and rewritten entirely from scratch – using the exact same development methodologies that caused them to degenerate into entropy to begin with.

In TDD, you start with an understanding of what you need because scenario-based customer specification through stories or use cases tells you what you need. You write down a set of things that the software should do to satisfy a story, and you have the cojones to admit to yourself that these things you have written down are purely conjecture until you get them into code. We know that these things are conjecture because every developer learns about what he is coding from the first few lines of code he writes. As he writes more code, more is learned, and previous assumptions are either ratified or disqualified and the code is morphed to address new understandings. We don’t waste time on detailed design using any tool other than code because code is ultimately the only thing that can validate detailed design. Committing to the pursuit of detailed design in anything other than code leads to waste. Those design artifacts are ultimately invalidated by code, and the artifacts are abandoned without ever having delivered sufficient ROI.
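A single cycle of this conjecture-and-validation loop might look like the following sketch (again with Python and unittest standing in for C# and NUnit; the story and every name are invented for illustration):

```python
import unittest

# The story: "an empty cart totals zero; adding an item raises the total."
# Each test below records one conjecture from that story. The code
# underneath exists only because a test demanded it; any conjecture the
# code disproves gets rewritten rather than defended.

class Cart:
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)

class CartTests(unittest.TestCase):
    def test_empty_cart_totals_zero(self):
        self.assertEqual(Cart().total(), 0)

    def test_adding_an_item_raises_the_total(self):
        cart = Cart()
        cart.add(5)
        self.assertEqual(cart.total(), 5)
```

Each passing test ratifies one written-down assumption; a failing one disqualifies it immediately, while the cost of changing course is still one method, not one subsystem.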

To quote Mary Poppendieck, “Design is a knowledge-generating process,” and because knowledge is continually generated during coding, then coding, as Jack Reeves pointed out long ago, is a design activity.

Coders are making design decisions with every method they implement. They do design whether they use Test-Driven Development approaches or not. Evolutionary design is the way that software is created. It’s endemic to the medium. Software development is unlike any other kind of construction activity in this regard; it is, instead, like most forms of product development.

TDD simply supports the reality of software creation with a process and a set of techniques that are wholly optimized for the reality of creating software. Visual Studio on the other hand, is optimized for the reality that customers are still willing to believe in material solutions to problems that are essentially intellectual in nature, and Microsoft marketing folks are right there to sell snake oil to the gullible.

Should Microsoft be held responsible for selling lack-luster products into a market that is snugly snoozing in a profound yet comforting sleep of epistemological ignorance? It damn well should be held responsible if it is not only going to sell software development tools, but also prop itself up on a high horse and hold itself out to the community as a provider of software development practice guidance. If Microsoft cannot be trusted to do both, and to do so with a deep commitment to truth in advertising, then it needs to return to being merely a manufacturing company.

Ironically, the Agile development approaches that Microsoft so obviously eschews would have provided the company with the techniques that would have enabled it to provide support for Test-Driven Development in its own software development tools. Without the influence of Agile development techniques, Microsoft is doomed to continue to engage in building massive, monolithic, tightly-coupled products like Visual Studio that cannot be quickly adapted to meet emerging market pressures.

In the case of Test-Driven Development, the unit testing tools available in VSTS fall short of the mark, leaving the NUnit framework and Jamie Cansdale’s TestDriven.NET plugin for Visual Studio 2003 and 2005 to rule the roost as the leaders in tooling for Test-Driven Development in Visual Studio. That’s not to say that Test-Driven Development isn’t an interest for Microsoft. However, Microsoft failed to identify the need to support TDD early enough in its Visual Studio 2005 product development cycle, and it lacked the agility to adapt to that requirement once it became aware of it. Together, these failures have left Microsoft little recourse at this late date.

In fact, members of Microsoft’s insider community repeatedly asked for support for Test-Driven Development in Visual Studio 2005, and Microsoft was repeatedly and publicly criticized for ignoring TDD requirements throughout the long beta and release-candidate phases leading up to the RTM release of Visual Studio 2005 in early November. Microsoft is so utterly un-agile that it isn’t even able to respond to customer requirements that it had known about for a year and a half. And this is a company that purports to provide Agile development guidance to the software development community at large?

Ultimately, Microsoft caught on to the need to have a Test-Driven Development story for Visual Studio 2005 when it was already too late to respond to the requirements. Well, too late for waterfall, anyway. So, at some point in recent history, Microsoft marketing folks made a bet that if they simply redefined Test-Driven Development so that it matched the existing tooling provided by Visual Studio 2005, they could claim support for contemporary software development techniques and approaches without actually having to build that support into their tools.

Going with the assumption that the software development community at large is as ignorant and out of touch about contemporary software development as the folks in Microsoft developer tools marketing, they simply posted the atrocious “Guidelines for Test-Driven Development” on MSDN.

The guidance provided in this article fundamentally contradicts some of the most essential and elemental practices and goals of TDD. If this guidance were put into practice, the practitioners would never glean the benefits promised by TDD. Because the guidance commits to neither waterfall nor TDD, its fragmented nature all but guarantees that a practitioner of these guidelines will be doomed to software hell.

In waterfall-style development, a developer starts with sweeping assumptions about a class’s design and might capture them in a visual documentation tool like a class designer. If the developer were test-oriented (not Test-Driven mind you, but test-oriented), he might then code some test stubs. If he were using a powerful tool, the tool might generate the tests based on the code generated from the class design.

However, in doing so he would lock in his design at a point that is far too early in the development process. Wrapping so much test harness around so many non-validated design assumptions would likely cause those design assumptions to be fixed into place. Even if the developer had wanted to incorporate new knowledge into the design, there would be so many dependencies at such an early stage in the development that he might just as soon live with the junk he had just created.

This fixing of design assumptions in place too early in the process is one of the weaknesses of waterfall that TDD addresses quite effectively by providing practices that reinforce a very granular approach to incremental design and implementation. The approach is granular enough to allow understandings about the code under development to be taken into consideration immediately when these understandings emerge.

Microsoft’s guidelines specifically recommend an approach that ensures that the failures of traditional software development will be upheld going forward into the age of VSTS:

Step 6. Define the interfaces and classes for your feature or requirement. You can add a minimum of code, just enough to compile. Consider using the Class Designer to follow this step.

Step 7. Generate tests from your interfaces and classes.

A more ignorant approach to TDD would have been hard to come up with. These steps in Microsoft’s guidance describe nothing short of an anti-TDD process, and yet the document has the nerve to package them up as factual TDD guidance. It’s truly a shameful example of the worst product marketing behavior.

Microsoft has so overestimated its own understanding of TDD and the developer community that it has left itself open to criticism from some of the most notable figures in Agile development.

Robert Martin has this to say:

Microsoft should pull these guidelines and try to figure out some way that the tool might actually support TDD. Redefining TDD as some kind of waterfall, just because their tool supports it that way, is not very helpful to the industry; and most programmers are all too clearly aware of this. If Microsoft would like to win the hearts and minds of more developers, they’d better try to figure out where the industry is actually going, rather than trying to drag the industry into its tool. IMHO they should look very carefully at Eclipse, and IntelliJ — especially IntelliJ. Bottom line: We don’t need tools that help us do waterfall better. We don’t need vacuous process guidelines with 14 linear steps describing something that is not TDD while claiming otherwise. We don’t need to be force-fed Visual Studio. What we need from Microsoft is for them to stop talking and listen.

And from Michael Feathers:

The style of TDD described in the guidelines would have us jump ahead and write five, ten, maybe twenty test cases before getting the first one to pass. You can do that, but it’s like putting on a set of blinders. When you don’t have to formulate the next test case right after passing the last one, there isn’t much chance to think about improvements. Worse, there is a disincentive to thinking about them: if you find any, you have to delete all of the speculation you’ve hardcoded in the tests and interfaces you created in advance. Right now, I hope that MS revisits their guidelines. Yes, they have a tool that generates test stubs, but using it for TDD looks counter-productive. Adding stubs to legacy code? That would be great, but getting in the way of the feedback cycle? Bad. Bad. Bad.

From Ron Jeffries:

The original article, at msdn2.microsoft.com/en-us/library/ms182521.aspx, describes TDD as a process where you get all your requirements understood, then define all your tests, then design an object’s full interface, implement the tests, and make them work. It then goes on to explain how all these cool features in VSTS help with this “TDD” process. The features might be good, but the process described is not TDD, nor a reasonable variation thereof. Test-Driven Development was clearly defined by Kent Beck, and has been described by others, such as Dave Astels, and even myself. It is a process where the tests are written one at a time (though one might make note of some possible tests for the future), and the tests are used to help define the design and develop the code. The Microsoft version of TDD is indistinguishable from a single-object waterfall model, to a first approximation. Someone more paranoid than I am might conclude that Microsoft is intentionally trying to co-opt a perfectly good practice, pervert it, and tie it into mandatory use of their tools, which are, at this writing, about half as good as what’s available in Open Source and commercial plug-ins for Visual Studio. I prefer to assume that it’s just ignorance on their part, but I’m prepared to change my mind. The article is shameful. If it isn’t intentionally malicious, it is at least ignorant and ill-conceived. It should be taken down and replaced with something that correctly reflects industry usage of the TDD terminology.

From Sam Gentile:

The Microsoft Guidelines for TDD are creating quite a storm of protest from people in the Agile community for very good reason. They don’t at all describe Test-Driven Development (TDD). They got it all wrong.

From Jeremy Miller:

No, no, no. Autogeneration of the unit tests from interfaces or classes might sound nice, but it isn’t really correct and might not be all that useful. It’s also not Test Driven Development. TDD is using unit tests to drive the design of the code. One of the “crossing the Rubicon” moments in learning TDD is when you learn to define the interface of a class inside the unit test. That’s right, use the unit test to define the intended function of a new piece of code, and then write the code to make the unit test compile and pass (ReSharper will very happily create the method stub for you). I still do some paper and pencil work before coding, but by and large I’ve found that I have better results when you are determining a class’s signature while writing the unit tests. Writing unit tests first will inevitably simplify the usage of a class (because you’ll want to minimize the mechanics of creating the unit tests). I certainly don’t think that you throw away traditional design techniques when you do TDD, but you can’t ever let the code and design get very far ahead of the unit tests. Another thing to keep in mind with the autogeneration of unit tests is that unit tests do not map one to one with methods and properties. You don’t test methods one by one. You test logical pathways through the code. One of the key things to success in TDD is to very deliberately design for testability. The best way to accomplish testability is to write code “Test First”. If you design and create a lot of code before you switch to retroactively applying unit tests you’ll often find that the unit tests are hard to write. Also, you will probably not achieve the same level of code coverage that you would if you code test first. Actually, let me put this more strongly – retrofitting existing code with unit tests is hard. It’s perfectly acceptable to spend some time contemplating the design before coding, but do write the unit tests first. 
TDD will go so much better when you write the tests first. My single biggest irritation with Microsoft’s guidance to .Net developers and their development tools is Microsoft often seems to ignore anything outside of the Redmond campus. The advice they’ve given is contrary to existing best practices for TDD that have resulted from experience. Patterns and Practices team, are you out there? I know you guys are using Agile methods because I read your blogs. Do y’all really buy off on the MS TDD “recommendations?” Can you do something about this?
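Miller’s “crossing the Rubicon” moment, defining a class’s interface inside the unit test, can be sketched roughly like this (once more in Python and unittest rather than C# and NUnit, with all names invented for illustration). Note the ordering: the test invents the constructor and method signatures before the class exists, and only then is the minimal class written to honor them:

```python
import unittest

class RecordingChannel:
    """Hand-rolled fake that records what gets sent through it."""
    def __init__(self):
        self.sent = []

    def send(self, message):
        self.sent.append(message)

class NotifierTests(unittest.TestCase):
    def test_notify_sends_through_the_channel(self):
        channel = RecordingChannel()
        notifier = Notifier(channel)   # constructor signature decided here,
        notifier.notify("build failed")  # before Notifier was ever written
        self.assertEqual(channel.sent, ["build failed"])

# Written only after the test above, and only as much as the test demands.
class Notifier:
    def __init__(self, channel):
        self.channel = channel

    def notify(self, message):
        self.channel.send(message)
```

This is the inverse of generating tests from classes: the class’s entire public face was dictated by its first client, which is precisely why nothing about it resists testing after the fact.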

A number of TDD and Agile community folks have chimed in on this issue at the home base of the TDD community on Yahoo Groups, ultimately driving curious traffic to the guidelines document. Overwhelmingly, people who have taken the time to rate the document have given it the lowest possible rating. As Michael Feathers pointed out on the TDD forum about the article’s rating, it’s a “shame it’s asymptotic to 1.0.”

On a related note, I will be teaching a free, all-day workshop on Test-Driven Development in Houston on Tuesday, November 29th. Ironically, the workshop will be held at the Microsoft Houston offices, but not to worry, I’ll be presenting actual TDD, not the MSDN stuff.