If you are going to be any good at managing software projects, you have to learn more about what works, and what doesn’t, in project management. A research paper published by the Standish Group (available summarized here) indicates that in 2009, 44% of projects were delivered late, 24% failed, and only 32% were delivered on time and within budget. The same report indicates that in 1994 the average project schedule overrun was about 120% and the average cost overrun was about 100%. Estimating a software project accurately is clearly a challenge for every organization.

An accurate estimate does more than support better budgeting; it brings additional benefits. Drawing on the book Software Estimation by Steve McConnell, there are some obvious areas you should focus on to improve your estimation accuracy:

Better planning – A project plan is often set up based on the estimate. If the estimate is accurate and realistic, the project plan will be more useful for progress tracking. You will also have a more accurate view of resource availability.

Better task completion in terms of documentation, testing, training, etc. – If the estimate is not accurate, resources and time will be used up on programming, and the schedules for these other tasks will slip, or the tasks will be omitted entirely.

Better quality – A more accurate estimate leads to a better schedule and less pressure to finish within an unrealistic time frame, which in turn produces higher quality.

One concept you have to understand in order to manage projects better is the “Cone of Uncertainty”.

Introduction to the Cone

Early in a software project, specific details of the nature of the software to be built, details of specific requirements, details of the proposed solution, the written project plan, personnel staffing (including available resources), and other project variables are usually unclear. The variability in these factors contributes variability to project estimates — an accurate estimate of a variable phenomenon must include the variability in the phenomenon itself. As these sources of variability are further investigated and pinned down, the variability in the project diminishes, and so the variability in the project estimates can also diminish. This phenomenon is known as the “Cone of Uncertainty,” which is illustrated in the following figure. As the figure suggests, significant narrowing of the Cone occurs during the first 20-30% of the total calendar time for the project.

Figure 1: The Cone of Uncertainty

The horizontal axis contains common project milestones such as Initial Concept, Approved Product Definition, Requirements Complete, and so on. Because of its origins, this terminology sounds somewhat product oriented. “Product Definition” just refers to the agreed upon vision for the software, or “software concept,” and applies equally to web services, internal business systems, and most other kinds of software projects.

The vertical axis contains the degree of error that has been found in estimates created by skilled estimators at various points in the project. The estimates could be for how much a particular feature set will cost and how much effort will be required to deliver that feature set, or it could be for how many features can be delivered for a particular amount of effort or schedule. This description uses the generic term “scope” to refer to project size in effort, cost, features, or some combination.

As you can see from Figure 1, estimates created very early in the project are subject to a high degree of error. Estimates created at Initial Concept time can be inaccurate by a factor of 4x on the high side or 4x on the low side (also expressed as 0.25x, which is just 1 divided by 4). The total range from high estimate to low estimate is 4x divided by 0.25x, or 16x.
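The arithmetic behind that 16x range can be sketched with a short example. The 0.25x and 4x multipliers come from the Cone at Initial Concept time; the nominal 100 staff-day estimate below is a made-up illustration, not a figure from the article:

```python
# Estimate-range arithmetic at Initial Concept time.
# Multipliers (0.25x / 4x) are the Cone of Uncertainty values at that milestone;
# the nominal 100 staff-day estimate is purely hypothetical.
nominal = 100  # staff-days (hypothetical single-point estimate)
low_multiplier, high_multiplier = 0.25, 4.0

low = nominal * low_multiplier    # 25 staff-days
high = nominal * high_multiplier  # 400 staff-days
spread = high / low               # 16.0 -- the 16x total range

print(f"Plausible range: {low:.0f} to {high:.0f} staff-days ({spread:.0f}x spread)")
```

In other words, a project honestly estimated at 100 staff-days at Initial Concept time could plausibly take anywhere from 25 to 400 staff-days.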

Narrowing the Cone

One question that managers and customers ask is, “If I give you another week to work on your estimate, can you refine it so that it contains less uncertainty?” That’s a reasonable request, but unfortunately it’s not possible to deliver on that request. Research has found that the accuracy of the software estimate depends on the level of refinement of the software’s definition. The more refined the definition, the more accurate the estimate. The reason the estimate contains variability is that the software project itself contains variability. The only way to reduce the variability in the estimate is to reduce the variability in the project itself. Specifics reduce uncertainty.

An important—and difficult—concept is that the Cone of Uncertainty represents the best case accuracy it’s possible to have in software estimates at different points in a project. The Cone represents the error in estimates created by skilled estimators. It’s easily possible to do worse. It isn’t possible to be more accurate; it’s only possible to be lucky.

Another way in which the Cone represents a best case estimate is that, if the project is not well controlled, or if the estimators aren’t very skilled, estimates can fail to improve as shown by the Cone. Figure 2 shows what happens when the project isn’t conducted in a way that reduces variability—the uncertainty isn’t a Cone, but rather a Cloud that persists to the end of the project. The issue isn’t really that the estimates don’t converge; the issue is that the project itself doesn’t converge, that is, it doesn’t drive out enough variability to support more accurate estimates.

Figure 2: The Cloud of Uncertainty

The Cone narrows only as you make decisions that eliminate variability. As Figure 3 illustrates, defining the product vision (including committing to what you will not do) reduces variability. Defining requirements—again, including what you are not going to do—reduces it further. Designing the user interface helps reduce the risk of variability arising from misunderstood requirements. Of course, if the product isn’t really defined, or if the Product Definition gets redefined later, the Cone will widen again and estimation accuracy will suffer.
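As a rough sketch of how the Cone narrows milestone by milestone, the multipliers below are the commonly cited Cone of Uncertainty values reported in McConnell’s Software Estimation (they are standard published numbers, not values read off this article’s figures), again applied to a hypothetical 100 staff-day nominal estimate:

```python
# Commonly cited Cone of Uncertainty effort multipliers (low, high) per
# milestone, from McConnell's "Software Estimation". The nominal estimate
# is a hypothetical illustration.
CONE = {
    "Initial Concept":             (0.25, 4.00),
    "Approved Product Definition": (0.50, 2.00),
    "Requirements Complete":       (0.67, 1.50),
    "UI Design Complete":          (0.80, 1.25),
    "Detailed Design Complete":    (0.90, 1.10),
}

nominal = 100  # staff-days (illustrative)
for milestone, (lo, hi) in CONE.items():
    # Each successive milestone produces a tighter plausible range.
    print(f"{milestone:28s} {nominal * lo:6.0f} .. {nominal * hi:6.0f} staff-days")
```

Notice that each milestone you complete shrinks the ratio between the high and low bounds — from 16x at Initial Concept down to roughly 1.2x once detailed design is done — which is exactly the narrowing the Cone depicts.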

Figure 3: Forcing the Cone of Uncertainty to Narrow

Relationship Between the Cone of Uncertainty and Commitment