Let me begin with a case in point. More than a decade ago, I and the other future AgileTek co-founders received a functional specification from a large ($13 billion today) consumer products company. It was not an overly complex system, but the functional specification ran to more than 400 pages. The painstaking detail of the document was impressive: every aspect of the user interface, the validation rules, and exactly how everything was to work was spelled out. We got the job.
While the software was intended for use by the field sales force, our customer was the information technology (IT) organization. We suggested that perhaps it would be wise for our development team to sit down with some of the intended users and review the specifications. We were told that the IT folks had already done that and, moreover, the effort had taken up more of the users' time than they wanted to give; there was no need for any further review. All we needed to do was to build and test the software to spec - what we call spec conversion in our business.
What seemed like a straightforward task of turning the specifications into bits and bytes got complicated when we discovered that what it said on page 83 contradicted what it said on page 183, and so forth. Could we have possibly analyzed, absorbed, and understood those 400 pages well enough to catch such problems before we began? Of course not. More importantly, do you think that anyone in the sales force really analyzed, absorbed, and understood those 400 pages even though they approved the specification? Most assuredly not! Their eyes probably glazed over around page 20, and they had no choice but to approve a specification they had neither the time nor the skill to understand.
This is, in a nutshell, why I've switched bDistributed.com away from doing RFP (Request for Proposal) work to working on a time-and-materials basis using an iterative agile methodology. I've had people tell me that "the industry has been able to do spec work for 30 years," but the fact of the matter is that the industry has been trying and failing to do it for 30 years.
The only way to really succeed at spec work is to significantly pad both your time and cost estimates to build yourself enough of a buffer to handle all of the unforeseen circumstances. In today's economy this is a losing proposition; too many competitors are too hungry to be honestly realistic with their estimates and wind up lowballing their bids to get work.
I think it's far better for everybody involved — the client, the service provider, and the eventual end users — to use an iterative, agile approach to software development. More details on the approach I've chosen are in an article I wrote late last year for my company's web site, Story-Driven Planning: Delivering Business Value Fast.
In a nutshell, the client determines what features they want in the form of "user stories," each of which will take less than a single one-to-three-week iteration to complete. (Iteration length is fixed and chosen up front; two-week iterations are the norm, though one-week iterations work well on smaller projects.) The developers estimate the user stories in abstract units of effort (points) and estimate how many points they can complete in an iteration. The client prioritizes the user stories and the developers work on them to completion. At the end of an iteration, the number of points' worth of stories actually completed determines how many points' worth can be chosen for the next iteration. And at the beginning of the next iteration, the client gets to add new stories, remove stories, and re-prioritize the rest.
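To make that planning loop concrete, here is a minimal sketch in Python. The story titles, point values, and the 14-point velocity are all hypothetical, and the rule of skipping a story that doesn't fit while still filling the remaining budget with smaller ones is just one reasonable choice, not the only way to do it:

    from dataclasses import dataclass

    @dataclass
    class Story:
        title: str
        points: int    # developer estimate in abstract effort units
        priority: int  # lower number = more important to the client

    def plan_iteration(backlog, last_velocity):
        """Fill the next iteration, in client priority order, with as many
        points' worth of stories as were completed last iteration."""
        planned, budget = [], last_velocity
        for story in sorted(backlog, key=lambda s: s.priority):
            if story.points <= budget:
                planned.append(story)
                budget -= story.points
        return planned

    # Hypothetical backlog; the client owns the priorities.
    backlog = [
        Story("Enter a sales order", points=5, priority=1),
        Story("Print an order confirmation", points=3, priority=2),
        Story("Search customer purchase history", points=8, priority=3),
        Story("Export orders to a spreadsheet", points=5, priority=4),
    ]

    # Suppose 14 points were completed last iteration: the 8-point story
    # doesn't fit after the first two, so the smaller 5-point story is
    # pulled forward instead.
    for story in plan_iteration(backlog, last_velocity=14):
        print(story.points, story.title)

In practice the numbers come from the team's own history rather than a formula, but the essential feedback loop (last iteration's completed points set next iteration's budget) is what keeps the plan honest.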
This means the client is always in charge of what's being worked on and gets the features that are most important to them delivered first. And since user stories are always worked through to completion, they're ready to ship at the end of an iteration — not "implemented but needing QA."
