
Saturday, May 15, 2010

Using simulation software for better project estimation

Most technology professionals I've encountered have difficulty creating accurate estimates. Unclear requirements, differences in personnel skill, feature difficulty, and scope creep can all contribute to uncertainty in estimates. There are, of course, ways of working around these problems, such as using predetermined formulas, padding estimates, or making frequent adjustments. It's about time I.T. professionals added simulation tools to their tool belts.


(In the interests of full disclosure, I'm writing this article with Palisade's @RISK software in mind, since that's the software I used in my Quantitative Analysis course. I'm reasonably certain there are similar software packages out there.)


How would these simulators work for estimating an I.T. project? First, break your effort down into smaller, manageable components. For the sake of example, let's say you need to add a new feature to an existing web site. You know there will be 5 new pages, 7 new tables, and some business logic. However, the business logic is extremely difficult, and one of the pages will require quite a bit of extra work, perhaps using an unfamiliar technology. When you break it down, your estimate might look like this:
  1. UI for 4 simple pages: 2 hours each (+/- 1 hour)
  2. 7 database tables: 1 hour each (30 minutes minimum, 2 hours maximum)
  3. Complex UI for one page: 8 hours (could be 6, but it could explode to 16)
  4. Business logic: 16 hours (but for whatever reason, you really don't know)
You could enter these assumptions into the software (#4 would probably be modeled with a normal distribution) and it would tell you the average amount of time as well as the best- and worst-case scenarios. Without the software, all we could say is that the effort might take anywhere from 20 to 70 hours. With it, we can reasonably say the effort will likely take somewhere between 30 and 50 hours.
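
To make that concrete, here is a minimal Monte Carlo sketch in plain Python rather than @RISK (which I can't reproduce in a blog post). The distribution choices are my own assumptions for illustration: triangular distributions for items 1 through 3, and a normal distribution with a guessed standard deviation of 4 hours for item 4.

```python
# Minimal Monte Carlo sketch of the estimate above (plain Python, not @RISK).
# Distribution choices are assumptions: triangular for items 1-3,
# normal for the genuinely uncertain business-logic item (#4).
import random

def simulate_once():
    hours = 0.0
    # 1. Four simple pages: 2 hours each, +/- 1 hour (triangular 1-2-3)
    hours += sum(random.triangular(1, 3, 2) for _ in range(4))
    # 2. Seven tables: 1 hour each, 30 minutes minimum, 2 hours maximum
    hours += sum(random.triangular(0.5, 2, 1) for _ in range(7))
    # 3. Complex page: most likely 8 hours, as low as 6, could explode to 16
    hours += random.triangular(6, 16, 8)
    # 4. Business logic: 16 hours, but you really don't know -- assume a
    #    normal distribution (mean 16, std dev 4 is a guess)
    hours += max(0.0, random.gauss(16, 4))
    return hours

trials = sorted(simulate_once() for _ in range(10_000))
print(f"mean:            {sum(trials) / len(trials):.1f} hours")
print(f"5th percentile:  {trials[int(0.05 * len(trials))]:.1f} hours")
print(f"95th percentile: {trials[int(0.95 * len(trials))]:.1f} hours")
```

With these particular assumptions, the 5th and 95th percentiles land roughly in the 30-to-50-hour range described above, which is the point: the simulated spread is much tighter than the naive 20-to-70-hour range.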


This was a relatively simple example for demonstration purposes. I can't speak for all risk-analysis software, but @RISK makes it fairly easy for a programmer to add some pretty complex logic to a scenario. You could easily account for staffing uncertainty, add provisions for scope creep, or add profit (if you're a consulting firm) to your model. The sky is the limit.
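
As a hypothetical example of that kind of extension, the sketch above could be wrapped with a scope-creep factor and a billing rate; both numbers below are made up purely for illustration.

```python
# Hypothetical extension of simulate_once() from the earlier sketch:
# layer a scope-creep factor and a billing rate (both made-up numbers)
# on top of the simulated hours.
def simulate_project():
    base_hours = simulate_once()
    creep = random.triangular(1.0, 1.5, 1.1)  # assume 0-50% scope growth, most likely ~10%
    total_hours = base_hours * creep
    fee = total_hours * 125                   # assumed hourly rate, for a consulting model
    return total_hours, fee
```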


While the software can be useful, it does not protect you from errors caused by false assumptions. In the example above, I didn't include getting the data from the database into the business logic layer, nor did I include testing. If components like these aren't built into the estimate, the result is going to be low. Therefore, treat the numbers you get back from the model with caution.
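
If you catch an omission like that, the fix is simply more line items in the model; the ranges below are purely illustrative guesses.

```python
# The pieces I left out -- data access and testing -- added as two more
# line items (the ranges here are purely illustrative guesses).
def simulate_once_complete():
    hours = simulate_once()
    hours += random.triangular(2, 6, 3)   # getting data into the business logic layer
    hours += random.triangular(4, 10, 6)  # testing
    return hours
```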


If you're interested in reading more about the subject, I would suggest the textbook I learned from, Data Analysis and Decision Making by Albright, Winston, and Zappe. The examples they give are not I.T. specific, but it shouldn't be hard to apply the concepts to your estimation process. If nothing else, keep watching this blog, since I plan to post about the subject from time to time.

1 comment:

  1. An excellent idea. I have tried pushing the use of modeling tools like @RISK at work ever since I used it in quantitative analysis, but haven't had anyone interested. Our programming department is numerically oriented enough, and it desperately needs help in this area, so they might just take to the idea.
