THE PELICAN WEB
WORK SAMPLE

SOFTWARE PROCESS IMPROVEMENT

In software development, "change is the name of the game." Change management is crucial in all kinds of software development, regardless of programming method, application area, or infrastructure tools; in particular, it applies to embedded software. Three best practices for change management are presented here: timelines (Gantt charts), relational databases for end-to-end traceability, and stability metrics. These simple techniques are not sufficient to ensure the production of high quality software at reasonable cost and in reasonable time, but they are necessary.

OUTLINE

  • The urgent need for improving software quality and software reliability
  • Myth 1: "Software is bad because development is driven by delivery schedule"
  • Myth 2: "Software is bad because requirements keep changing all the time"
  • Myth 3: "Software is bad because there are no practical metrics to track progress"
  • Summary and recommendations

SYSTEM DYNAMICS ANALYSIS FOR SOFTWARE PROCESS IMPROVEMENT

The urgent need for improving software quality and software reliability

The cost of bad software runs into the billions and is increasing rapidly. At a time when millions go hungry every day, this is outrageous. And there are other considerations, such as safety and security. It is time to dispel some myths.

"Every other industry would consider it an extremely unethical practice to pass on to the customer products of the levels of quality found in software products." -- L.N. Rajaram, The Watts Humphrey Software Quality Institute, Chennai, India, 2005 (Foreword to "The Art of Creative Destruction," by Rajnikant Puranik, Shroff Publishers, 2005)

List of Acronyms

CPI    Continuous Process Improvement
CUT    Code and Unit Test
CR     Customer Problem Report
DCR    Design Change Request
FVT    Functional Verification Test (i.e., Software Integration Tests)
HLD    High Level Design (Architecture and Design)
KCSI   Thousands of New or Changed Source Instructions
LLD    Low Level Design (HLD and LLD increasingly coming together)
PTR    Problem Tracking Report
PVT    Product Verification Test (i.e., Total Build Integration Test)
SCM    Software Configuration Management
SDP    Software Development Plan
SPI    Software Process Improvement
SQP    Software Quality Plan
SR     Software Requirement
SRS    Software Requirements Specification (HLD + LLD)
SVT    System Verification Test (i.e., Target Environment)

Myth 1: "Software is bad because development is driven by delivery schedule"

The real problems are the limitations of open-loop planning and the avoidance of continuous replanning. That high quality software cannot be developed rapidly is a myth. Some of the best software has been developed under tight time and budget constraints. The best practice is to start with a software development plan, something like:

[sdp1]

Refinement in understanding of the requirements starts when design and code activities provide feedback that reduces ambiguities remaining in the initially defined requirements. Derived requirements may also emerge that could not have been anticipated at project start. And, inevitably, the customer submits changes (additions, deletions, revisions) to the requirements at any time. Sometimes it becomes a matter of continuous replanning:
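
Continuous replanning can be as simple as recomputing the projected end date whenever actuals or re-estimates change. A minimal sketch, with hypothetical task names and durations (calendar days, for simplicity):

```python
from datetime import date, timedelta

# Hypothetical plan: planned duration per task, plus an actual (or re-estimated)
# duration once the task is underway; None means "still using the plan".
plan = [
    {"task": "Requirements (SRS)", "planned": 20, "actual": 28},   # slipped
    {"task": "HLD + LLD",          "planned": 30, "actual": 30},
    {"task": "Code and Unit Test", "planned": 40, "actual": None}, # in progress
    {"task": "FVT/PVT/SVT",        "planned": 35, "actual": None},
]

def projected_end(start: date, tasks) -> date:
    """Replan: use actuals where known, planned durations otherwise."""
    day = start
    for t in tasks:
        days = t["actual"] if t["actual"] is not None else t["planned"]
        day += timedelta(days=days)
    return day

start = date(2005, 1, 3)
baseline = start + timedelta(days=sum(t["planned"] for t in plan))  # start + 125 days
current = projected_end(start, plan)                                # start + 133 days
print("slip so far:", (current - baseline).days, "days")  # slip so far: 8 days
```

The point is not the arithmetic but the habit: the projection is recomputed from current data every time a task slips or a requirement changes, instead of defending a frozen baseline.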

[sdp2]

Feedback from testing (in the form of problems found) precipitates the need for further adjustments and replanning during the tail end of the project. This often opens a can of worms, and the project timeline begins to look like a plate of spaghetti and meatballs. This is when both original and changed requirements start falling through the cracks, as fixes are made that cause regressions in previously good software.

[sdp3]

Myth 2: "Software is bad because requirements keep changing all the time"

This myth evaporates with change management via end-to-end requirements traceability. The best way to track requirements (and changes in requirements) is to use a relational database of the software development process. Linking each software requirement to design parts, code parts, test cases, and problems found, is the best practice for change management.

Sorting by SRs assures (by exposing holes) that all software development and test steps down the line are planned and done as the software development project unfolds.
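
Exposing the holes is then a single outer join: any SR with no test case yet is a gap in the plan. A self-contained sketch (illustrative names, SQLite):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sr        (sr_id TEXT PRIMARY KEY, text TEXT);
CREATE TABLE test_case (tc_id TEXT PRIMARY KEY, sr_id TEXT REFERENCES sr);

INSERT INTO sr VALUES ('SR-1', 'Log all login attempts'),
                      ('SR-2', 'Lock account after 3 failures');
INSERT INTO test_case VALUES ('FVT-7', 'SR-1');  -- SR-2 has no test case yet
""")

# LEFT JOIN keeps every SR; a NULL on the test side exposes the hole.
holes = conn.execute("""
    SELECT sr.sr_id FROM sr
    LEFT JOIN test_case ON test_case.sr_id = sr.sr_id
    WHERE test_case.tc_id IS NULL
""").fetchall()
print(holes)  # [('SR-2',)] -- this requirement would fall through the cracks
```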

[traceability1]

Sorting by DCRs makes it possible to check, at any time, all the rippling effects (all the threads within the shaded area) of the change in conjunction with DCR approval.

[traceability2]

Sorting by PTRs makes it possible to check, at any time, all the rippling effects (all the threads within the shaded area) of PTR disposition -- fix and test, cannot fix, cannot test, postpone fix, etc.

[traceability3]
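
Sorting by DCR or PTR is the same join run from the other end: start at the change (or the problem) and pull every thread it touches. A self-contained sketch with illustrative names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sr        (sr_id TEXT PRIMARY KEY);
CREATE TABLE dcr       (dcr_id TEXT PRIMARY KEY, sr_id TEXT REFERENCES sr);
CREATE TABLE code      (part_id TEXT PRIMARY KEY, sr_id TEXT REFERENCES sr);
CREATE TABLE test_case (tc_id TEXT PRIMARY KEY, sr_id TEXT REFERENCES sr);
CREATE TABLE ptr       (ptr_id TEXT PRIMARY KEY, tc_id TEXT REFERENCES test_case,
                        disposition TEXT);

INSERT INTO sr VALUES ('SR-1'), ('SR-2');
INSERT INTO dcr VALUES ('DCR-9', 'SR-1');
INSERT INTO code VALUES ('logger.c', 'SR-1'), ('lockout.c', 'SR-2');
INSERT INTO test_case VALUES ('FVT-7', 'SR-1'), ('FVT-8', 'SR-2');
INSERT INTO ptr VALUES ('PTR-42', 'FVT-7', 'fix and test');
""")

# Ripple of approving DCR-9: every code part and test case hanging off its SR.
ripple = conn.execute("""
    SELECT code.part_id, test_case.tc_id
    FROM dcr
    JOIN code      ON code.sr_id = dcr.sr_id
    JOIN test_case ON test_case.sr_id = dcr.sr_id
    WHERE dcr.dcr_id = 'DCR-9'
""").fetchall()
print(ripple)  # [('logger.c', 'FVT-7')] -- retest FVT-7 if logger.c changes

# Disposition of PTR-42: which SR (hence which code) the fix must not regress.
thread = conn.execute("""
    SELECT ptr.disposition, test_case.sr_id
    FROM ptr JOIN test_case ON test_case.tc_id = ptr.tc_id
    WHERE ptr.ptr_id = 'PTR-42'
""").fetchone()
print(thread)  # ('fix and test', 'SR-1')
```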

Myth 3: "Software is bad because there are no practical metrics to track progress"

There are universally applicable stability curves (specifically, S-curves) for requirements, design, code, and test. These curves have been used for over twenty years; there is nothing mysterious about them. Good software development organizations display these curves, updated daily, for each project. Basically, they are the cumulative curves of approved requirements, approved design parts, approved code parts, tested code parts (by phase of testing), problems found (PTRs opened), and problems fixed and retested (PTRs closed). The time-phased stabilization of these curves is a sure sign that the project is under control, and that high quality software is being developed.
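
The stabilization check itself is trivial to compute: cumulative counts per period, stable when arrivals flatten and the two curves converge. A sketch with invented weekly numbers:

```python
from itertools import accumulate

# Hypothetical weekly counts over one test phase.
opened_per_week = [5, 9, 12, 8, 4, 2, 1, 0]
closed_per_week = [0, 3, 7, 10, 9, 6, 4, 2]

opened = list(accumulate(opened_per_week))  # curve 1: cumulative PTRs opened
closed = list(accumulate(closed_per_week))  # curve 2: cumulative PTRs closed

# Stabilized when new arrivals have flattened and the backlog has drained.
backlog = opened[-1] - closed[-1]
flattened = sum(opened_per_week[-2:]) <= 1
print(opened)  # [5, 14, 26, 34, 38, 40, 41, 41]
print(closed)  # [0, 3, 10, 20, 29, 35, 39, 41]
print("stabilized:", flattened and backlog == 0)  # stabilized: True
```

Plotted, curve 1 is the S-curve of problems found and curve 2 lags it; a test phase that never flattens, or a backlog that never drains, is visible at a glance.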

Example 1 - SPI (Feedback from all Phases)

[spicase1]

Example 2 - SPI (No Feedback from White Phases)

[spicase2]

Example 3 - SPI (No Feedback from White Phases)

[spicase3]

Example 4 - SPI (No Feedback from White Phases)

[spicase4]

Example 5 - SPI (No Feedback from White Phases)

[spicase5]

Example 6 - SPI (No Feedback from White Phases + "Expediting")

[spicase6]

Example 7 - Stabilization of the Requirements Phase (SRs):

[metric1]

Example 8 - Stabilization of a Test Phase (1=PTRs opened, 2=PTRs closed):

[metric2] (x-axis: Time)

Summary and recommendations

  • It makes good business sense to develop good software.
  • Myth 1 is managed by working the SDP and SQP as "living documents".
  • Myth 2 is managed by capturing traceabilities with a relational database.
  • Myth 3 is managed by plotting cumulative progress stabilization curves.
  • ISO 9001:2000 and ISO/IEC 90003:2004 provide the best guidance for SPI.

BEST PRACTICES

GOOD BOOKS

  • Abdel-Hamid, Tarek and Stuart E. Madnick - Software Project Dynamics - Prentice-Hall, 1991
  • Alexander, Ian - Writing Better Requirements - Addison-Wesley, 2002
  • Alexander, Ian - Scenarios, Stories, Use Cases - John Wiley, 2004
  • Brooks Jr., Frederick P. - The Mythical Man-Month - Addison-Wesley, 1975, 1995
  • Jensen, Randall W. and Charles C. Tonies - Software Engineering - Prentice-Hall, 1979
  • Jones, Capers - Patterns of Software Systems Failure and Success - International Thomson Computer Press, 1996
  • McCarthy, James - Dynamics of Software Development - Microsoft Press, 1995
  • Puranik, Rajnikant - The Art of Creative Destruction - Shroff Publishers, 2005
  • Russell, J. P., Ed. - The Quality Audit Handbook - American Society for Quality (ASQ), 1997
  • Steward, Donald V. - Systems Analysis and Management - Petrocelli Books, 1981

HORROR STORIES