Title: Defect Removal Effectiveness Model
1. Defect Removal Effectiveness Model
- Product Quality Depends on Many Factors
- People
- Training
- Tools
- Activities
- Process
- Our Software Development Process Should Include
- Defect Prevention Activities
- Defect Removal Activities
2. Defect Prevention and Removal
- Defect Prevention Activities
- Reuse of Proven, Existing Material
- Defensive Software Activities
- Well Documented Requirements and Tracking
- Business Process and Usage
- User Background
- Constraints
- Well Documented Design and Tracking
- Data Flow
- Functions
- Inter-related Structures
- Error Conditions
- Well Commented Source Code
- Pre- and Post-Condition Statements in Source Code
- Use Available Programming Constructs
- Switch vs multiple ifs
- Functions vs Multi-Parameter Subroutine Calls
- Abstract Classes and Interfaces (Java)
- Configuration and Change Management
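Two of the prevention practices above, pre/post-condition statements and using a switch instead of chained ifs, can be combined in a small Java sketch. The class, method, and zone names here are invented for illustration only:

```java
// Hypothetical example: defect prevention via explicit pre/post-condition
// checks and a switch statement instead of multiple chained ifs.
public class ShippingRate {

    // Pre-condition: weightKg must be positive.
    // Post-condition: the returned rate is non-negative.
    public static double rateFor(String zone, double weightKg) {
        if (weightKg <= 0) {                        // pre-condition check
            throw new IllegalArgumentException("weightKg must be > 0");
        }
        double perKg;
        switch (zone) {                             // switch vs. multiple ifs
            case "domestic":      perKg = 1.0; break;
            case "continental":   perKg = 2.5; break;
            case "international": perKg = 4.0; break;
            default:
                throw new IllegalArgumentException("unknown zone: " + zone);
        }
        double rate = perKg * weightKg;
        assert rate >= 0 : "post-condition violated"; // post-condition check
        return rate;
    }

    public static void main(String[] args) {
        System.out.println(rateFor("domestic", 2.0)); // 2.0
    }
}
```

The switch makes the set of legal zones explicit, and the default branch turns an unanticipated input into an immediate, visible failure rather than a silent wrong rate.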
3. Defect Prevention and Removal
- Defect Removal Activities
- Pre-coding
- Reviews and Inspections
- Coding
- Reviews and Inspections
- Unit Test
- Post-coding
- Functional Test and fixes
- Component Test and fixes
- System and Regression Test and fixes
- Post-release
- Customer Support and fixes
4. (Testing) Reviews and Inspections
- Mostly Applied to Code and Pre-code Materials
- Involves one or more people other than the author
- Requires a certain amount of preparation
- Examining the material for completeness, correctness, and consistency
- Compare against the material from the previous activity(s)
- Analyze the correctness of the result produced by the activity
- Focused on discovering defects
- Activity is relatively static or non-execution oriented
- Requires recording of problems found
- Requires follow-up on fixes of the problems and on inspection results
5. (Testing) Machine Testing
- Applied to Machine Executable Material
- Code
- On-line Help
- Messages and Information Boxes
- Other User Interface
- Performed by Author(s) and Mostly Others
- Has Several Major Steps
- Test planning and preparation
- Development of test scenarios and test cases
- Running the tests
- Recording the problems found and managing the fixes
- Analysis of the test results
6. More on Testing
Three bodies of material are involved in testing:
- Requirements and Design Specs
- Executable Code, Help, Messages, etc.
- Test Scenarios and Test Cases
How do these three sets interact and relate?
- Size (amount of material)
- Overlaps
- Coverage of specs by executables and by tests
7. White Box and Black Box Testing
(Figure: the executables may cover more than the specs or less than the specs; the tests relate to both the code and the requirements.)
- Black Box testing is functionally oriented: done without looking at the inside of the actual executables
- White Box testing is coverage oriented: done after looking at the inside of the actual executables
- Black Box vs. White Box: Most of Us Need Both
8. A Test Case Example
A test case records:
- Test Case Purpose
- Any Pre-Condition
- Input(s)
- Expected Output(s)
- Any Post-Condition
The test results record:
- Test Date and Test Person
- Actual Result
- Problem Description
- Fix Status
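The template above maps directly onto a data structure. Here is a minimal Java sketch; the field names follow the slide, while the class name and sample values are invented for illustration:

```java
// Hypothetical record mirroring the test-case template fields on this slide.
public class TestCaseExample {

    static class TestCase {
        // Specification fields, filled in during test planning:
        String purpose;
        String preCondition;
        String input;
        String expectedOutput;
        String postCondition;
        // Result fields, filled in when the test is run:
        String testDate;
        String testPerson;
        String actualResult;
        String problemDescription;
        String fixStatus;

        // A test passes when the actual result matches the expected output.
        boolean passed() {
            return expectedOutput.equals(actualResult);
        }
    }

    public static void main(String[] args) {
        TestCase tc = new TestCase();
        tc.purpose = "Verify login with valid credentials";
        tc.input = "user=alice, password=secret";
        tc.expectedOutput = "login OK";
        tc.actualResult = "login OK";   // recorded after running the test
        System.out.println(tc.passed() ? "PASS" : "FAIL"); // PASS
    }
}
```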
9. Defect Removal Activity Model
Each defect removal activity has four associated counts:
- Number of defects upon entering the activity
- Number of errors introduced in this activity
- Number of defects found and removed in this activity
- Number of defects upon exiting the activity
How may we want to represent defect removal effectiveness (DRE)?
DRE = (defects found and removed) / (defects on entry + defects introduced), ranging from 1 (best) to 0 (worst)
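The DRE formula is easy to compute once the counts for an activity are known. A minimal Java sketch, with hypothetical counts chosen for the example:

```java
// Defect removal effectiveness for one activity:
// DRE = removed / (defects on entry + defects introduced),
// ranging from 1 (best) to 0 (worst).
public class Dre {

    public static double dre(int defectsOnEntry, int introduced, int removed) {
        return (double) removed / (defectsOnEntry + introduced);
    }

    public static void main(String[] args) {
        // Hypothetical counts: 100 defects on entry, 20 introduced, 60 removed.
        System.out.println(dre(100, 20, 60)); // 0.5
        // Defects on exit = 100 + 20 - 60 = 60, which the next activity inherits.
    }
}
```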
10. A Complete Set of Defect Removal Activities
The slide shows a matrix: rows are defect removal activities, columns are the software activities (artifacts) in which the removed errors were injected.

Defect Removal | Req Gath | Hi-lev Design | Lo-lev Design | Code/Unit Test case | System Test cases | Total
Req Insp       | n(1,1)   |               |               |                     |                   | n(1,.)
Des Insp       | n(2,1)   | n(2,2)        |               |                     |                   | n(2,.)
. . .          | . . .    | . . .         | . . .         |                     |                   | . . .
C Test         | n(i,1)   | . . .         | n(i,j)        | . . .               |                   | n(i,.)
Sys Test       | . . .    | . . .         | . . .         | . . .               | . . .             | . . .
Total          | n(.,1)   | n(.,2)        | . . .         | . . .               | . . .             | N

where n(i,j) is the number of errors removed by removal activity i on artifact j.
11. Defect Removal Effectiveness Metrics
The effectiveness of defect removal activity i (DRAE_i) may be defined as:
1) n(i,.) / N
2) n(i,.) / ( (sum of n(.,j) over all artifacts j up to and including activity i) - (sum of n(m,.) over all removal activities m before i) )
Note that customer-found problems and unfound errors are not included in this metrics discussion.
12. An Example with Numbers
The cells recoverable from the slide (same matrix layout as before):

Defect Removal | Req Gath | Hi-lev Design | Lo-lev Design | Code/Unit T | System Test | Total
Req Insp       | 45       |               |               |             |             | 45
Des Insp       | 12       | 31            |               |             |             | 43
. . .          | . . .    | . . .         | . . .         | . . .       | . . .       | . . .
Total          | 95       | 85            | . . .         | . . .       | . . .       | 340

(Other individual entries on the slide, including the values 24 and 4 for later rows, cannot be reliably placed.)

where n(i,j) is the number of errors removed by removal activity i on artifact j.
13. Numerical Example
- High-Level Design Inspection effectiveness metrics:
- 1. 43 / 340 ≈ .126
- 2. 43 / ((95 + 85) - 45) = 43 / 135 ≈ .319
- (Remember: not all potential defects are accounted for)
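Both metrics can be reproduced from the example's numbers. In this sketch, the method names are invented; the counts come from the slides (Des Insp removed 43 defects; the Req Gath and Hi-lev Design columns injected 95 and 85; Req Insp, the only earlier removal activity, removed 45; the grand total is N = 340):

```java
// Computes the two DRAE metrics from slide 11 for the high-level
// design inspection, using the numbers from slide 12.
public class DraeExample {

    // Metric 1: share of all defects removed by this activity.
    public static double metric1(int removedByActivity, int totalDefects) {
        return (double) removedByActivity / totalDefects;
    }

    // Metric 2: removed / (defects injected so far - defects removed earlier),
    // i.e. effectiveness against the defects actually present on entry.
    public static double metric2(int removedByActivity,
                                 int injectedUpToHere,
                                 int removedBefore) {
        return (double) removedByActivity / (injectedUpToHere - removedBefore);
    }

    public static void main(String[] args) {
        System.out.printf("metric1 = %.3f%n", metric1(43, 340));         // 0.126
        System.out.printf("metric2 = %.3f%n", metric2(43, 95 + 85, 45)); // 0.319
    }
}
```

Metric 2 is larger because its denominator counts only the 135 defects present when the inspection ran, not all 340 defects ever injected.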
14. Is There Any Possibility of Projecting Field Problems?
- Possibly, with 2 big assumptions
- Let MP = problems found in all the inspections and reviews
- Let PTR = problems found in all the tests
- Let Q = problems to be found by customers (after release in the field)
- Let UP = defects never found as problems
- Let TD = total defects of the software
- So TD = MP + PTR + Q + UP
- Assumption 1: UP = 0
- So TD = MP + PTR + Q
- Assumption 2: effectiveness of inspections = effectiveness of tests
- So MP/TD = PTR / (TD - MP)
- Or MP/PTR = TD / (TD - MP)
- Or MP/PTR = (MP + PTR + Q) / ((MP + PTR + Q) - MP)
- Or MP/PTR = (MP + PTR + Q) / (PTR + Q)
- Or MP (PTR + Q) = PTR (MP + PTR + Q)
15. Defining Defect Removal Effectiveness as Before
- Effectiveness of inspections and reviews, E1
- E1 = MP / TD
- Effectiveness of testing, E2
- E2 = PTR / (TD - MP)
Your thoughts?
Check the textbook's page 174 discussion on µ: a higher value of µ = MP/PTR is said to imply more effective front-end defect removal.
Consider MP = 2, PTR = 1, and TD = 4. Then µ = MP/PTR = 2/1 = 2, which says the front end is more effective?
But E1 = 2/4 = 1/2 and E2 = 1/(4 - 2) = 1/2, so E1 = 1/2 = E2: the same effectiveness!? What do you think about µ?
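The tension above is easy to check numerically. This Java sketch (class and method names are invented) reproduces the slide's example, where µ suggests the front end is better while E1 and E2 come out equal:

```java
// Compares mu = MP/PTR against the two effectiveness metrics E1 and E2
// for the slide's example: MP = 2, PTR = 1, TD = 4.
public class MuVsE {

    public static double mu(double mp, double ptr) { return mp / ptr; }

    public static double e1(double mp, double td) { return mp / td; }

    public static double e2(double ptr, double td, double mp) {
        return ptr / (td - mp);
    }

    public static void main(String[] args) {
        double MP = 2, PTR = 1, TD = 4;
        System.out.println("mu = " + mu(MP, PTR));     // 2.0: front end "better"?
        System.out.println("E1 = " + e1(MP, TD));      // 0.5
        System.out.println("E2 = " + e2(PTR, TD, MP)); // 0.5: same as E1
    }
}
```

The discrepancy arises because µ compares raw counts, while E2 normalizes by the smaller defect population remaining after inspections.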
16. An Interesting Note from the Book (p175-177)
- Using the same notation as before, the assumption that MP/TD = PTR / (TD - MP), and µ = MP/PTR:
- Q = TD / µ^2
- So if we can project total defects and know µ, then we can also project Q.
- Try deriving this by manipulating the last equation on the previous slide: Q = PTR^2 / (MP - PTR)
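Under the slide's assumptions the two expressions for Q agree, which a short Java sketch (names invented for the example) can confirm with the running numbers MP = 2, PTR = 1:

```java
// Projects customer-found problems Q two equivalent ways:
// Q = TD / mu^2 (the book's form) and Q = PTR^2 / (MP - PTR)
// (the form derived from slide 14), assuming UP = 0 and equal
// inspection and test effectiveness.
public class ProjectQ {

    public static double qFromMu(double td, double mu) {
        return td / (mu * mu);
    }

    public static double qDerived(double mp, double ptr) {
        return (ptr * ptr) / (mp - ptr);
    }

    public static void main(String[] args) {
        double MP = 2, PTR = 1;
        double mu = MP / PTR;                 // 2.0
        double q = qDerived(MP, PTR);         // 1.0
        double TD = MP + PTR + q;             // 4.0, since UP = 0
        System.out.println(qFromMu(TD, mu));  // 1.0, matching qDerived
    }
}
```

Note the derived form blows up when MP <= PTR (µ <= 1), i.e. when the front end finds no more defects than testing does, so the projection only makes sense for µ > 1.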
17. Defect Removal Effectiveness and CMM
- It has been estimated (by Capers Jones) that defect removal effectiveness differs across levels of process capability maturity:
- Level 1: 85%
- Level 2: 89%
- Level 3: 91%
- Level 4: 93%
- Level 5: 95%
- Caution: it is not clear what is really being measured here, and thus not clear what removal effectiveness really means here. Discussion?