Title: SOFTWARE METRICS
1. SOFTWARE METRICS
- MN3309 Session 27
- Prof. Mark Nissen
2. Agenda
- Web Software Example
- Mythical Man-Month
- Software Metrics
- Metrics Exercises
- Summary
3. Web Software Example
- Web site has many pages (IOC)
- New addition (FOC)
- Create using word processor, ftp to site
- 10 pages (static) - 10 IF, 10 EQ
- 1 image (x10) - 10 IF
- 10 links to external sites (x10) - 100 EQ
- 10 links from IOC app (x10) - 0 EIF (tallied in the sketch below)
- Consistency
- Important to be consistent in counting
- Ensure common interpretation/measure
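A minimal tally sketch of the counts above in Python; the category labels and item counts come straight from the bullets (one counted item per IF/EQ/EIF entry), with no formal IFPUG weighting applied, so treat it as illustration only.

```python
# Hypothetical tally of the function-point-style counts for the new Web addition.
# Categories and counts are taken from the slide; no IFPUG weights are applied.
counts = {
    "IF  (10 static pages)": 10,
    "EQ  (10 static pages)": 10,
    "IF  (1 image x 10)": 10,
    "EQ  (10 external links x 10)": 100,
    "EIF (10 links from IOC app)": 0,   # counted as zero on the slide
}

for item, n in counts.items():
    print(f"{item:30s} {n:4d}")
print(f"{'Total counted items':30s} {sum(counts.values()):4d}")
```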
4. Counting Example
[Counting worksheet figure; visible category tallies: (0) (0) (10), (0) (0) (0) (100), (20)]
5. Counting Example
6. Calibration
- Past project
- Same environment, counting rules
- 25 AFP/page
- 4 SLOC/AFP
- 320 hours for 15 new pages
- Parametrics
- 0.85 hours/AFP = 320 / (15 x 25)^1.0 (productivity? scale?)
- E_c = A(KSLOC)^1.0 --> A = E_c/KSLOC = 320/1.5 = 213 hours/KSLOC
- E_p = 213 x 1.0 KSLOC = 213 hours (worked in the sketch below)
- Reasonable? Concerns?
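A minimal sketch of the calibration arithmetic above, assuming the simple linear parametric form E = A x KSLOC^1.0 that the slide implies; all inputs are the figures given on this slide.

```python
# Calibrate E = A * KSLOC^1.0 on the past project, then apply it to the new 10-page addition.
AFP_PER_PAGE = 25       # adjusted function points per page
SLOC_PER_AFP = 4        # source lines per AFP
PAST_HOURS = 320        # actual effort, past project
PAST_PAGES = 15
NEW_PAGES = 10          # the new Web addition

def ksloc(pages):
    """Pages -> thousands of SLOC, via AFP."""
    return pages * AFP_PER_PAGE * SLOC_PER_AFP / 1000.0

hours_per_afp = PAST_HOURS / (PAST_PAGES * AFP_PER_PAGE)   # ~0.85 hours/AFP
A = PAST_HOURS / ksloc(PAST_PAGES)                          # ~213 hours/KSLOC
estimate = A * ksloc(NEW_PAGES) ** 1.0                      # ~213 hours for 10 pages

print(f"{hours_per_afp:.2f} hours/AFP, A = {A:.0f} hours/KSLOC, estimate = {estimate:.0f} hours")
```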
7. Mythical Man-Month
- Programmers are optimists?
- Man-months mythical?
- 2 components?
- 30/year max manpower buildup
- ROT (rule of thumb) for scheduling
- Coding x?
- Planning y? Test & integration z? (Brooks's fractions sketched below)
- Gutless estimating?
- Sharp milestones?
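For the rule-of-thumb questions above: Brooks's published answer in The Mythical Man-Month allocates roughly 1/3 to planning, 1/6 to coding, 1/4 to component test, and 1/4 to system test. A small sketch applying those fractions; the 12-month schedule is an illustrative assumption, not a figure from the slides.

```python
# Brooks's scheduling rule of thumb, applied to an illustrative 12-month schedule.
BROOKS_ROT = {
    "planning": 1 / 3,
    "coding": 1 / 6,
    "component test": 1 / 4,
    "system test": 1 / 4,
}

SCHEDULE_MONTHS = 12    # illustrative assumption
for phase, fraction in BROOKS_ROT.items():
    print(f"{phase:15s} {fraction * SCHEDULE_MONTHS:4.1f} months")
```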
8. Estimate Bootstrapping
- Perform a sample of work, record time
- Time by task types (plan, meet, code, etc.)
- Use to calibrate estimates
- 1 new Web page
- 2 hours to create (multiply by 6?)
- How long to plan? Test? Integrate?
- Time associated with old Web pages?
- Time not associated with Web pages?
- Triangulate with other methods (a bootstrap sketch follows)
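A minimal bootstrap sketch using the sample figures on this slide (2 hours to create one page, with the suggested x6 factor covering planning, test, and integration); the multiplier is the slide's open question, so treat it as an assumption to be refined by your own recorded times.

```python
# Bootstrap an estimate from a small recorded work sample, then scale it
# with an assumed overhead multiplier for planning, test, and integration.
SAMPLE_PAGES = 1
SAMPLE_CREATE_HOURS = 2.0   # recorded: 2 hours to create one new page
OVERHEAD_MULTIPLIER = 6.0   # slide's "multiply by 6?" -- an assumption, not a measurement
NEW_PAGES = 10

hours_per_page = SAMPLE_CREATE_HOURS / SAMPLE_PAGES
creation_only = hours_per_page * NEW_PAGES
loaded = creation_only * OVERHEAD_MULTIPLIER

print(f"Creation only: {creation_only:.0f} hours; "
      f"with x{OVERHEAD_MULTIPLIER:.0f} overhead: {loaded:.0f} hours")
```

Comparing this loaded figure against the 213-hour parametric estimate from the Calibration slide is one way to triangulate with other methods.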
9. Software Measurement
- Measurement Motivation
- Measurement Life Cycle
- Metrics Usage Groundrules
- Examples of Metrics
- Cautions About Metrics
10. Measurement Motivation
- Key to process improvement
- "Can't manage what you can't measure"
- Monitor risk areas, before crisis
- Basis for rewards & incentives
- Key: how to measure tech progress?
- C-SAWS?
11. Measurement Life Cycle
12. Metrics Usage Groundrules
- Metrics must be
- Understandable?
- Economical?
- Field tested
- Highly-leveraged
- Timely
- Improvement-oriented
- Applied to all life cycle phases
- Useful at multiple levels
- Metrics related to estimates?
13. Typical Software Metrics
- Quality - user satisfaction, Rome Labs
- Size - SLOC, function/feature points
- Complexity - McCabe, Halstead
- Requirements - stability, traceability
- Effort & productivity - man-months per SLOC or FP
- Cost & schedule - $ per man-month and per month, phased
- Scrap & rework - defect/correction rates
- Support - track characteristics (size)
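A minimal sketch of how a few of these metric types roll up from raw project data; every number below is a hypothetical placeholder, not a figure from the slides.

```python
# Roll up a few of the metric types above from hypothetical project data.
from dataclasses import dataclass

@dataclass
class ProjectSnapshot:
    ksloc: float          # size
    effort_mm: float      # effort, man-months
    cost_dollars: float   # cost to date
    defects_found: int    # scrap & rework inputs
    defects_fixed: int

snap = ProjectSnapshot(ksloc=12.0, effort_mm=40.0, cost_dollars=600_000,
                       defects_found=180, defects_fixed=150)

productivity = snap.ksloc * 1000 / snap.effort_mm   # SLOC per man-month
cost_per_mm = snap.cost_dollars / snap.effort_mm    # $ per man-month
defect_density = snap.defects_found / snap.ksloc    # defects per KSLOC
correction_ratio = snap.defects_fixed / snap.defects_found

print(f"{productivity:.0f} SLOC/MM, ${cost_per_mm:,.0f}/MM, "
      f"{defect_density:.1f} defects/KSLOC, {correction_ratio:.0%} corrected")
```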
14. Army STEP Metrics
- Schedule & cost
- Computer resource utilization
- Contractor SEE rating
- Design & requirements stability
- Fault profile
- Complexity
- Breadth & depth of testing
- Reliability
15. F-22 Time-Phased Metrics
16. Program Feedback Control
17. Cautions About Metrics
- Use as indicators, not absolutes
- Only as good as underlying data
- CDRL items & program tracking
- Do not measure everything
- Some metrics universal
- Many program-idiosyncratic
- Evolve with program
- Use multiple metrics, track estimates
- Tie metrics to risk areas & problems
18. Change Point Tracking
19. ERROR SCRs - DISCOVERY & CORRECTION
[Chart: SCR discovery and correction trends - current status]
A key positive indicator will be when the discovery line flattens out as the rate of new errors encountered approaches zero.
The correction rate for Priority 1 & 2 SCRs continues to keep pace with the discovery rate.
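A minimal sketch of those two indicators, assuming weekly cumulative counts of SCRs discovered and corrected; the sample series is hypothetical.

```python
# Flag the two indicators above: a flattening discovery curve (new SCRs -> 0)
# and a correction rate that keeps pace (small open backlog).
discovered = [10, 25, 45, 60, 68, 71, 72]   # cumulative SCRs found, weekly (hypothetical)
corrected  = [ 5, 18, 38, 55, 64, 69, 71]   # cumulative SCRs fixed, weekly (hypothetical)

new_last_week = discovered[-1] - discovered[-2]
open_backlog = discovered[-1] - corrected[-1]

print(f"New SCRs last week: {new_last_week} (discovery flattening as this approaches 0)")
print(f"Open SCR backlog: {open_backlog} (correction keeping pace if this stays small)")
```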
20. Program Stretch-out Effects 1
[Staffing profile chart - Baseline Program: people (P) vs. months; B/L = 140 P-Mo over 5 Mo]
21. Program Stretch-out Effects 2
[Staffing profile chart: Baseline (B/L) = 140 P-Mo over 5 Mo; Stretch-out plan (S-P) = 140 P-Mo over 7 Mo]
22. Program Stretch-out Effects 3
[Staffing profile chart: Baseline = 140 P-Mo over 5 Mo; Stretch-out plan (S-P) = 140 P-Mo over 7 Mo; Stretch-out actual (S-A) = 215 P-Mo over 8 Mo; core staff level shown]
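A minimal sketch of the arithmetic behind these three charts, using only the legend figures (140 P-Mo over 5 months baseline, 140 P-Mo planned over 7, 215 P-Mo actual over 8); the reading that carrying a core staff across the longer schedule drives the growth follows the slide's own framing.

```python
# Stretch-out effect: the schedule grows but average staffing cannot shrink in
# proportion, so total person-months grow.
programs = {
    "Baseline (B/L)":           (140, 5),   # (person-months, months)
    "Stretch-out plan (S-P)":   (140, 7),
    "Stretch-out actual (S-A)": (215, 8),
}

base_pm, _ = programs["Baseline (B/L)"]
for name, (pm, months) in programs.items():
    avg_staff = pm / months
    growth = (pm - base_pm) / base_pm
    print(f"{name:26s} {pm:3d} P-Mo over {months} Mo "
          f"(avg staff ~{avg_staff:.0f}, effort {growth:+.0%} vs. baseline)")
```

In the actual case the schedule stretched 60% while average staffing stayed near the baseline level, so total effort grew about 54%.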
23. Summary
- Measurement is important
- Key to performance improvement
- ID problems in advance of crisis
- Metrics must be tailored
- Some metrics universal
- Many program-idiosyncratic
- Evolve with program