Title: The COCOMO II Suite of Software Cost Estimation Models
1. The COCOMO II Suite of Software Cost Estimation Models
- Barry Boehm, USC
- TRW Presentation
- March 19, 2001
boehm@sunset.usc.edu
http://sunset.usc.edu/research/cocomosuite
2. USC-CSE Affiliates (33)
- Commercial Industry (18)
  - Automobile Club of Southern California, C-Bridge, DaimlerChrysler, EDS, Fidelity Group, Galorath, GroupSystems.com, Hughes, IBM, Lucent, Marotz, Microsoft, Motorola, Price Systems, Rational, Sun, Telcordia, Xerox
- Aerospace Industry (9)
  - Boeing, Draper Labs, GDE Systems, Litton, Lockheed Martin, Northrop Grumman, Raytheon, SAIC, TRW
- Government (3)
  - FAA, US Army Research Labs, US Army TACOM
- FFRDCs and Consortia (4)
  - Aerospace, JPL, SEI, SPC
- International (1)
  - Chung-Ang U. (Korea)
3. USC-CSE Affiliates Calendar
- Easy WinWin Web Seminar - June 22, 2000
- Easy WinWin Hands-on Tutorial - July 25-26, 2000
- Tutorial: Transitioning to the CMMI via MBASE - July 27, 2000
- Software Engineering Internship Workshop - August 24-25, 2000
- Workshop: Spiral Development in the DoD (Washington DC, with SEI) - September 13-15, 2000
- COCOMO/Software Cost Modeling Forum and Workshop - October 24-27, 2000
- Annual Research Review, COTS-Based Systems Workshop (with SEI, CeBASE) - February 6-9, 2001
- Ground Systems Architecture Workshop (with Aerospace, SEI) - February 21-23, 2001
- LA SPIN, Ron Kohl, COTS-Based Systems Processes - February 21, 2001
- LA SPIN, High Dependability Computing - March 28, 2001
- Annual Affiliates Renewal - May 2001
- Rapid Value/RUP/MBASE Seminar (with C-Bridge, Rational) - June 14, 2001
4. Outline
- COCOMO II Overview
- Overview of Emerging Extensions
  - COTS Integration (COCOTS)
  - Quality: Delivered Defect Density (COQUALMO)
  - Phase Distributions (COPSEMO)
  - Rapid Application Development Schedule (CORADMO)
  - Productivity Improvement (COPROMO)
  - System Engineering (COSYSMO)
  - Tool Effects
  - Code Count™
- Related USC-CSE Research
  - MBASE, CeBASE and CMMI
- Backup charts
5. COCOMO II Book Table of Contents
Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, and Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000
- 1. Introduction
- 2. Model Definition
- 3. Application Examples
- 4. Calibration
- 5. Emerging Extensions
- 6. Future Trends
- Appendices: Assumptions, Data Forms, User's Manual, CD Content
- CD: video tutorials, USC COCOMO II.2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms
6. Purpose of COCOMO II
- To help people reason about the cost and schedule implications of their software decisions
7. Major Decision Situations Helped by COCOMO II
- Software investment decisions
- When to develop, reuse, or purchase
- What legacy software to modify or phase out
- Setting project budgets and schedules
- Negotiating cost/schedule/performance tradeoffs
- Making software risk management decisions
- Making software improvement decisions
- Reuse, tools, process maturity, outsourcing
8. Need to Reengineer COCOMO 81
- New software processes
- New sizing phenomena
- New reuse phenomena
- Need to make decisions based on incomplete
information
9. COCOMO II Model Stages
10. Relations to MBASE/Rational Anchor Point Milestones
[Figure: COCOMO II model stages (Application Composition; Early Design; Post-Architecture) mapped to the MBASE/Rational phases (Inception, Elaboration, Construction, Transition) and their anchor point milestones (LCO, LCA, IOC), and to the waterfall milestones (SRR, PDR, SAT) spanning Requirements, Product Design, and System Development]
- MBASE = Model-Based (System) Architecting and Software Engineering
11. Early Design and Post-Architecture Model

Effort = (Environment multipliers) x (Size)^(Process scale factors)
Schedule = (Schedule multiplier) x f(Effort)

- Environment: Product, Platform, People, Project factors
- Size: nonlinear reuse and volatility effects
- Process: Constraint, Risk/Architecture, Team, Maturity factors
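In code form this multiplicative structure is a one-liner. A minimal sketch follows: the constant A = 2.94 is the COCOMO II.2000 calibration value, but the scale-factor and multiplier values in the example are illustrative placeholders, not the calibrated rating tables.

```python
from math import prod

# Sketch of the COCOMO II effort form: Effort = A * Size^E * (product of
# environment multipliers), with E = 0.91 + 0.01 * sum(scale factors).
A = 2.94  # COCOMO II.2000 productivity constant

def effort_pm(ksloc, scale_factors, effort_multipliers):
    e = 0.91 + 0.01 * sum(scale_factors)     # process scale-factor exponent
    return A * ksloc ** e * prod(effort_multipliers)

# Example: 100 KSLOC, five mid-range scale factors, two off-nominal drivers
print(round(effort_pm(100, [3.7, 3.0, 4.2, 2.2, 3.1], [1.10, 0.87])))  # ~392 PM
```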
12. Nonlinear Reuse Effects
[Figure: relative cost vs. amount of software modified, with data on 2954 NASA modules (Selby, 1988). Observed relative costs (about 0.55 at 25% modified, 0.70 at 50% modified) lie well above the usual linear assumption; even reuse with no modification carries a relative cost of about 0.046]
13. COCOMO II.2000 Productivity Ranges
[Bar chart: productivity range (1.0 to 2.4) for each driver; scale factor ranges computed at 10, 100, and 1000 KSLOC. In increasing order of impact: Development Flexibility (FLEX), Team Cohesion (TEAM), Develop for Reuse (RUSE), Precedentedness (PREC), Architecture and Risk Resolution (RESL), Platform Experience (PEXP), Data Base Size (DATA), Required Development Schedule (SCED), Language and Tools Experience (LTEX), Process Maturity (PMAT), Storage Constraint (STOR), Use of Software Tools (TOOL), Platform Volatility (PVOL), Applications Experience (AEXP), Multi-Site Development (SITE), Documentation Match to Life Cycle Needs (DOCU), Required Software Reliability (RELY), Personnel Continuity (PCON), Time Constraint (TIME), Programmer Capability (PCAP), Analyst Capability (ACAP), Product Complexity (CPLX)]
14. COCOMO Model Comparisons

| Model element | COCOMO | Ada COCOMO | COCOMO II Application Composition | COCOMO II Early Design | COCOMO II Post-Architecture |
|---|---|---|---|---|---|
| Size | Delivered Source Instructions (DSI) or Source Lines of Code (SLOC) | DSI or SLOC | Application Points | Function Points (FP) and Language, or SLOC | FP and Language, or SLOC |
| Reuse | Equivalent SLOC = Linear f(DM, CM, IM) | Equivalent SLOC = Linear f(DM, CM, IM) | Implicit in model | Equivalent SLOC = nonlinear f(AA, SU, UNFM, DM, CM, IM) | Equivalent SLOC = nonlinear f(AA, SU, UNFM, DM, CM, IM) |
| Rqts. change | Requirements Volatility rating (RVOL) | RVOL rating | Implicit in model | Change = RQEV | Change = RQEV |
| Maintenance | Annual Change Traffic (ACT) = % added + % modified | ACT | Object Point ACT | (ACT, SU, UNFM) | (ACT, SU, UNFM) |
| Scale (b) in MM_NOM = a(Size)^b | Organic: 1.05; Semidetached: 1.12; Embedded: 1.20 | Embedded: 1.04-1.24, depending on degree of early risk elimination, solid architecture, stable requirements, Ada process maturity | 1.0 | 0.91-1.23, depending on degree of precedentedness, conformity, early architecture and risk resolution, team cohesion, process maturity (SEI) | 0.91-1.23, depending on degree of precedentedness, conformity, early architecture and risk resolution, team cohesion, process maturity (SEI) |

The models also differ in their effort multipliers and rating scales.
15. COCOMO II Estimation Accuracy
- Percentage of sample projects within 30% of actuals, without and with calibration to data source
16. COCOMO II Experience Factory I
[Flowchart: system objectives (functionality, performance, quality) and corporate parameters (tools, processes, reuse) feed COCOMO 2.0 to produce cost, schedule, and risk estimates; if the results are not OK, rescope]
17. COCOMO II Experience Factory II
[Flowchart: as in I, plus an execution loop: once cost, schedule, and risks are OK, execute the project to the next milestone against the milestone plans and resources; compare milestone results with milestone expectations; if not OK, revise milestones, plans, resources and expectations; repeat until done]
18. COCOMO II Experience Factory III
[Flowchart: as in II, plus accumulation of COCOMO 2.0 calibration data from milestone results and periodic recalibration of COCOMO 2.0]
19. COCOMO II Experience Factory IV
[Flowchart: as in III, plus evaluation of corporate software improvement strategies against the cost, schedule, and quality drivers, yielding improved corporate parameters]
20. COCOMO II Book Table of Contents
Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, and Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000
- 1. Introduction
- 2. Model Definition
- 3. Application Examples
- 4. Calibration
- 5. Emerging Extensions
- 6. Future Trends
- Appendices: Assumptions, Data Forms, User's Manual, CD Content
- CD: video tutorials, USC COCOMO II.2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms
21. Outline
- COCOMO II Overview
- Overview of Emerging Extensions
  - COTS Integration (COCOTS)
  - Quality: Delivered Defect Density (COQUALMO)
  - Phase Distributions (COPSEMO)
  - Rapid Application Development Schedule (CORADMO)
  - Productivity Improvement (COPROMO)
  - System Engineering (COSYSMO)
  - Tool Effects
  - Code Count™
- Related USC-CSE Research
  - MBASE, CeBASE and CMMI
- Backup charts
22. USC-CSE Modeling Methodology
- concurrency and feedback implied
23. Results of Bayesian Update Using Prior and Sampling Information (Step 6)
[Figure: prior, sampling, and posterior results for the Language and Tool Experience (LTEX) multiplier]
24. Status of Models
[Table: status of each model across the methodology steps (behavior analysis, literature review, Delphi assessment of significant variables, data collection and Bayesian calibration) for COCOMO II, COCOTS, COQUALMO (defects in / defects out), COPSEMO, CORADMO, and COSYSMO; calibration data points: COCOMO II 161, COCOTS 20, COQUALMO 2 and 2, CORADMO 10]
25. COCOMO vs. COCOTS Cost Sources
[Figure: staffing vs. time profiles contrasting COCOMO and COCOTS cost sources]
26. COCOTS Effort Distribution - 20 Projects
[Bar chart: mean of total COTS effort by activity (+/- 1 SD), in person-months, for assessment, tailoring, glue code, and system volatility; plotted values span roughly -7.6 to 61.3 person-months]
27. Integrated COQUALMO
[Diagram: the software size estimate and the software product, process, computer, and personnel attributes feed COCOMO II (software development effort, cost, and schedule estimate) and COQUALMO's Defect Introduction Model; defect removal capability levels feed the Defect Removal Model; outputs are the number of residual defects and the defect density per unit of size]
28. COQUALMO Defect Removal Estimates - Nominal Defect Introduction Rates
[Chart: delivered defects per KSLOC vs. composite defect removal rating]
29. COCOMO II RAD Extension (CORADMO)
[Diagram: the COCOMO II cost drivers (except SCED) feed COCOMO II, producing a baseline effort and schedule; the Phase Distributions model (COPSEMO) turns these into effort and schedule by stage; the RAD Extension then applies the RAD drivers (RVHL, DPRS, CLAB, RESL, PPOS, RCAP) plus language level and experience to produce RAD effort and schedule by phase]
30. Effect of RCAP on Cost, Schedule
31. COPROMO (Productivity) Model
- Uses COCOMO II model and extensions as assessment framework
  - Well-calibrated to 161 projects for effort, schedule
  - Subset of 106 1990s projects for current-practice baseline
  - Extensions for Rapid Application Development formulated
- Determines impact of technology investments on model parameter settings
- Uses these in models to assess impact of technology investments on cost and schedule
  - Effort used as a proxy for cost
32. Strawman COSYSMO
- Sizing model determines nominal COCOMO II SysE effort and schedule
  - Function points/use cases/other for basic effort
  - Tool and document preparation separate (?) source of effort
  - Factor in volatility and reuse
  - Begin with linear effort scaling with size (?)
- Cost & schedule drivers multiplicatively adjust nominal effort and schedule by phase, source of effort (?)
  - Application factors
  - Team factors
33. COCOMO II.1998 Productivity Ranges and Current Practice
[Bar chart: productivity ranges, with the average multiplier for 1990s projects marked on each driver]
34. COSYSMO Factor Importance Rating
Rate each factor H, M, or L depending on whether its influence on system engineering effort is relatively high, medium, or low. Use an equal number of Hs, Ms, and Ls.
(N = 6; mean ratings: 3.0, 2.5, 2.3, 1.5, 1.7, 1.7, 1.5, 1.5, 2.7, 2.7, 3.0, 2.0, 1.5, 2.0, 1.3)

Application Factors
- __H__ Requirements understanding
- _M-H_ Architecture understanding
- _L-H_ Level of service rqts. criticality, difficulty
- _L-M_ Legacy transition complexity
- _L-M_ COTS assessment complexity
- _L-H_ Platform difficulty
- _L-M_ Required business process reengineering
- _TBD_ Ops. concept understanding (NH)
- ______ TBD

Team Factors
- _L-M_ Number and diversity of stakeholder communities
- _M-H_ Stakeholder team cohesion
- _M-H_ Personnel capability/continuity
- __H__ Personnel experience
- _L-H_ Process maturity
- _L-M_ Multisite coordination
- _L-H_ Degree of system engineering ceremony
- _L-M_ Tool support
- ______ TBD
- ______ TBD
35. New Tool Rating Scale
- Basis of Tool Rating Scale
  - Breadth of process support
    - Specification, Analysis, Design, Programming, Test, CM, QA, Management, etc.
  - CMM tool maturity and support
  - Degree of tool integration
36. Code Count™
- Suite of 9 counting tools
  - COPYLEFTed; full source code
  - Languages: Ada, ASM 1750, C/C++, COBOL, FORTRAN, Java, JOVIAL, Pascal, PL/1
- Counts: SLOC, DSI
- QA data (tallies): statements by type, comments by type
37. Outline
- COCOMO II Overview
- Overview of Emerging Extensions
  - COTS Integration (COCOTS)
  - Quality: Delivered Defect Density (COQUALMO)
  - Phase Distributions (COPSEMO)
  - Rapid Application Development Schedule (CORADMO)
  - Productivity Improvement (COPROMO)
  - System Engineering (COSYSMO)
  - Tool Effects
  - Code Count™
- Related USC-CSE Research
  - MBASE, CeBASE and CMMI
- Backup charts
38. MBASE, CeBASE, and CMMI
- Model-Based (System) Architecting and Software Engineering (MBASE)
  - Extension of WinWin Spiral Model
  - Avoids process/product/property/success model clashes
  - Provides project-oriented guidelines
- Center for Empirically-Based Software Engineering (CeBASE)
  - Led by USC, UMaryland; sponsored by NSF, others
  - Empirical software data collection and analysis
- Integrates MBASE, Experience Factory, GMQM into CeBASE Method
  - GMQM = Goal-Model-Question-Metric method
  - Integrated organization/portfolio/project guidelines
- CeBASE Method implements Capability Maturity Model Integration (CMMI) and more
  - Parts of People CMM, but light on Acquisition CMM
39. Spiral Model Refinements
- Where do objectives, constraints, alternatives come from?
  - WinWin extensions
- Lack of intermediate milestones
  - Anchor Points: LCO, LCA, IOC
  - Concurrent-engineering spirals between anchor points
- Need to avoid model clashes, provide more specific guidance
  - MBASE

[Diagram: The WinWin Spiral Model. WinWin extensions: 1. Identify next-level stakeholders; 2. Identify stakeholders' win conditions; 3. Reconcile win conditions; establish next-level objectives, constraints, alternatives. Original spiral: 4. Evaluate product and process alternatives; resolve risks; 5. Define next level of product and process, including partitions; 6. Validate product and process definitions; 7. Review, commitment]
40. Life Cycle Anchor Points
- Common system/software stakeholder commitment points
  - Defined in concert with Government, industry affiliates
  - Coordinated with Rational's Unified Software Development Process
- Life Cycle Objectives (LCO)
  - Stakeholders' commitment to support system architecting
  - Like getting engaged
- Life Cycle Architecture (LCA)
  - Stakeholders' commitment to support full life cycle
  - Like getting married
- Initial Operational Capability (IOC)
  - Stakeholders' commitment to support operations
  - Like having your first child
41. WinWin Spiral Anchor Points
(Risk-driven level of detail for each element)

Definition of Operational Concept
- LCO: Top-level system objectives and scope (system boundary; environment parameters and assumptions; evolution parameters); operational concept (operations and maintenance scenarios and parameters; organizational life-cycle responsibilities of stakeholders)
- LCA: Elaboration of system objectives and scope by increment; elaboration of operational concept by increment

System Prototype(s)
- LCO: Exercise key usage scenarios; resolve critical risks
- LCA: Exercise range of usage scenarios; resolve major outstanding risks

Definition of System Requirements
- LCO: Top-level functions, interfaces, quality attribute levels, including growth vectors and priorities; prototypes; stakeholders' concurrence on essentials
- LCA: Elaboration of functions, interfaces, quality attributes, and prototypes by increment; identification of TBDs (to-be-determined items); stakeholders' concurrence on their priority concerns

Definition of System and Software Architecture
- LCO: Top-level definition of at least one feasible architecture (physical and logical elements and relationships; choices of COTS and reusable software elements); identification of infeasible architecture options
- LCA: Choice of architecture and elaboration by increment (physical and logical components, connectors, configurations, constraints; COTS, reuse choices; domain-architecture and architectural style choices); architecture evolution parameters

Definition of Life-Cycle Plan
- LCO: Identification of life-cycle stakeholders (users, customers, developers, maintainers, interoperators, general public, others); identification of life-cycle process model (top-level stages, increments); top-level WWWWWHH by stage
- LCA: Elaboration of WWWWWHH for Initial Operational Capability (IOC); partial elaboration, identification of key TBDs for later increments

Feasibility Rationale
- LCO: Assurance of consistency among elements above, via analysis, measurement, prototyping, simulation, etc.; business case analysis for requirements, feasible architectures
- LCA: Assurance of consistency among elements above; all major risks resolved or covered by risk management plan

WWWWWHH = Why, What, When, Who, Where, How, How Much
42. Clashes Among MBASE Models
[Matrix: pairwise clashes among Product, Process, Property, and Success models, e.g. a structure clash between product models]
43. MBASE Electronic Process Guide (1)
44. MBASE Electronic Process Guide (2)
45. Center for Empirically-Based Software Engineering (CeBASE) Strategic Vision
- Strategic Framework
  - Strategic: Process Experience Factory, tailoring G/L, Goal-Model-Question-Metric
  - Tactical: process model integration (MBASE), WinWin Spiral
- Empirical Methods
  - Quantitative, qualitative; experimental, ethnographic
  - Observational analysis; surveys, assessments; parametric models; critical success factors; dynamic models; root cause analysis; Pareto 80-20 relationships
- Experience Base (Context + Results)
  - Project, context attributes
  - Empirical results references
  - Implications and recommended practices
  - Experience feedback comments
- Initial foci: COTS-based systems, defect reduction
46. Integrated GMQM-MBASE Experience Factory
- Applies to organizations and projects: people, processes, and products
47. CeBASE Method Coverage of CMMI - I
- Process Management
  - Organizational Process Focus: 100%
  - Organizational Process Definition: 100%
  - Organizational Training: 100%-
  - Organizational Process Performance: 100%-
  - Organizational Innovation and Deployment: 100%
- Project Management
  - Project Planning: 100%
  - Project Monitoring and Control: 100%
  - Supplier Agreement Management: 50%-
  - Integrated Project Management: 100%-
  - Risk Management: 100%
  - Integrated Teaming: 100%
  - Quantitative Project Management: 70%-
48. CeBASE Method Coverage of CMMI - II
- Engineering
  - Requirements Management: 100%
  - Requirements Development: 100%
  - Technical Solution: 60%
  - Product Integration: 70%-
  - Verification: 70%-
  - Validation: 80%
- Support
  - Configuration Management: 70%-
  - Process and Product Quality Assurance: 70%-
  - Measurement and Analysis: 100%-
  - Decision Analysis and Resolution: 100%-
  - Organizational Environment for Integration: 80%-
  - Causal Analysis and Resolution: 100%
49. Outline
- COCOMO II Overview
- Overview of Emerging Extensions
  - COTS Integration (COCOTS)
  - Quality: Delivered Defect Density (COQUALMO)
  - Phase Distributions (COPSEMO)
  - Rapid Application Development Schedule (CORADMO)
  - Productivity Improvement (COPROMO)
  - System Engineering (COSYSMO)
  - Tool Effects
  - Code Count™
- Related USC-CSE Research
- Backup charts
50. Backup Charts
- COCOMO II
- COCOTS
- COQUALMO
- CORADMO
- COSYSMO
51. The future of the software practices marketplace
- User programming (55M performers in US in year 2005)
Application generators (0.6M)
System integration (0.7M)
Application composition (0.7M)
Infrastructure (0.75M)
52. COCOMO II Coverage of Future SW Practices Sectors
- User Programming: no need for cost model
- Applications Composition: use application points
  - Count (weight) screens, reports, 3GL routines
- System Integration; development of applications generators and infrastructure software:
  - Prototyping: Applications Composition model
  - Early Design: Function Points and/or Source Statements, and 7 cost drivers
  - Post-Architecture: Source Statements and/or Function Points, and 17 cost drivers
  - Stronger reuse/reengineering model
53. Baseline Application Point Estimation Procedure

Step 1: Assess element counts: estimate the number of screens, reports, and 3GL components that will comprise this application. Assume the standard definitions of these elements in your ICASE environment.

Step 2: Classify each element instance into simple, medium, and difficult complexity levels, depending on values of its characteristic dimensions.

Step 3: Weight the number in each cell using the following scheme. The weights reflect the relative effort required to implement an instance of that complexity level.

| Element Type | Simple | Medium | Difficult |
|---|---|---|---|
| Screen | 1 | 2 | 3 |
| Report | 2 | 5 | 8 |
| 3GL Component | - | - | 10 |

Step 4: Determine Application Points: add all the weighted element instances to get one number, the Application Point count.

Step 5: Estimate the percentage of reuse you expect to be achieved in this project. Compute the New Application Points to be developed: NAP = (Application Points) x (100 - %reuse) / 100.

Step 6: Determine a productivity rate, PROD = NAP/person-month, based on developer experience and capability and ICASE maturity and capability.

Step 7: Compute the estimated person-months: PM = NAP / PROD.
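Steps 1-7 reduce to a few lines of code. The sketch below assumes the weight table above and the productivity-rate values published in the COCOMO II book (4 to 50 NAP per person-month as developer experience and ICASE maturity rise); both PROD values and the example counts are assumptions for illustration.

```python
# Sketch of the Application Point procedure. Complexity weights follow the
# table above; PROD values are the ones given in the COCOMO II book.
WEIGHTS = {
    "screen": {"simple": 1, "medium": 2, "difficult": 3},
    "report": {"simple": 2, "medium": 5, "difficult": 8},
    "3gl": {"difficult": 10},   # 3GL components are always rated difficult
}
PROD = {"very low": 4, "low": 7, "nominal": 13, "high": 25, "very high": 50}

def application_point_pm(counts, reuse_pct, prod_rating):
    """counts: {(element_type, complexity): number of instances}."""
    ap = sum(WEIGHTS[etype][cplx] * n for (etype, cplx), n in counts.items())
    nap = ap * (100 - reuse_pct) / 100        # Step 5: new application points
    return nap / PROD[prod_rating]            # Step 7: person-months

# Example: 10 simple screens, 4 medium reports, 2 3GL components, 20% reuse
counts = {("screen", "simple"): 10, ("report", "medium"): 4,
          ("3gl", "difficult"): 2}
print(application_point_pm(counts, 20, "nominal"))   # ~3.1 PM
```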
54. New Scaling Exponent Approach
- Nominal person-months = A (Size)^B
- B = 0.91 + 0.01 x Σ (exponent driver ratings)
  - B ranges from 0.91 to 1.23
  - 5 drivers; 6 rating levels each
- Exponent drivers:
  - Precedentedness
  - Development flexibility
  - Architecture / risk resolution
  - Team cohesion
  - Process maturity (derived from SEI CMM)
55. Project Scale Factors

PM_estimated = A x (Size)^(0.91 + 0.01 Σ_i w_i SF_i)
TDEV_estimated = 3.67 x (PM_estimated)^(0.28 + 0.2 x 0.01 Σ_i w_i SF_i)

- PMAT = weighted sum of 18 KPA achievement levels
56. Reuse and Reengineering Effects
- Add Assessment & Assimilation increment (AA)
  - Similar to conversion planning increment
- Add Software Understanding increment (SU)
  - To cover nonlinear software understanding effects
  - Coupled with software unfamiliarity level (UNFM)
  - Apply only if reused software is modified
- Results in revised Equivalent Source Lines of Code (ESLOC)
  - AAF = 0.4(DM) + 0.3(CM) + 0.3(IM)
  - ESLOC = ASLOC [AA + AAF (1 + 0.02(SU)(UNFM))], for AAF <= 0.5
  - ESLOC = ASLOC [AA + AAF + (SU)(UNFM)], for AAF > 0.5
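A runnable reading of these formulas, using the percentage form of the COCOMO II book (DM, CM, IM, AA, SU expressed in percent, so the bracketed adjustment factor is divided by 100); the example parameter values are illustrative assumptions.

```python
# Sketch of the COCOMO II reuse model (Equivalent SLOC).

def equivalent_sloc(asloc, dm, cm, im, aa, su, unfm):
    """asloc: adapted SLOC; dm/cm/im: % design/code/integration modified;
    aa: assessment & assimilation increment (%); su: software understanding
    increment (%); unfm: unfamiliarity level (0.0-1.0)."""
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im    # adaptation adjustment factor, %
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return asloc * aam

# Example: 20,000 adapted SLOC; 10% design, 20% code, 30% integration
# modified; AA = 4, SU = 30, UNFM = 0.4
print(equivalent_sloc(20000, 10, 20, 30, 4, 30, 0.4))   # ~5500 ESLOC
```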
57. Software Understanding Rating / Increment
58. Other Major COCOMO II Changes
- Range versus point estimates
- Requirements Volatility (Evolution) included in Size
- Multiplicative cost driver changes
  - Product CDs
  - Platform CDs
  - Personnel CDs
  - Project CDs
- Maintenance model includes SU, UNFM factors from reuse model
  - Applied to subset of legacy code undergoing change
59. Process Maturity (PMAT) Effects
- Effort reduction per maturity level, 100-KDSI project
  - Normalized for effects of other variables
- Clark Ph.D. dissertation (112 projects)
  - Research model: 12-23% per level
  - COCOMO II subsets: 9-29% per level
- COCOMO II.1999 (161 projects)
  - 4-11% per level
- PMAT positive contribution is statistically significant
60. Other Model Refinements
- Initial schedule estimation:

  TDEV = [3.67 x (PM)^(0.28 + 0.2 x (B - 0.91))] x (SCED% / 100)

  where PM = estimated person-months excluding Schedule (SCED) multiplier effects

- Range versus point estimates (E = the point estimate):

  | Stage | Optimistic Estimate | Pessimistic Estimate |
  |---|---|---|
  | Application Composition | 0.50 E | 2.0 E |
  | Early Design | 0.67 E | 1.5 E |
  | Post-Architecture | 0.80 E | 1.25 E |

- 80% confidence limits: 10% of the time each below Optimistic, above Pessimistic
- Reflect sources of uncertainty in model inputs
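A sketch of this schedule estimate plus the stage-dependent range; the stage factors are taken from the table above, and the example inputs are illustrative (pm must exclude SCED effects).

```python
# Sketch of the initial schedule estimate and its 80% confidence band.
RANGE = {  # stage: (optimistic, pessimistic) factors applied to estimate E
    "application composition": (0.50, 2.0),
    "early design": (0.67, 1.5),
    "post-architecture": (0.80, 1.25),
}

def tdev_months(pm, b, sced_pct=100):
    """pm: person-months excluding SCED effects; b: scale exponent."""
    return 3.67 * pm ** (0.28 + 0.2 * (b - 0.91)) * sced_pct / 100

def tdev_range(pm, b, stage):
    e = tdev_months(pm, b)
    lo, hi = RANGE[stage]
    return lo * e, e, hi * e   # optimistic, point, pessimistic (months)

print(tdev_range(392, 1.07, "post-architecture"))   # ~(19, 24, 30) months
```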
61. Early Design vs. Post-Arch EMs
62. COCOTS Backup Charts
- Development and Life Cycle Models
- Research Highlights Since ARR 2000
- Data Highlights
- New Glue Code Submodel Results
- Next Steps
- Benefits
63. COCOTS Development Model
64. COCOTS Draft Life-Cycle Model
[Diagram: development (estimated by COCOMO II) through LCO and IOC, a transition period, then operations with repeating COTS refresh cycles marked by TR milestones (estimated by the COCOMO maintenance model plus volatility), ending with retirement of the system; open questions (?) include cycle start/end points and transition effort]
65. Current Insights into Maintenance Phase Issues
- Priority of activities by effort involved and/or criticality
- Higher:
  - training (S, C)
  - configuration management (C)
  - operations support (C)
  - integration analysis (S)
  - requirements management (S, C)
- Medium:
  - certification (S)
  - market watch (C)
  - distribution (S)
  - vendor management (C)
  - business case evaluation (S)
- Lower:
  - administering COTS licenses (C)
- S = spikes around refresh cycle anchor points; C = continuous
66. Data Highlights
67. Data Highlights
68. New Glue Code Submodel Results
- Current calibration looking reasonably good
- Excluding projects with very large or very small amounts of glue code, effort Pred(.30):
  - 0.5-100 KLOC: Pred(.30) = 9/17 = 53%
  - 2-100 KLOC: Pred(.30) = 8/13 = 62%
- For comparison, calibration results shown at ARR 2000:
  - 0.1-390 KLOC: Pred(.30) = 4/13 = 31%
- Propose to revisit large, small, anomalous projects
- A few follow-up questions on categories of code effort
  - Glue code vs. application code
  - Glue code effort vs. other sources
69. Benefits
- Existing
- Independent source of estimates
- Checklist for effort sources
- (Fairly) easy-to-use development phase tool
- On the Horizon
  - Empirically supported, tightly calibrated, total-lifecycle COTS estimation tool
70. COQUALMO Backup Charts
- Current COQUALMO system
- Defect removal rating scales
- Defect removal estimates
- Multiplicative defect removal model
- Orthogonal Defect Classification (ODC) extensions
71. Current COQUALMO System
[Diagram: the software size estimate and the software platform, project, product, and personnel attributes feed COCOMO II (software development effort, cost, and schedule estimate) and COQUALMO's Defect Introduction Model; defect removal profile levels (automation, reviews, testing) feed the Defect Removal Model; outputs are the number of residual defects and the defect density per unit of size]
72. Defect Removal Rating Scales (COCOMO II, p. 263)
73. Defect Removal Estimates - Nominal Defect Introduction Rates
[Chart: delivered defects per KSLOC vs. composite defect removal rating]
74. Multiplicative Defect Removal Model - Example: Code Defects, High Ratings
- Analysis: 0.7 of defects remaining
- Reviews: 0.4 of defects remaining
- Testing: 0.31 of defects remaining
- Together: (0.7)(0.4)(0.31) = 0.09 of defects remaining
- How valid is this?
  - All catch same defects: 0.31 of defects remaining
  - Mostly catch different defects: 0.01 of defects remaining
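The composition logic in code form, a small sketch using this slide's example fractions; the 20 defects/KSLOC introduction rate in the usage line is only an illustrative placeholder, not a COQUALMO calibration value.

```python
from math import prod

# Sketch of COQUALMO's multiplicative defect-removal assumption: each
# removal activity independently leaves a fraction of defects behind.
remaining = {"automated analysis": 0.7, "peer reviews": 0.4, "testing": 0.31}

residual_fraction = prod(remaining.values())   # independence assumption
print(f"{residual_fraction:.2f} of introduced defects remain")  # ~0.09

# e.g., an assumed 20 introduced defects/KSLOC -> delivered defect density
print(round(20 * residual_fraction, 2), "delivered defects/KSLOC")
```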
75. Example UMD-USC CeBASE Data Comparisons
- Under specified conditions:
  - Peer reviews are more effective than functional testing for faults of omission and incorrect specification (UMD, USC)
  - Functional testing is more effective than reviews for faults concerning numerical approximations and control flow (UMD, USC)
  - Both are about equally effective for results concerning typos, algorithms, and incorrect logic (UMD, USC)
76. ODC Data Attractive for Extending COQUALMO - IBM Results (Chillarege, 1996)
77. COQUALMO/ODC Extension Research Approach
- Extend COQUALMO to cover major ODC categories
- Collaborate with industry ODC users
- IBM, Motorola underway
- Two more sources being explored
- Obtain first-hand experience on USC digital library projects
- Initial front-end data collection and analysis
78. CORADMO Backup Charts
- Rapid Application Development (RAD) context
- RAD Opportunity Tree and CORADMO schedule drivers
- RAD Capability (RCAP) schedule driver
- Square-root effort-schedule model and RCAP
adjustment
79. RAD Context
- RAD: a critical competitive strategy
  - Market window, pace of change
- Non-RAD COCOMO II overestimates RAD schedules
  - Need opportunity-tree cost-schedule adjustment
- Cube-root model inappropriate for small RAD projects
  - COCOMO II: Months = 3.7 x (PM)^(1/3)
80. RAD Opportunity Tree
- Eliminating Tasks
  - Development process reengineering - DPRS
  - Reusing assets - RVHL
  - Applications generation - RVHL
  - Design-to-schedule - O
- Reducing Time Per Task
  - Tools and automation - O
  - Work streamlining (80-20) - O
  - Increasing parallelism - RESL
- Reducing Risks of Single-Point Failures
  - Reducing failures - RESL
  - Reducing their effects - RESL
- Reducing Backtracking
  - Early error elimination - RESL
  - Process anchor points - RESL
  - Improving process maturity - O
  - Collaboration technology - CLAB
- Activity Network Streamlining
  - Minimizing task dependencies - DPRS
  - Avoiding high fan-in, fan-out - DPRS
  - Reducing task variance - DPRS
  - Removing tasks from critical path - DPRS
- Increasing Effective Workweek
  - 24x7 development - PPOS
  - Nightly builds, testing - PPOS
  - Weekend warriors - PPOS
- Better People and Incentives
  - RAD capability and experience - RCAP
- Transition to Learning Organization - O
- O = covered by existing COCOMO II drivers
81. RCAP: RAD Capability of Personnel
- PERS-R is the Early Design Capability rating, adjusted to reflect the performers' capability to rapidly assimilate new concepts and material, and to rapidly adapt to change.
- PREX-R is the Early Design Personnel Experience rating, adjusted to reflect the performers' experience with RAD languages, tools, components, and COTS integration.
82. RCAP Example
- RCAP Nominal: PM = 25, M = 5, P = 5
  - The square-root law: 5 people for 5 months = 25 PM
- RCAP XH: PM = 20, M = 2.8, P = 7.1
  - A very good team can put on 7 people and finish in 2.8 months (20 PM)
- RCAP XL: PM = 30, M = 7, P = 4.3
  - Trying to do RAD with an unqualified team makes them less efficient (30 PM) and gets the schedule closer to the cube-root law (but not quite: 9.3 months > 7 months)
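A sketch of the square-root law behind these numbers; the RCAP effort and schedule factors below are reverse-engineered from this slide's example values (illustrative assumptions, not the calibrated CORADMO driver tables).

```python
import math

# Square-root RAD schedule law from the slide: months M = sqrt(PM),
# average staff P = PM / M, with RCAP adjusting effort and schedule.
RCAP_EFFORT = {"XL": 30 / 25, "NOM": 1.0, "XH": 20 / 25}
RCAP_SCHED = {"XL": 7 / 5, "NOM": 1.0, "XH": 2.8 / 5}

def rad_schedule(nominal_pm, rcap):
    pm = nominal_pm * RCAP_EFFORT[rcap]                 # adjusted effort
    months = math.sqrt(nominal_pm) * RCAP_SCHED[rcap]   # adjusted schedule
    return pm, months, pm / months                      # effort, months, staff

for rating in ("XL", "NOM", "XH"):
    pm, m, p = rad_schedule(25, rating)
    print(f"RCAP {rating}: PM={pm:.0f}, M={m:.1f}, P={p:.1f}")
```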
83. Effect of RCAP on Cost, Schedule
84. COSYSMO Backup Charts
- Background
- Scope
- Strawman Model
  - Size & complexity
  - Cost & schedule drivers
  - Outputs
- Issues
85. Background
- Topic of breakout group at October 2000 COCOMO/SCM Forum
- Decided on incremental approach
  - Increment I: front-end costs of information systems engineering
- Coordinating with development of INCOSE-FAA systems engineering maturity data repository
- Also coordinating with Rational sizing metrics effort
86. COSYSMO Increment I Scope
- Expand COCOMO II to information system engineering front-end costs
  - Excluding aircraft, printer, etc. system engineering; sensors a gray area
- Excluding Transition effort for now
- All of Inception and Elaboration effort
- Construction: Requirements + Deployment + 50% of Design effort
87. Proposed System Engineering Scope
[Figure: COCOMO II MBASE/RUP phase and activity distribution]
88. Strawman COSYSMO
- Sizing model determines nominal COCOMO II SysE effort and schedule
  - Function points/use cases/other for basic effort
  - Tool and document preparation separate (?) source of effort
  - Factor in volatility and reuse
  - Begin with linear effort scaling with size (?)
- Cost & schedule drivers multiplicatively adjust nominal effort and schedule by phase, source of effort (?)
  - Application factors
  - Team factors
89. USC Strawman Sizing Model
- Function points, adjusted for complexity
- Use cases, adjusted for complexity
  - Flows of events; complexity of interactions
- Other: rqts. threads, features, interfaces
- Rqts. volatility factor similar to COCOMO II
- Reuse factor simpler than COCOMO II (TBD)
- Weighting of FP, use case quantities TBD
- Also use pairwise comparison approach for sizing
  - Compare with known systems
- Use COCOMO II CPLX factors for complexity (?)
  - Control, computational, device-dependent, data management, and UI operations scales
90. Evolving Rational Sizing Model
- Objective: obtain software mass for COCOMO engine
- USC MVC approach
  - Model: number of classes of data
  - View: number of use cases
  - Control: distribution and algorithm complexity
- Size new application by MVC comparison to similar applications
- Overall, very similar to USC strawman sizing approach
- Preparing to collaborate via Philippe Kruchten
91. COSYSMO Factor Importance Rating
Rate each factor H, M, or L depending on whether its influence on system engineering effort is relatively high, medium, or low. Use an equal number of Hs, Ms, and Ls.
(N = 6; mean ratings: 3.0, 2.5, 2.3, 1.5, 1.7, 1.7, 1.5, 1.5, 2.7, 2.7, 3.0, 2.0, 1.5, 2.0, 1.3)

Application Factors
- __H__ Requirements understanding
- _M-H_ Architecture understanding
- _L-H_ Level of service rqts. criticality, difficulty
- _L-M_ Legacy transition complexity
- _L-M_ COTS assessment complexity
- _L-H_ Platform difficulty
- _L-M_ Required business process reengineering
- _TBD_ Ops. concept understanding (NH)
- ______ TBD

Team Factors
- _L-M_ Number and diversity of stakeholder communities
- _M-H_ Stakeholder team cohesion
- _M-H_ Personnel capability/continuity
- __H__ Personnel experience
- _L-H_ Process maturity
- _L-M_ Multisite coordination
- _L-H_ Degree of system engineering ceremony
- _L-M_ Tool support
- ______ TBD
- ______ TBD
92. Strawman Model Outputs
- Effort & schedule by phase
  - By activity?
  - By source of effort (analysis, prototypes, tools, documents)?
- Risk assessment?
93. Issues & Suggestions on Improving
- Scope
- Proposed approach
- Model form
- Model elements
- Outputs
- Over/underlaps with COCOMO II, COCOTS, CORADMO
- Sources of data
- Staffing
94. Further Information
- V. Basili, G. Caldiera, and H. Rombach, "The Experience Factory" and "The Goal Question Metric Approach," in J. Marciniak (ed.), Encyclopedia of Software Engineering, Wiley, 1994.
- B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
- B. Boehm and D. Port, "Escaping the Software Tar Pit: Model Clashes and How to Avoid Them," ACM Software Engineering Notes, January 1999.
- B. Boehm et al., "Using the WinWin Spiral Model: A Case Study," IEEE Computer, July 1998, pp. 33-44.
- R. van Solingen and E. Berghout, The Goal/Question/Metric Method, McGraw-Hill, 1999.
- COCOMO II, MBASE items: http://sunset.usc.edu
- CeBASE items: http://www.cebase.org