COCOMO II Overview - PowerPoint PPT Presentation
1
COCOMO II Overview
LiGuo Huang, Computer Science and
Engineering, Southern Methodist University
1
2
Agenda
  • COCOMO Introduction
  • Basic Estimation Formulas
  • Cost Factors
  • Reuse Model
  • Sizing
  • Software Maintenance Effort
  • COCOMO Tool Demo
  • Data Collection

2
3
COCOMO Background
  • COCOMO - the COnstructive COst MOdel
  • COCOMO II is the update to COCOMO 1981
  • ongoing research with annual calibrations made
    available
  • Originally developed by Dr. Barry Boehm and
    published in 1981 book Software Engineering
    Economics
  • COCOMO II is described in the book Software Cost
    Estimation with COCOMO II
  • COCOMO can be used as a framework for cost
    estimation and related activities

3
4
COCOMO II Model Objectives
  • Provide accurate cost and schedule estimates for
    software projects
  • Enable organizations to easily recalibrate,
    tailor, or extend COCOMO II to better fit their
    unique situations
  • Provide careful, easy-to-understand definitions
    of the model's inputs, outputs, and assumptions
  • Provide a constructive model
  • Provide a normative model
  • Provide an evolving model

4
5
COCOMO II Black Box Model
  • Software product size estimate
  • Software product, process, computer, and
    personnel attributes
  • Software reuse, maintenance, and increment
    parameters
  • Software organization's project data

COCOMO
  • Software development and maintenance
  • Costs (effort)
  • Schedule estimates
  • Distributed by phase, activity, increment

COCOMO locally calibrated to organization's data
6
Software Estimation Accuracy
  • Effect of uncertainties over time
  • (Funnel chart: the relative size range narrows
    from 4x / 0.25x at Feasibility, through 2x / 0.5x,
    to x at Initial Operating Capability, across the
    Plans/Rqts., Design, and Develop and Test phases,
    with milestones Operational Concept, Life Cycle
    Objectives, Life Cycle Architecture, and Initial
    Operating Capability)
6
7
Major COCOMO II Features
  • Multi-model coverage of different development
    sectors
  • Variable-granularity cost model inputs
  • Flexibility in size inputs
  • SLOC
  • function points
  • application points
  • other (use cases ...)
  • Range vs. point estimates per funnel chart

7
8
COCOMO Uses for Software Decision Making
  • Making investment decisions and business-case
    analyses
  • Setting project budgets and schedules
  • Performing tradeoff analyses
  • Cost risk management
  • Development vs. reuse decisions
  • Legacy software phaseout decisions
  • Software reuse and product line decisions
  • Process improvement decisions

8
9
Productivity Ranges
  • COCOMO provides a natural framework to identify
    high-leverage productivity improvement factors
    and estimate their payoffs.

9
10
COCOMO Submodels
  • Applications Composition model involves rapid
    development or prototyping efforts to resolve
    potential high-risk issues such as user
    interfaces, software/system interaction,
    performance, or technology maturity
  • sized with application points (weighted screen
    elements, reports, and 3GL modules)
  • Early Design model explores alternative
    software/system architectures and concepts of
    operation
  • sized with function points
  • a coarse-grained set of 7 cost drivers
  • Post-Architecture model covers the actual
    development and maintenance of a software product
  • source instructions and/or function points for
    sizing, with modifiers for reuse and software
    breakage
  • a set of 17 multiplicative cost drivers and a
    set of 5 factors determining the project's
    scaling exponent

10
11
Agenda
  • COCOMO Introduction
  • Basic Estimation Formulas
  • Cost Factors
  • Reuse Model
  • Sizing
  • Software Maintenance Effort
  • COCOMO Tool Demo
  • Data Collection

11
12
COCOMO Nominal-schedule Effort Formulation
  • PM_NS (person-months) = A × (Size)^E × Π EM_i,
    the product running over i = 1 to the number of
    cost drivers
  • Where
  • A is a constant derived from historical project
    data (currently A = 2.94 in COCOMO II.2000)
  • Size is in KSLOC (thousand source lines of code),
    or converted from function points or object
    points
  • E is an exponent for the diseconomy of scale,
    dependent on five additive scale drivers:
    E = B + 0.01 × Σ SF_j, where B = 0.91 and SF_j is
    the weighting factor for the jth scale driver
  • EM_i is the effort multiplier for the ith cost
    driver. The geometric product results in an
    overall effort adjustment factor to the nominal
    effort.
  • Number of cost drivers = 16 (excludes SCED)
  • Automated translation effects are not included

12
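The nominal-schedule effort formula can be sketched in Python. The constants A = 2.94 and B = 0.91 are the COCOMO II.2000 calibration from the slide; the scale-factor weights and all-Nominal effort multipliers in the example are illustrative values only, and the function names are ours.

```python
import math

A = 2.94  # multiplicative calibration constant (COCOMO II.2000)
B = 0.91  # base scaling exponent (COCOMO II.2000)

def scale_exponent(scale_factor_weights):
    # E = B + 0.01 * (sum of the five scale-factor weights SF_j)
    return B + 0.01 * sum(scale_factor_weights)

def nominal_effort_pm(size_ksloc, scale_factor_weights, effort_multipliers):
    # PM_NS = A * Size^E * product of the 16 effort multipliers (SCED excluded)
    E = scale_exponent(scale_factor_weights)
    return A * size_ksloc ** E * math.prod(effort_multipliers)

# Illustrative: hypothetical scale-factor weights summing to 16.0
# and all-Nominal cost drivers (every EM = 1.00), for 100 KSLOC.
pm_ns = nominal_effort_pm(100, [3.72, 3.04, 4.24, 3.29, 1.71], [1.00] * 16)
```

With these inputs E = 0.91 + 0.16 = 1.07, so the 100-KSLOC project costs noticeably more than 100 times a 1-KSLOC project: the diseconomy of scale.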
13
COCOMO Effort Formulation
  • PM (person-months) = A × (Size)^E × Π EM_i,
    the product running over i = 1 to the number of
    cost drivers
  • Number of cost drivers = 17 (including SCED)

13
14
Diseconomy of Scale
  • Nonlinear relationship when the exponent E > 1

14
15
COCOMO Schedule Formulation
TDEV (months) = C × (PM_NS)^F × (SCED% / 100)
  • Where
  • TDEV is the schedule estimate of calendar time in
    months from the requirements baseline to
    acceptance
  • C is a constant derived from historical project
    data (currently C = 3.67 in COCOMO II.2000)
  • PM_NS is the estimated person-months excluding
    the SCED effort multiplier
  • F = D + 0.2 × (E − B), where D = 0.28 and B = 0.91
  • SCED% is the compression / expansion percentage
    in the SCED cost driver
  • This is the COCOMO II.2000 calibration
  • Formula can vary to reflect process models for
    reusable and COTS software, and the effects of
    application composition capabilities.

15
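The schedule formula can be sketched the same way; C = 3.67, D = 0.28, and B = 0.91 are the COCOMO II.2000 values from the slide, F = D + 0.2 × (E − B), and the sample inputs (406 person-months at E = 1.07) are illustrative.

```python
C = 3.67  # schedule calibration constant (COCOMO II.2000)
D = 0.28
B = 0.91

def schedule_months(pm_ns, E, sced_percent=100):
    # TDEV = C * (PM_NS)^F * (SCED% / 100), with F = D + 0.2 * (E - B)
    F = D + 0.2 * (E - B)
    return C * pm_ns ** F * (sced_percent / 100.0)

# Illustrative: ~406 nominal-schedule person-months at E = 1.07,
# first at the nominal schedule, then at 85% schedule compression.
tdev_nominal = schedule_months(406, 1.07)
tdev_compressed = schedule_months(406, 1.07, sced_percent=85)
```

Note that SCED enters the schedule only as the final (SCED% / 100) factor, since PM_NS already excludes the SCED effort multiplier.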
16
Multiple Module Effort Estimation
  1. Sum the sizes for all components
  2. Apply the project-level drivers, the Scale
    Factors and the SCED to the aggregated size to
    derive the overall basic effort for the total
    project
  3. Determine each component's basic effort
  4. Apply the component-level cost drivers (excluding
    SCED) to each component's basic effort
  5. Sum each component's effort
  6. Schedule is estimated by repeating steps 2 to 5
    without SCED used in step 2. Then use the
    schedule estimating formula.
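Steps 1-5 above can be sketched as follows. The constants are the COCOMO II.2000 calibration; splitting the project-level basic effort among components in proportion to size (step 3) is our reading of the procedure, and the sample inputs are illustrative.

```python
import math

A, B = 2.94, 0.91  # COCOMO II.2000 calibration constants

def multi_module_effort(sizes_ksloc, scale_factor_weights,
                        component_ems, sced_em=1.0):
    total_size = sum(sizes_ksloc)                         # step 1
    E = B + 0.01 * sum(scale_factor_weights)
    basic_total = A * total_size ** E * sced_em           # step 2
    component_efforts = []
    for size, ems in zip(sizes_ksloc, component_ems):
        basic = basic_total * size / total_size           # step 3
        component_efforts.append(basic * math.prod(ems))  # step 4 (no SCED)
    return sum(component_efforts)                         # step 5

# Illustrative: two 10-KSLOC components, all-Nominal drivers.
pm = multi_module_effort([10, 10], [3.72, 3.04, 4.24, 3.29, 1.71],
                         [[1.0], [1.0]])
```

Because the scaling exponent is applied to the aggregated size, the combined estimate exceeds the sum of two independent 10-KSLOC estimates.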

16
17
Coverage of Different Processes
  • COCOMO II provides a framework for tailoring the
    model to any desired process
  • Original COCOMO was predicated on the waterfall
    process
  • single-pass, sequential progression of
    requirements, design, code, test
  • Modern processes are concurrent, iterative,
    incremental, and cyclic
  • e.g. Rational Unified Process (RUP), the USC
    Model-Based Architecting and Software Engineering
    (MBASE) process
  • Effort and schedule are distributed among
    different phases and activities per work
    breakdown structure of chosen process

17
18
Common Process Anchor Points
  • Anchor points are common process milestones
    around which cost and schedule budgets are
    organized
  • COCOMO II submodels address different development
    stages anchored by these generic milestones
  • Life Cycle Objectives (LCO)
  • inception: establishing a sound business case
  • Life Cycle Architecture (LCA)
  • elaboration: commit to a single architecture and
    elaborate it to cover all major risk sources
  • Initial Operational Capability (IOC)
  • construction: commit to transition and support
    operations

18
19
RUP Phase Distributions
Phase           Schedule (%)   Effort (%)
Inception            10             5
Elaboration          30            20
Construction         50            65
Transition           10            10
COCOMO Total        100           100
Project Total       100           100
19
20
Waterfall Phase Distributions
Phase                Schedule (%)   Effort (%)
Plans & Rqts.            20             7
Product Design           26            17
Programming              48            58
Integration & Test       25            26
Transition             12.5            12
COCOMO Total            100           100
Project Total         132.5           119
20
21
MBASE Phase Distributions
Phase           Schedule (%)   Effort (%)
Inception           12.5            6
Elaboration         37.5           24
Construction        62.5           76
Transition          12.5           12
COCOMO Total         100          100
Project Total        125          118
  • see COCOMO II book for complete phase/activity
    distributions
21
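Applying the MBASE effort percentages from the table can be sketched as follows: the COCOMO estimate covers Elaboration plus Construction (100%), and Inception and Transition are added on top, bringing the project total to 118%. The percentages come from the table; the helper name is ours.

```python
# MBASE effort percentages from the table, each expressed as a
# percentage of the COCOMO-covered total (Elaboration + Construction).
MBASE_EFFORT_PCT = {"Inception": 6.0, "Elaboration": 24.0,
                    "Construction": 76.0, "Transition": 12.0}

def mbase_phase_efforts(cocomo_pm):
    # Each phase receives pct/100 of the COCOMO effort estimate;
    # summing all four phases yields 118% of the COCOMO total.
    return {phase: cocomo_pm * pct / 100.0
            for phase, pct in MBASE_EFFORT_PCT.items()}
```

The same apportionment pattern applies to the RUP and Waterfall tables with their own percentages.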
22
COCOMO II Output Ranges
  • COCOMO II provides one standard deviation
    optimistic and pessimistic estimates.
  • Reflect sources of input uncertainties per funnel
    chart.
  • Apply to effort or schedule for all of the stage
    models.
  • Represent 80% confidence limits: actuals fall
    below the optimistic estimate 10% of the time and
    above the pessimistic estimate 10% of the time.

22
23
COCOMO Tailoring and Enhancements
  • Calibrate effort equations to organizational
    experience
  • USC COCOMO has a calibration capability
  • Consolidate or eliminate redundant cost driver
    attributes
  • Add cost drivers applicable to your organization
  • Account for systems engineering, hardware and
    software integration

23
24
Agenda
  • COCOMO Introduction
  • Basic Estimation Formulas
  • Cost Factors
  • Reuse Model
  • Sizing
  • Software Maintenance Effort
  • COCOMO Tool Demo
  • Data Collection

24
25
Cost Factors
  • Significant factors of development cost
  • scale drivers are sources of exponential effort
    variation
  • cost drivers are sources of linear effort
    variation
  • product, platform, personnel and project
    attributes
  • effort multipliers associated with cost driver
    ratings
  • Defined to be as objective as possible
  • Each factor is rated between very low and very
    high per rating guidelines
  • relevant effort multipliers adjust the cost up or
    down

25
26
Scale Factors
  • Precedentedness (PREC)
  • Degree to which system is new and past experience
    applies
  • Development Flexibility (FLEX)
  • Need to conform with specified requirements
  • Architecture/Risk Resolution (RESL)
  • Degree of design thoroughness and risk
    elimination
  • Team Cohesion (TEAM)
  • Need to synchronize stakeholders and minimize
    conflict
  • Process Maturity (PMAT)
  • SEI CMM process maturity rating

26
27
Cost Drivers
  • Product Factors
  • Reliability (RELY)
  • Data (DATA)
  • Complexity (CPLX)
  • Reusability (RUSE)
  • Documentation (DOCU)
  • Platform Factors
  • Time constraint (TIME)
  • Storage constraint (STOR)
  • Platform volatility (PVOL)
  • Personnel factors
  • Analyst capability (ACAP)
  • Program capability (PCAP)
  • Applications experience (APEX)
  • Platform experience (PLEX)
  • Language and tool experience (LTEX)
  • Personnel continuity (PCON)
  • Project Factors
  • Software tools (TOOL)
  • Multisite development (SITE)
  • Required schedule (SCED)

27
28
Example Cost Driver - Required Software
Reliability (RELY)
  • Measures the extent to which the software must
    perform its intended function over a period of
    time.
  • Ask what is the effect of a software failure?

28
29
Example Effort Multiplier Values for RELY
  • E.g. a highly reliable system costs 26% more than
    a nominally reliable system (1.26 / 1.0 = 1.26)
  • and a highly reliable system costs 54% more than
    a very low reliability system (1.26 / 0.82 ≈ 1.54)

29
30
Scale Factors
  • Sum the scale-factor weights W_i across all five
    factors to determine the scale exponent E, using
    E = 0.91 + 0.01 × Σ W_i

30
31
Precedentedness (PREC) and Development
Flexibility (FLEX)
  • Elaboration of the PREC and FLEX rating scales

31
32
Architecture / Risk Resolution (RESL)
  • Use a subjective weighted average of the
    characteristics

32
33
Team Cohesion (TEAM)
  • Use a subjective weighted average of the
    characteristics to account for project turbulence
    and entropy due to difficulties in synchronizing
    the project's stakeholders.
  • Stakeholders include users, customers,
    developers, maintainers, interfacers, and others

33
34
Process Maturity (PMAT)
  • Two methods based on the Software Engineering
    Institute's Capability Maturity Model (CMM)
  • Method 1: Overall Maturity Level (CMM Level 1
    through 5)
  • Method 2: Key Process Areas (see next slide)

34
35
Key Process Areas
  • Decide the percentage of compliance for each of
    the KPAs as determined by a judgement-based
    averaging across the goals for all 18 Key Process
    Areas.

35
36
Cost Drivers
  • Product Factors
  • Platform Factors
  • Personnel Factors
  • Project Factors

36
37
Product Factors
  • Required Software Reliability (RELY)
  • Measures the extent to which the software must
    perform its intended function over a period of
    time. Ask what is the effect of a software
    failure

37
38
Product Factors (cont.)
  • Data Base Size (DATA)
  • Captures the effect large data requirements have
    on development to generate test data that will be
    used to exercise the program.
  • Calculate the data/program size ratio (D/P)

38
39
Product Factors (cont.)
  • Product Complexity (CPLX)
  • Complexity is divided into five areas
  • control operations,
  • computational operations,
  • device-dependent operations,
  • data management operations, and
  • user interface management operations.
  • Select the area or combination of areas that
    characterize the product or a sub-system of the
    product.
  • See the module complexity table, next several
    slides

39
40
Product Factors (cont.)
  • Module Complexity Ratings vs. Type of Module
  • Use a subjective weighted average of the
    attributes, weighted by their relative product
    importance.

40
41
Product Factors (cont.)
  • Module Complexity Ratings vs. Type of Module
  • Use a subjective weighted average of the
    attributes, weighted by their relative product
    importance.

41
42
Product Factors (cont.)
  • Required Reusability (RUSE)
  • Accounts for the additional effort needed to
    construct components intended for reuse.
  • Documentation match to life-cycle needs (DOCU)
  • Ask: what is the suitability of the project's
    documentation to its life-cycle needs?

42
43
Platform Factors
  • Platform
  • Refers to the target-machine complex of hardware
    and infrastructure software (previously called
    the virtual machine).
  • Execution Time Constraint (TIME)
  • Measures the constraint imposed upon a system in
    terms of the percentage of available execution
    time expected to be used by the system.

43
44
Platform Factors (cont.)
  • Main Storage Constraint (STOR)
  • Measures the degree of main storage constraint
    imposed on a software system or subsystem.
  • Platform Volatility (PVOL)
  • Assesses the volatility of the platform (the
    complex of hardware and software the software
    product calls on to perform its tasks).

44
45
Personnel Factors
  • Analyst Capability (ACAP)
  • Analysts work on requirements, high level design
    and detailed design. Consider analysis and design
    ability, efficiency and thoroughness, and the
    ability to communicate and cooperate.
  • Programmer Capability (PCAP)
  • Evaluate the capability of the programmers as a
    team rather than as individuals. Consider
    ability, efficiency and thoroughness, and the
    ability to communicate and cooperate.

45
46
Personnel Factors (cont.)
  • Applications Experience (AEXP)
  • Assess the project team's equivalent level of
    experience with this type of application.
  • Platform Experience (PEXP)
  • Assess the project team's equivalent level of
    experience with this platform including the OS,
    graphical user interface, database, networking,
    and distributed middleware.

46
47
Personnel Factors (cont.)
  • Language and Tool Experience (LTEX)
  • Measures the level of programming language and
    software tool experience of the project team.
  • Personnel Continuity (PCON)
  • The scale for PCON is in terms of the project's
    annual personnel turnover.

47
48
Project Factors
  • Use of Software Tools (TOOL)
  • Assess the usage of software tools used to
    develop the product in terms of their
    capabilities and maturity.

48
49
Project Factors (cont.)
  • Multisite Development (SITE)
  • Assess and average two factors: site collocation
    and communication support.
  • Required Development Schedule (SCED)
  • Measure the imposed schedule constraint in terms
    of the percentage of schedule stretch-out or
    acceleration with respect to a nominal schedule
    for the project.

49
50
Cost Factor Rating
  • Whenever an assessment of a cost driver is
    between the rating levels
  • always round to the lower rating
  • e.g. if a cost driver rating is between High and
    Very High, then select High.

50
51
Cost Driver Rating Level Summary
51
52
Cost Driver Rating Level Summary (cont.)
52
53
Dependencies of Cost Factor Ratings
  • RUSE, RELY and DOCU
  • RELY should be rated at least one level below the
    RUSE rating
  • DOCU rating should be at least Nominal for
    Nominal and High RUSE ratings, and at least High
    for Very High and Extra High RUSE ratings

53
54
Agenda
  • COCOMO Introduction
  • Basic Estimation Formulas
  • Cost Factors
  • Reuse Model
  • Sizing
  • Software Maintenance Effort
  • COCOMO Tool Demo
  • Data Collection

54
55
Reused and Modified Software
  • Effort for adapted software (reused or modified)
    is not the same as for new software.
  • Approach convert adapted software into
    equivalent size of new software.

55
56
Nonlinear Reuse Effects
  • The reuse cost function does not go through the
    origin, due to a cost of about 5% for assessing,
    selecting, and assimilating the reusable
    component.
  • Small modifications generate disproportionately
    large costs primarily due to the cost of
    understanding the software to be modified, and
    the relative cost of interface checking.

56
57
COCOMO Reuse Model
  • A nonlinear estimation model to convert adapted
    (reused or modified) software into equivalent
    size of new software

57
58
COCOMO Reuse Model (cont.)
  • ASLOC - Adapted Source Lines of Code
  • ESLOC - Equivalent Source Lines of Code
  • AAF - Adaptation Adjustment Factor:
    AAF = 0.4(DM) + 0.3(CM) + 0.3(IM)
  • DM - Percent Design Modified. The percentage of
    the adapted software's design which is modified
    in order to adapt it to the new objectives and
    environment.
  • CM - Percent Code Modified. The percentage of the
    adapted software's code which is modified in
    order to adapt it to the new objectives and
    environment.
  • IM - Percent of Integration Required for Modified
    Software. The percentage of effort required to
    integrate the adapted software into an overall
    product and to test the resulting product as
    compared to the normal amount of integration and
    test effort for software of comparable size.
  • AA - Assessment and Assimilation effort needed to
    determine whether a fully-reused software module
    is appropriate to the application, and to
    integrate its description into the overall
    product description. See table.
  • SU - Software Understanding. Effort increment as
    a percentage. Only used when code is modified
    (zero when DM = 0 and CM = 0). See table.
  • UNFM - Unfamiliarity. The programmer's relative
    unfamiliarity with the software which is applied
    multiplicatively to the software understanding
    effort increment (0-1).
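The parameters above combine into equivalent size roughly as follows. This sketch uses the COCOMO II.2000 form of the AAF and adaptation-adjustment equations (including the optional automated-translation factor from a later slide); the function names and sample parameter values are ours.

```python
def adaptation_adjustment_factor(dm, cm, im):
    # AAF = 0.4*DM + 0.3*CM + 0.3*IM (all in percent)
    return 0.4 * dm + 0.3 * cm + 0.3 * im

def equivalent_sloc(asloc, dm, cm, im, aa=0.0, su=0.0, unfm=0.0, at=0.0):
    # ESLOC = ASLOC * (1 - AT/100) * AAM, where the adaptation
    # adjustment folds in assessment (AA) and software-understanding
    # (SU * UNFM) effects; the SU term is damped for small AAF.
    aaf = adaptation_adjustment_factor(dm, cm, im)
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100.0
    else:
        aam = (aa + aaf + su * unfm) / 100.0
    return asloc * (1 - at / 100.0) * aam

# Illustrative: 10,000 ASLOC with 10% of the design and 20% of the
# code modified, 40% relative integration effort, AA=2, SU=30, UNFM=0.4.
esloc = equivalent_sloc(10_000, dm=10, cm=20, im=40, aa=2, su=30, unfm=0.4)
```

The resulting ESLOC is then added to the new-code size before applying the effort equation.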

58
59
Assessment and Assimilation Increment (AA)
59
60
Software Understanding Increment (SU)
  • Take the subjective average of the three
    categories.
  • Do not use SU if the component is being used
    unmodified (DM = 0 and CM = 0).

60
61
Programmer Unfamiliarity (UNFM)
  • Only applies to modified software

61
62
Commercial Off-the-Shelf (COTS) Software
  • Current best approach is to treat as reuse
  • A COTS cost model is under development
  • Calculate effective size from external interface
    files and breakage
  • Have identified candidate COTS cost drivers

62
63
Reuse Parameter Guidelines
63
64
Automatically Translated Code
  • Reengineering vs. conversion
  • Automated translation is considered a separate
    activity from development
  • Multiply the ESLOC equation by the factor
    (1 - AT/100), where AT is the percentage of code
    adapted via automated translation

64
65
Agenda
  • COCOMO Introduction
  • Basic Estimation Formulas
  • Cost Factors
  • Reuse Model
  • Sizing
  • Software Maintenance Effort
  • COCOMO Tool Demo
  • Data Collection

65
66
Lines of Code
  • Code size is expressed in KSLOC
  • Source Lines of Code (SLOC) = logical source
    statements
  • Logical source statements = data declarations +
    executable statements
  • Executable statements cause runtime actions
  • Declaration statements are nonexecutable
    statements that affect an assembler's or
    compiler's interpretation of other program
    elements

66
67
Lines of Code Counting Rules
  • Standard definition for counting lines
  • Based on SEI definition checklist from
    CMU/SEI-92-TR-20
  • Modified for COCOMO II
  • When a line or statement contains more than one
    type, classify it as the type with the highest
    precedence; the types are listed in ascending
    order of precedence

67
68
Lines of Code Counting Rules (cont.)
  • (See COCOMO II book for remaining details)

68
69
Counting with Function Points
  • Used in both the Early Design and the
    Post-Architecture models.
  • Based on the amount of functionality in a
    software product and project factors using
    information available early in the project life
    cycle.
  • Quantify the information processing functionality
    with the following user function types

69
70
Counting with Function Points (cont.)
  • External Input (Inputs)
  • Count each unique user data or user control input
    type that both
  • Enters the external boundary of the software
    system being measured
  • Adds or changes data in a logical internal file.
  • External Output (Outputs)
  • Count each unique user data or control output
    type that leaves the external boundary of the
    software system being measured.

70
71
Counting with Function Points (cont.)
  • Internal Logical File (Files)
  • Count each major logical group of user data or
    control information in the software system as a
    logical internal file type. Include each logical
    file (e.g., each logical group of data) that is
    generated, used, or maintained by the software
    system.
  • External Interface Files (Interfaces)
  • Files passed or shared between software systems
    should be counted as external interface file
    types within each system.

71
72
Counting with Function Points (cont.)
  • External Inquiry (Queries)
  • Count each unique input-output combination in
    which an input generates an immediate output, as
    an external inquiry type.
  • Each instance of the user function types is then
    classified by complexity level. The complexity
    levels determine a set of weights, which are
    applied to their corresponding function counts to
    determine the Unadjusted Function Points (UFP)
    quantity.

72
73
Counting with Function Points (cont.)
  • The usual Function Point procedure involves
    assessing the degree of influence of fourteen
    application characteristics on the software
    project.
  • The contributions of these characteristics are
    inconsistent with COCOMO experience, so COCOMO II
    uses Unadjusted Function Points for sizing.

73
74
Unadjusted Function Points Counting Procedure
  • Step 1 - Determine function counts by type
  • The unadjusted function counts should be counted
    by a lead technical person based on information
    in the software requirements and design
    documents.
  • The number of each of the five user function
    types should be counted
  • Internal Logical File (ILF)
  • Note The word file refers to a logically related
    group of data and not the physical implementation
    of those groups of data.
  • External Interface File (EIF)
  • External Input (EI)
  • External Output (EO)
  • External Inquiry (EQ)

74
75
Unadjusted Function Points Counting Procedure
(cont.)
  • Step 2 - Determine complexity-level function
    counts
  • Classify each function count into Low, Average
    and High complexity levels depending on the
    number of data element types contained and the
    number of file types referenced. Use the
    following scheme

75
76
Unadjusted Function Points Counting Procedure
(cont.)
  • Step 3 - Apply complexity weights
  • Weight the number in each cell using the
    following scheme. The weights reflect the
    relative value of the function to the user.
  • Step 4 - Compute Unadjusted Function Points
  • Add all the weighted functions counts to get one
    number, the Unadjusted Function Points.
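Steps 2-4 can be sketched with the standard function-point complexity weights. The weight values below are the usual IFPUG table that COCOMO II adopts; confirm them against the table in the COCOMO II book before relying on them.

```python
# Complexity weights per user function type (Low / Average / High).
FP_WEIGHTS = {
    "ILF": {"Low": 7, "Average": 10, "High": 15},
    "EIF": {"Low": 5, "Average": 7, "High": 10},
    "EI":  {"Low": 3, "Average": 4, "High": 6},
    "EO":  {"Low": 4, "Average": 5, "High": 7},
    "EQ":  {"Low": 3, "Average": 4, "High": 6},
}

def unadjusted_function_points(counts):
    # `counts` maps function type -> {complexity level: instances};
    # UFP is the sum of all weighted counts (step 4).
    return sum(FP_WEIGHTS[ftype][level] * n
               for ftype, levels in counts.items()
               for level, n in levels.items())
```

For example, two Average internal logical files and five Low external inputs contribute 2 × 10 + 5 × 3 = 35 UFP.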

76
77
Requirement Volatility (REVL)
  • REVL adjusts the effective size to account for
    requirements evolution and volatility
  • Size = (1 + REVL/100) × Size_D, where Size_D is
    the reuse-equivalent size of the delivered
    software
77
78
Sizing Software Maintenance
  • (Size)_M = (Base Code Size) × MCF × MAF
  • Maintenance Change Factor (MCF) = (Size Added +
    Size Modified) / (Base Code Size)
  • Equivalently, (Size)_M = (Size Added + Size
    Modified) × MAF
  • Maintenance Adjustment Factor (MAF) = 1 +
    (SU/100) × UNFM

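The maintenance sizing relations can be sketched as follows; the MAF form using the SU and UNFM reuse parameters follows the COCOMO II maintenance model, and the input values in the checks are illustrative.

```python
def maintenance_change_factor(base_ksloc, added_ksloc, modified_ksloc):
    # MCF = (Size Added + Size Modified) / Base Code Size
    return (added_ksloc + modified_ksloc) / base_ksloc

def maintenance_size(added_ksloc, modified_ksloc, su=0.0, unfm=0.0):
    # (Size)_M = (Size Added + Size Modified) * MAF,
    # with MAF = 1 + (SU/100) * UNFM adjusting for the effort
    # of understanding unfamiliar legacy code.
    maf = 1 + (su / 100.0) * unfm
    return (added_ksloc + modified_ksloc) * maf
```

This maintenance size then feeds the effort equation in place of development size, per the next slides.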
78
79
Agenda
  • COCOMO Introduction
  • Basic Estimation Formulas
  • Cost Factors
  • Reuse Model
  • Sizing
  • Software Maintenance Effort
  • COCOMO Tool Demo
  • Data Collection

79
80
Software Maintenance
  • SCED cost driver is not used in maintenance
    effort estimation
  • RUSE cost driver is not used in maintenance
    effort estimation
  • RELY cost driver has a different set of effort
    multipliers
  • See Table 2.41 in COCOMO II book
  • Apply the scaling exponent E to the number of
    changed KSLOC (added and modified, not deleted)
    rather than the total legacy system
  • The average maintenance staffing level:
    FSP_M = PM_M / T_M
  • May use any desired maintenance activity duration
    T_M

80
81
COCOMO II Demo
81
82
Agenda
  • COCOMO Introduction
  • Basic Estimation Formulas
  • Cost Factors
  • Reuse Model
  • Sizing
  • Software Maintenance Effort
  • COCOMO Tool Demo
  • Data Collection

82
83
Cost Driver Ratings Profile
  • Need to rate cost drivers in a consistent and
    objective fashion within an organization.
  • Cost driver ratings profile
  • Graphical depiction of historical ratings to be
    used as a reference baseline to assist in rating
    new projects
  • Used in conjunction with estimating tools to
    gauge new projects against past ones objectively

83
84
Example Cost Driver Ratings Profile
84
85
Techniques to Generate Cost Driver Ratings Profile
  • Single person
  • Time efficient, but may impose bias and person
    may be unfamiliar with all projects
  • Group
  • Converge ratings in a single meeting (dominant
    individual problem)
  • Wideband Delphi technique (longer calendar time,
    but minimizes biases). See Software Engineering
    Economics, p. 335

85
86
COCOMO Dataset Cost Metrics
  • Size (SLOC, function points)
  • Effort (Person-hours)
  • Schedule (Months)
  • Cost drivers
  • Scale drivers
  • Reuse parameters

86
87
Recommended Project Cost Data
  • For each project, report the following at the end
    of each month and for each release
  • SIZE
  • Provide total system size developed to date, and
    report new code size and reused / modified code
    size separately. This can be at a project level
    or lower level as the data supports and is
    reasonable. For languages not supported by tools
    such as assembly code, report the number of
    physical lines separately for each language.
  • EFFORT
  • Provide cumulative staff-hours spent on software
    development per project at the same granularity
    as the size components.

87
88
Recommended Project Cost Data (cont.)
  • COST DRIVERS AND SCALE DRIVERS
  • For each reported size component, supply the cost
    driver ratings for product, platform, personnel
    and project attributes. For each reported size
    component, supply scale driver ratings.
  • REUSE PARAMETERS
  • For each component of reused/modified code,
    supply reuse parameters AA, SU, UNFM, DM, CM and
    IM.
  • See Appendix C in COCOMO II book for additional
    data items
  • Post-mortem reports are highly recommended

88
89
Effort Staff-Hours Definition
  • Standard definition
  • Based on the SEI definition checklist from
    CMU/SEI-92-TR-21
  • Modified for COCOMO II
  • Does not include unpaid overtime, production and
    deployment activities, customer training
    activities
  • Includes all personnel except level 3 or higher
    software management (i.e. directors or above who
    timeshare among projects)
  • Person-month is defined as 152 hours

89
90
Further Information
  • B. Boehm, C. Abts, W. Brown, S. Chulani, B.
    Clark, E. Horowitz, R. Madachy, D. Reifer, and
    B. Steece, Software Cost Estimation with COCOMO
    II, Prentice-Hall, 2000
  • B. Boehm, Software Engineering Economics,
    Englewood Cliffs, NJ: Prentice-Hall, 1981

90