Transcript and Presenter's Notes

Title: Evaluation of Information Systems Review


1
Evaluation of Information Systems Review
  • INFO 630
  • Glenn Booker

2
Measurement Overview
  • We have studied the kinds of measurement methods
    and characteristics needed for software
    development and maintenance
  • We've used SPSS to do the icky math parts for
    us, and emphasized understanding what kinds of
    techniques are used, and how to interpret their
    results

3
Life Cycle Model
  • Most of our concepts are based on some variation
    of the key software development activities
  • Requirements analysis
  • High and low level design
  • Coding or implementation
  • Various levels of testing

4
Motivation
  • Measurement is critical from a project management
    perspective, to give us an objective basis for
    making decisions and answering questions
  • It also helps fulfill objectives of various
    quality and process models (CMMI, ISO 9000, etc.)

5
Foundation
  • We defined measurements for each aspect of a
    process
  • The product we create
  • The resources (people) we use to create it
  • The processes they follow
  • The tools they use

6
Statistical Concepts
  • Key statistical concepts used include
  • Measurement scales (nominal, etc.)
  • Ratio, proportion, fraction, percent, and rate
  • Modeling to see if there's a statistically
    significant relationship between two measures
  • And assessing R squared and the t statistic to
    help make that judgment (see the sketch below)
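
A minimal sketch of such a check in Python, using
scipy.stats.linregress (the size and effort figures are made up for
illustration):

    # Hypothetical question: does module size predict effort?
    from scipy.stats import linregress

    size_kloc = [1.2, 3.4, 2.1, 5.6, 4.3, 7.8, 6.1]   # predictor
    effort_pm = [2.0, 5.1, 3.0, 8.9, 6.2, 12.4, 9.0]  # response

    result = linregress(size_kloc, effort_pm)
    r_squared = result.rvalue ** 2
    # The p-value comes from the t test on the slope; a small value
    # (e.g. < 0.05) suggests a statistically significant relationship.
    print(f"R^2 = {r_squared:.3f}, p = {result.pvalue:.4f}")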

7
Normal Distribution
  • We reviewed the normal distribution
  • The 95% rule for deciding how close to the mean
    is close enough to be important
  • Six Sigma
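
As a quick numeric refresher, about 95% of a normal distribution lies
within two standard deviations of the mean; a sketch with
scipy.stats.norm:

    from scipy.stats import norm

    # Probability mass within +/- 2 standard deviations of the mean
    within_2sd = norm.cdf(2) - norm.cdf(-2)
    print(f"Within 2 sigma: {within_2sd:.4f}")  # ~0.9545, the 95% rule

    # Six Sigma pushes much further into the tails (the familiar
    # 3.4-defects-per-million figure also assumes a 1.5 sigma shift)
    print(f"Beyond 6 sigma: {2 * norm.sf(6):.1e}")  # two-sided tail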

8
Core Measurements
  • The most common foundation for measurement is
  • Product Size (LOC or function points)
  • Effort and cost
  • Schedule
  • Problems and defects, including the defect
    density (defects/KLOC)
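
Defect density in particular is just defects found divided by size in
KLOC; a minimal sketch (the counts are hypothetical):

    def defect_density(defects: int, loc: int) -> float:
        """Defects per thousand lines of code (defects/KLOC)."""
        return defects / (loc / 1000)

    print(defect_density(defects=42, loc=28_000))  # 1.5 defects/KLOC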

9
Development and Defect Models
  • We considered development models to relate
    various key measures to each other (COCOMO)
  • And examined defect removal effectiveness
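
For example, Basic COCOMO relates size to effort as effort = a *
KLOC^b; a sketch using Boehm's organic-mode coefficients (the 50 KLOC
input is illustrative):

    def cocomo_basic(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
        """Basic COCOMO, organic mode: person-months = a * KLOC^b."""
        return a * kloc ** b

    effort = cocomo_basic(50)        # hypothetical 50 KLOC project
    schedule = 2.5 * effort ** 0.38  # nominal duration in months
    print(f"{effort:.0f} person-months over {schedule:.1f} months")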

10
Quality Management
  • We looked at key aspects of a Quality Management
    System and key development and maintenance
    metrics often used
  • Defect arrival rate
  • Fix backlog
  • Fix response time
  • Etc.
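
One way these metrics get tracked week to week; the counts below are
made up, and the backlog management index (defects closed over
defects opened) is one common companion measure:

    # Hypothetical weekly counts of problem reports opened and closed
    opened = [12, 15, 9, 20, 14]
    closed = [10, 13, 11, 16, 18]

    backlog = 0
    for week, (o, c) in enumerate(zip(opened, closed), start=1):
        backlog += o - c    # fix backlog carried forward
        bmi = 100 * c / o   # >100% means the backlog is shrinking
        print(f"week {week}: backlog={backlog}, BMI={bmi:.0f}%")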

11
Earned Value
  • We looked to the earned value method for
    comparing the effort, schedule, and work
    accomplished during a project
  • ACWP, BCWP, BCWS, CPI, SPI: if you still
    remember what they mean, you deserve a cookie!
  • Cost and schedule variances, schedule slip,
    estimate at completion
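
Those acronyms reduce to a handful of ratios and differences; a
sketch with hypothetical project numbers:

    # ACWP = actual cost of work performed, BCWP = budgeted cost of
    # work performed (earned value), BCWS = budgeted cost of work
    # scheduled, BAC = budget at completion (all figures hypothetical)
    acwp, bcwp, bcws, bac = 120_000, 100_000, 110_000, 500_000

    cpi = bcwp / acwp   # cost performance index (<1 means over budget)
    spi = bcwp / bcws   # schedule performance index (<1 means behind)
    cv = bcwp - acwp    # cost variance
    sv = bcwp - bcws    # schedule variance
    eac = bac / cpi     # one common estimate-at-completion formula
    print(f"CPI={cpi:.2f} SPI={spi:.2f} CV={cv} SV={sv} EAC={eac:,.0f}")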

12
GQ(I)M
  • We used the goal, question, indicator, and
    measure methodology to
  • Help understand why we are collecting the
    measures
  • Define exactly what they mean
  • Identify where they come from
  • Decide how they will be presented (the
    indicators)

13
Defect Analysis
  • Defect analysis looked at
  • When a defect was created (injected)
  • When a defect was found (detection)
  • What caused a defect (type of defect, and/or
    orthogonal defect classification)
  • How a defect was found (triggers)

14
Defect Removal Modeling
  • Defect removal modeling defined the defect
    removal effectiveness (DRE) for any set of
    adjacent life cycle phases
  • Then the Rayleigh model provided a mathematical
    expression to help predict how many more defects
    will be found during development
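
A sketch of both ideas: DRE is defects removed divided by defects
present at that point, and the Rayleigh curve (a Weibull with shape
parameter 2) models cumulative defect discovery over time. The
parameterization below, with K total expected defects and t_m the
time of peak discovery, is one common form; all the numbers are
illustrative:

    import math

    def dre(removed: int, escaped: int) -> float:
        """Defect removal effectiveness: removed / (removed + escaped)."""
        return removed / (removed + escaped)

    def rayleigh_cum(t: float, k: float, t_m: float) -> float:
        """Expected cumulative defects found by time t."""
        return k * (1 - math.exp(-(t * t) / (2 * t_m * t_m)))

    print(f"DRE = {dre(removed=80, escaped=20):.0%}")  # 80%
    left = 500 - rayleigh_cum(t=6, k=500, t_m=4)
    print(f"Defects still expected after month 6: {left:.0f}")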

15
Reliability Modeling
  • Static reliability models provide a detailed
    estimate of defects per module based on size and
    complexity metrics
  • Dynamic reliability models provide a broader
    estimate of the defects in the entire project,
    such as the Rayleigh model (from the Weibull
    model)

16
Reliability Modeling
  • Lots of other types of reliability models were
    covered briefly
  • Exponential model for testing
  • Time Between Failure Models
  • Fault Count Models
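
For instance, the exponential model expects the cumulative defects
found in testing to follow N(1 - e^(-bt)); a sketch with assumed
parameters (200 latent defects, detection rate 0.15 per week):

    import math

    def exp_model(t: float, n_total: float, b: float) -> float:
        """Expected cumulative defects found by test time t."""
        return n_total * (1 - math.exp(-b * t))

    for week in (1, 4, 8, 12):
        print(f"week {week:2d}: ~{exp_model(week, 200, 0.15):.0f} found")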

17
Customer Satisfaction
  • We discussed ways of conducting Customer
    Satisfaction Surveys
  • Statistical sampling methods
  • How to calculate simple random sample size
  • Analysis of customer satisfaction data
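
The simple random sample size formula is n = z^2 p(1-p) / e^2; a
sketch at a 95% confidence level with a 5% margin of error, assuming
the worst-case proportion p = 0.5:

    import math

    def sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> int:
        """z = confidence z-score, p = expected proportion,
        e = margin of error."""
        return math.ceil(z * z * p * (1 - p) / (e * e))

    print(sample_size())  # 385 responses for 95% confidence, +/- 5%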

18
Testing Measurements
  • We discussed approaches to testing
  • Testing measurements beyond just defects found
  • Structural and predicate testing
  • Integration testing strategies

19
Object Oriented Measures
  • We covered six common OO metrics
  • Weighted methods per class (WMC)
  • Response for a class (RFC)
  • Lack of cohesion of methods (LCOM)
  • Coupling between objects (CBO)
  • Depth of inheritance tree (DIT)
  • Number of children (NOC)
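
Two of these fall straight out of the class hierarchy; a sketch that
computes DIT and NOC for a small hypothetical hierarchy using Python
introspection (here DIT counts edges up to and including object):

    class Shape: ...
    class Polygon(Shape): ...
    class Triangle(Polygon): ...
    class Circle(Shape): ...

    def dit(cls) -> int:
        """Depth of inheritance tree: steps from cls up to object."""
        return len(cls.__mro__) - 1

    def noc(cls) -> int:
        """Number of children: count of immediate subclasses."""
        return len(cls.__subclasses__())

    print(dit(Triangle), noc(Shape))  # 3 2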

20
Complexity Metrics
  • We examined internal and external complexity
    metrics
  • Internal: size, the Halstead metrics, cyclomatic
    complexity, knots
  • External: data structure, fan in, fan out, Henry
    and Kafura, Shepperd
  • And looked at availability metrics
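
Two of those reduce to one-line formulas: McCabe's cyclomatic
complexity, V(G) = E - N + 2P over the control-flow graph, and the
Henry and Kafura information flow metric, length * (fan-in *
fan-out)^2. A sketch (the graph and procedure figures are
hypothetical):

    def cyclomatic(edges: int, nodes: int, components: int = 1) -> int:
        """McCabe: V(G) = E - N + 2P for a control-flow graph."""
        return edges - nodes + 2 * components

    def henry_kafura(length: int, fan_in: int, fan_out: int) -> int:
        """Henry and Kafura: length * (fan_in * fan_out)^2."""
        return length * (fan_in * fan_out) ** 2

    print(cyclomatic(edges=9, nodes=8))                  # 3 paths
    print(henry_kafura(length=50, fan_in=3, fan_out=2))  # 1800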

21
Quality Assessment and Audit
  • We looked at quality assessments and quality
    audits, and their common structure
  • Preparation phase
  • Evaluation phase
  • Summarization phase
  • And examined the larger view of the Process
    Assessment Cycle

22
CMMI Structure
  • Finally we reviewed the structure of the CMMI
    process model
  • Staged versus continuous models
  • Process Areas and their goals, etc.
  • And that's it!