Evaluation of Information Systems: Quality Assessment (PowerPoint Presentation Transcript)

1
Evaluation of Information Systems: Quality
Assessment and Process Improvement
  • INFO 630
  • Glenn Booker

2
Time for a Sanity Check
  • We have defined many ways to measure a project,
    and established various guides for quality
  • Now, as a project is happening, we want to
    determine if we are reaching our goals

3
Assessment versus Audit
  • Here, per Kan, we distinguish between a quality
    assessment and a quality audit
  • A quality assessment is an evaluation of the
    project to see if desired goals are being met,
    and identify problem areas
  • A quality audit refers to formal external
    activities, such as an ISO 9000 audit or a CMMI
    audit

4
Quality Assessment
  • A quality assessment could be done by an
    independent team, or by people within the
    project's QA organization
  • Assessments can be done at various points in a
    project
  • A typical project might schedule 4-6 major
    quality reviews (Kan, p. 398), evenly spaced
    during the project

5
Quality Assessment
  • Each quality assessment typically has three
    phases
  • Preparation phase
  • Evaluation phase
  • Summarization phase
  • The summary then feeds recommendations and risk
    mitigation
  • The scope of data and measures are key aspects of
    each phase

6
Preparation Phase
  • Given the development process used on a project,
    and the project schedule, determine where the
    project stands relative to them
  • Need to focus on both qualitative and
    quantitative data to determine development
    progress, find problems, and predict final
    product quality

7
Preparation Phase
  • Qualitative data can be obtained from interviews
    (one-on-one or small group) or questionnaires
  • May focus on identifying
  • Whose input is critical right now? Are all
    affected areas represented?
  • Who needs to know what's happening right now?

8
Preparation Phase
  • What are the key risks, and our mitigation
    strategy for each?
  • How does our status compare to similar previous
    projects?

9
Evaluation Phase
  • Now that you have the data on hand, the
    evaluation phase is the act of using various
    techniques to analyze the data, and extract the
    desired information
  • Data might be presented using the indicators
    discussed with GQ(I)M
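For example, here is a minimal Python sketch (not from Kan or the lecture) of turning raw inspection counts into a simple GQ(I)M-style indicator; the component names, defect counts, and the planned density target are invented for illustration.

```python
# Hypothetical indicator: actual defect density per component vs. a planned
# target, the kind of planned-vs-actual view a GQ(I)M indicator might show.

inspection_data = [
    # (component, defects_found, size_kloc) -- invented numbers
    ("parser", 14, 3.2),
    ("reports", 9, 4.1),
    ("billing", 22, 5.0),
]

PLANNED_DENSITY = 4.0  # assumed target: defects per KLOC at this review point

for component, defects, kloc in inspection_data:
    density = defects / kloc
    status = "on plan" if density <= PLANNED_DENSITY else "above plan"
    print(f"{component:10s} {density:5.1f} defects/KLOC ({status})")
```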

10
Evaluation Phase
  • Part of the challenge is understanding when you
    have enough data, and how to put the data in the
    context of the project as a whole
  • Here it's critical to choose the right measures
    to evaluate the data, and pick good evaluation
    criteria to judge what's good and bad

11
Summarization Phase
  • Now we want to blend the qualitative and
    quantitative results into an overall summary of
    the state of the project
  • Summary is typically broken into major sections,
    depending on the types of activities currently
    occurring
  • Design reviews, code inspections, system testing,
    regression testing, etc.

12
Summarization Phase
  • Each section's current status might be summarized
    with a basic stoplight scale (red/yellow/green)
    to indicate how severe problems are in that area
  • These could be tracked over time to show trends
    in quality
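As a hedged illustration, a small Python sketch of such a stoplight summary; the area names, open-defect counts, and thresholds below are assumptions, not values from Kan.

```python
# Hypothetical stoplight summary: map each assessment area's key measure to
# red/yellow/green using assumed thresholds, so status can be tracked over time.

def stoplight(value, green_max, yellow_max):
    """Return a color for a 'lower is better' measure, given two cutoffs."""
    if value <= green_max:
        return "green"
    if value <= yellow_max:
        return "yellow"
    return "red"

# Invented example: open major defects per area at this assessment
areas = {"design reviews": 2, "code inspections": 7, "system testing": 15}

for area, open_defects in areas.items():
    print(f"{area:16s}: {stoplight(open_defects, green_max=3, yellow_max=10)}")
```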

13
Recommendations and Risk Mitigation
  • Part of the summary is to give specific
    recommendations
  • Look at current risky areas, and describe what
    activities are needed to bring them under better
    control
  • This often links to your project's risk
    management strategy, where you identified
    significant risks and defined a risk mitigation
    strategy for each

14
Risk Mitigation
  • Risk strategies can include
  • Contain the risk, using specific activities to
    reduce the chance of it happening (risk
    prevention) and reduce its impact if it does
    occur (risk mitigation)
  • Develop a contingency plan to manage the risk if
    it occurs

15
Risk Mitigation
  • Transfer the risk to another organization or part
    of the system (let a subcontractor develop a part
    you know poorly, for example)
  • Ignore the risk, if it has low impact
  • Avoid the risk, by using a different process or
    eliminating a risky feature from the product
    requirements
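One way to keep these strategies tied to specific risks is a small risk register; the Python sketch below uses invented risks and fields purely to illustrate the five options above.

```python
# Hypothetical risk register: each risk records the strategy chosen from the
# options above (contain, contingency, transfer, ignore, avoid).
from dataclasses import dataclass
from enum import Enum

class Strategy(Enum):
    CONTAIN = "contain"          # prevention activities plus mitigation of impact
    CONTINGENCY = "contingency"  # plan to manage the risk if it occurs
    TRANSFER = "transfer"        # hand it to a subcontractor or another part
    IGNORE = "ignore"            # accept a low-impact risk
    AVOID = "avoid"              # change the process or drop the risky feature

@dataclass
class Risk:
    name: str
    probability: float  # rough estimate, 0..1
    impact: str         # e.g. "low", "medium", "high"
    strategy: Strategy
    action: str

register = [
    Risk("unfamiliar middleware", 0.4, "high", Strategy.TRANSFER,
         "subcontract the integration layer"),
    Risk("late requirements churn", 0.6, "medium", Strategy.CONTAIN,
         "freeze scope at each quality review"),
]

for r in register:
    print(f"{r.name}: {r.strategy.value} -> {r.action}")
```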

16
Conducting Assessments
  • In planning an assessment, the scope is a
    critical aspect
  • Often an assessment will be limited to part of a
    project, to address areas most likely to be
    critical
  • The emphasis is still on determining whether the
    project is meeting its goals for quality

17
Conducting Assessments
  • Audits are performed in a manner similar to
    assessments
  • The difference is that audits compare processes
    to a process model (CMMI, ISO 9000), whereas
    assessments compare a project's outcomes to its
    own goals
  • An audit's scope could cover several projects
    within one organization

18
Conducting Assessments
  • In addition to the usual models, audits could be
    based on other models
  • Bootstrap
  • Trillium
  • ISO 12207
  • ISO 15504
  • DOD-STD-2167A (obsolete)
  • MIL-STD-498 (obsolete)

19
Conducting Assessments
  • Likewise, the scope of an audit could differ
    based on the model selected
  • ISO 9000 is facility-based, since it was created
    for a manufacturing environment
  • CMMI is organization-based, regardless of where
    the organization is located

20
Process Assessment Cycle
  • The conduct of a process assessment or audit
    typically has six major steps
  • Preplanning: before the audit, get sponsorship
    for the audit and define its scope and purpose
  • Planning: select who will conduct the audit and
    when, and provide training if needed

21
Process Assessment Cycle
  • Fact gathering: determine how data will be
    collected for the audit (interviews, documents,
    demonstrations, questionnaires, etc.)
  • Fact analysis: analyze the data, and gather
    additional data if needed to fill holes in
    understanding the subject
  • Reporting: prepare a presentation to give the
    results of the audit

22
Process Assessment Cycle
  • Finally, after the audit, determine how its
    results will impact the project's process
    improvement activities
  • Assessments follow the same activities described
    above and are typically conducted every few
    months
  • Audits have to be repeated every 6-18 months to
    remain valid

23
A Bigger Picture
  • Assessment or audit results can be tracked over
    many events to look for chronic problem areas, or
    help refine process improvement plans

24
More Details on Audits
  • Kan's Chapter 17 goes into more detail on how
    process audits are conducted
  • We'll hit the highlights here
  • For more information, go to the SEI website, and
    look for information on SCAMPI

25
CMMI Structure
  • The CMMI is broken into five cumulative levels of
    maturity (1-5), plus sometimes level 0 is
    possible
  • Level 0 means your project doesn't do everything
    necessary for, e.g., software development
  • Level 1 means your project does stuff, but no one
    knows how

26
CMMI Structure
  • Level 2 is the first level earned by an
    assessment; it means a project performs basic
    project and process management activities
  • Level 3 means that an organization (note the
    shift from project) has processes which are
    consistently tailored across several projects

27
CMMI Structure
  • Level 4 means the organization has measured and
    tracked key activities consistently for so long
    that statistical process control is in place
  • Level 5 means the organization actively prevents
    defects from occurring, and is experimenting with
    even better processes (continuous process
    improvement)

28
Staged versus Continuous
  • CMMI can be used in staged or continuous
    representation (form)
  • Staged is the approach just described: the
    organization must perform certain types of
    activities to earn an overall maturity rating
  • Continuous form means that each type of activity
    is rated separately, so there's no overall rating

29
Staged versus Continuous
  • In practice, most organizations get audited using
    the staged approach
  • That CMMI Level 5 plaque on the wall is
    impressive!
  • But the work of preparing for audits often
    follows the continuous approach
  • Assess what kind of activities are most in need
    of help (process, project, technical, etc.), and
    work on them first

30
The Bottom Line
  • The overall goal for a CMMI-based assessment is
    to determine if the project or organization
    fulfills the goals of the model
  • Are the goals of each type of activity being met
    consistently across the organization?
  • Often called institutionalization of the
    activity: it's a way of life for them

31
More Structure
  • CMMI is broken into about 23 Process Areas (PAs);
    each is a type of activity mentioned earlier
  • Each PA has a few Specific Goals, which are the
    heart of that activity
  • Then there are more general activities defined
    for each PA; doing all of them constitutes
    institutionalization

32
More Structure
  • The general process area structure includes
  • Commitment to Perform means that program
    management defines a policy to do the stuff in
    this PA
  • Ability to Perform means that program
    management provides the resources and training
    needed to do these activities, and plans to do
    them

33
More Structure
  • Directing Implementation means that program
    management manages the processes involved in this
    activity, ensures quality, and coordinates with
    other stakeholders on the project
  • Verifying Implementation covers review of
    activities with higher management, and conducting
    independent process audits

34
More Structure
  • Then the specific goals of each PA are defined in
    more detail by its Specific Practices
  • These describe the types of activities which,
    done together, will fulfill the goals of the
    process area
  • In addition, based on the desired maturity level,
    there are Generic Goals which apply to every PA

Pretty simple, huh?
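To make that nesting concrete, here is a rough Python sketch of how process areas, specific goals and practices, and generic goals relate; the practice wording is paraphrased for illustration rather than quoted from the model.

```python
# Hypothetical sketch of the CMMI nesting: a process area has specific goals,
# each met through specific practices, plus generic goals that apply to every PA.
process_area = {
    "name": "Measurement and Analysis",
    "specific_goals": {
        "Align Measurement and Analysis Activities": [
            "establish measurement objectives",
            "specify the measures to be used",
        ],
        "Provide Measurement Results": [
            "collect and analyze measurement data",
            "communicate the results",
        ],
    },
    "generic_goals": [
        "institutionalize a managed process",  # applies to every PA at Level 2
    ],
}

for goal, practices in process_area["specific_goals"].items():
    print(goal, "->", "; ".join(practices))
```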
35
CMMI and Measurement
  • The measurement focus for CMMI is
  • 1) Define a plan of activities,
  • 2) Track actual progress, then
  • 3) Measure actual and planned performance, and
    make decisions based on it
  • At higher levels of maturity, analysis of data
    and prediction occur
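A minimal Python sketch of that plan/track/measure loop; the task names, hours, and the 15% decision threshold are assumptions for illustration.

```python
# Hypothetical plan-vs-actual comparison: measure each task's deviation from
# plan and flag items that need a management decision.
plan =   {"design": 120, "coding": 200, "testing": 160}  # planned hours
actual = {"design": 140, "coding": 180, "testing": 210}  # tracked hours

THRESHOLD = 0.15  # assumed rule: flag anything more than 15% over plan

for task, planned in plan.items():
    deviation = (actual[task] - planned) / planned
    flag = "REVIEW" if deviation > THRESHOLD else "ok"
    print(f"{task:8s} plan={planned:4d} actual={actual[task]:4d} "
          f"deviation={deviation:+.0%} {flag}")
```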

36
CMMI and Measurement
  • For example, at Level 2, the Measurement and
    Analysis process area has specific goals of
  • Measurement objectives and activities are aligned
    with identified information needs and objectives.
  • Measurement results that address identified
    information needs and objectives are provided.

37
CMMI and Measurement
  • At level 4, the Quantitative Project Management
    process area has specific goals of
  • The project is quantitatively managed using
    quality and process-performance objectives.
  • The performance of selected subprocesses within
    the project's defined process is statistically
    managed.
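For instance (a sketch, not text from the model), statistically managing a subprocess usually means tracking one of its measures against control limits; this Python snippet uses invented baseline data and the common mean plus or minus three sigma limits.

```python
# Hypothetical control chart check: compare new observations of a subprocess
# measure (e.g. defects per inspection) against 3-sigma limits from a baseline.
import statistics

baseline = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 2.7, 3.4]  # invented historical data
mean = statistics.mean(baseline)
sigma = statistics.pstdev(baseline)
ucl = mean + 3 * sigma                # upper control limit
lcl = max(mean - 3 * sigma, 0.0)      # lower control limit, floored at zero

for observation in [3.2, 4.9, 2.6]:   # new data points (invented)
    ok = lcl <= observation <= ucl
    print(f"{observation:.1f}: {'in control' if ok else 'OUT OF CONTROL'} "
          f"(limits {lcl:.2f}..{ucl:.2f})")
```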

38
CMMI in Brief
  • CMMI is mostly looking to see if you say what
    you're going to do, make it possible to do so, do
    it, and measure what you did
  • Following a process improvement path takes lots
    of time and commitment from all levels of
    management