Software Quality Evaluation - PowerPoint PPT Presentation

Slides: 10
Provided by: colin91
1
Software Quality Evaluation
  • CS 3302

2
(Some) Desired Qualities
  • Correctness
  • Does it do/say what it is supposed to?
  • How can you tell?
  • Is it reliable?
  • How do you quantify it?
  • Is it secure?
  • Is it safe?
  • How can you assess hazards?
  • Does it fail in a safe manner?
  • Integrity/security
  • Usability
  • Efficiency
  • Speed
  • Resource consumption
  • Interoperability
  • Maintainability
  • Portability
  • Expandability
  • Flexibility / customizability
  • How do you assess these properties without
    building the system?
  • Reusability

3
Reviews
  • Presentation of artifact for scrutiny by a team
    of reviewers
  • Artifact is not always code
  • Design specifications, project plan, test plan,
    etc.
  • Artifact may not be a document
  • User interface prototype demonstration
  • Types of review differ in formality and role
    differentiation
  • Goal
  • Review the artifact, not the author
  • "Egoless programming"
  • Raise issues, not solve problems or improve the
    design

4
Review Procedure
  • Preparation
  • Thorough preparation is essential
  • Artifact delivered 1-30 days in advance to review
    team
  • Longer lead time for larger, more significant
    artifacts
  • Presentation may be necessary in advance
  • Only necessary people should attend, and
    everyone who attends must review the artifact
  • Review meeting
  • Short meetings only (2 hours, max.)
  • Reporting
  • All team members are responsible for quality
  • Summarize findings
  • Report specific findings or raised issues

5
Review Output
  • Summary report
  • What exactly was reviewed
  • Who reviewed it
  • When
  • Summary appraisal
  • Issues list
  • Detailed issues or findings
  • NOT design solutions
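The summary report and issues list above can be captured in a simple data structure. A minimal sketch in Python; the class and field names (Issue, SummaryReport) and all example values are hypothetical, not from the slides:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Issue:
    """A single finding raised during the review (not a design solution)."""
    location: str       # where in the artifact, e.g. a section number
    description: str
    severity: str       # e.g. "major" / "minor"

@dataclass
class SummaryReport:
    """Records what was reviewed, by whom, when, plus a summary appraisal."""
    artifact: str
    reviewers: list[str]
    review_date: date
    appraisal: str
    issues: list[Issue] = field(default_factory=list)

# Hypothetical example values for illustration only.
report = SummaryReport(
    artifact="Test plan v1.2",
    reviewers=["alice", "bob"],
    review_date=date(2024, 1, 15),
    appraisal="Accepted with minor rework",
)
report.issues.append(Issue("section 3.1", "Coverage criteria undefined", "major"))
```

Keeping issues as findings (location + description) rather than proposed fixes mirrors the rule that the issues list is NOT a set of design solutions.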

6
Types of Review
  • Structured Walkthrough
  • Team structure
  • Presenter is the author
  • Moderator
  • Recorder
  • Additional reviewers
  • Roles
  • Advocacy roles (e.g. usability, efficiency, etc.)
  • Fagan Inspection
  • Team structure
  • Presenter is not the author
  • Moderator
  • Recorder
  • Author present to answer questions of
    clarification
  • Additional reviewers
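The key structural difference between the two review types is who presents. A sketch with hypothetical role assignments (all names invented):

```python
# In a structured walkthrough the author presents; in a Fagan inspection
# the presenter is someone else, and the author only answers questions
# of clarification.
walkthrough = {
    "author": "dana", "presenter": "dana",
    "moderator": "erin", "recorder": "frank",
    "reviewers": ["gail", "hugo"],
}
fagan_inspection = {
    "author": "dana", "presenter": "erin",   # presenter != author
    "moderator": "frank", "recorder": "gail",
    "reviewers": ["hugo", "iris"],
}

def presenter_is_author(review: dict) -> bool:
    """True for a walkthrough-style review, False for a Fagan inspection."""
    return review["presenter"] == review["author"]
```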

7
Correctness Proofs: Evaluation
  • Maturity
  • Widely studied (and taught!) for 20 years
  • Verification of concurrent programs is still
    active research topic
  • Influence on practice
  • Almost none
  • Except in specialized areas, such as cryptography
  • A better formal approach may be to construct
    provably correct programs from a formal
    specification
  • Then it becomes more important to prove
    "interesting" consequences of the specification

8
Quality Assurance for NFRs
  • NFR = non-functional requirement
  • Not what the system should do, but how well
  • Also quality requirement
  • Efficiency
  • Performance simulation
  • Formal modeling often requires assumptions that
    turn out to be incorrect
  • Reliability
  • Software reliability is a statistical summary of
    correctness
  • Redundancy less effective than for hardware
  • Parallel development may simply introduce the
    same faults in every version
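The claim that reliability is a statistical summary of correctness can be sketched as a Monte Carlo estimate over randomly chosen inputs. Everything here is a toy assumption: the program under test and its seeded fault (failing on inputs divisible by 100) are invented for illustration:

```python
import random

def estimate_reliability(run_once, trials: int = 10_000, seed: int = 0) -> float:
    """Fraction of randomly chosen runs on which the program behaves correctly."""
    rng = random.Random(seed)
    successes = sum(1 for _ in range(trials) if run_once(rng))
    return successes / trials

def run_once(rng: random.Random) -> bool:
    """Toy program under test: fails on inputs divisible by 100."""
    x = rng.randrange(1, 1001)
    return x % 100 != 0

r = estimate_reliability(run_once)   # close to 0.99 for this toy fault
```

The estimate is only as good as the input distribution: if the operational profile rarely exercises the faulty inputs, measured reliability will be high even though the program is incorrect.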

9
NFRs continued...
  • Safety
  • Safety is not the same as reliability
  • Hazards must be avoided at all costs
  • System failure may be acceptable as long as
    service degrades gracefully
  • Fault-tree analysis
  • Failure modes effects analysis
  • Simplicity of design combined with pessimism
    about environment (including user behavior)
  • Usability
  • Observation of user interaction is essential
  • Get user to think aloud
  • But short experiments with 3-4 users are often
    sufficient
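Fault-tree analysis combines basic-event failure probabilities through AND/OR gates. A minimal sketch assuming independent basic events; the top event ("unsafe shutdown") and the probabilities are hypothetical:

```python
from math import prod

def p_and(probs):
    """AND gate: the output fails only if every input fails."""
    return prod(probs)

def p_or(probs):
    """OR gate: the output fails if at least one input fails."""
    return 1 - prod(1 - p for p in probs)

# Hypothetical tree: the top event occurs if the sensor fails OR both
# redundant controllers fail. Probabilities are invented for illustration.
p_sensor = 1e-3
p_controller = 1e-2
p_top = p_or([p_sensor, p_and([p_controller, p_controller])])
```

Note the independence assumption does the heavy lifting: if both controllers share a design fault (as the parallel-development caveat on the previous slide warns), the AND gate badly underestimates the joint failure probability.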