1
Software Testing and Validation
  • MSIT 182 Software Engineering
  • Topic 4

2
Testing, Verification and Validation
  • Verification: "Are we building the product
    right?"
  • The software should conform to its specification
  • Validation: "Are we building the right product?"
  • The software should do what the user really
    requires (i.e., assuring that a software system
    meets a user's needs)
  • Testing
  • Establishes the presence of defects in
    systems/programs

3
The Verification and Validation process
  • Is a whole life-cycle process - verification and
    validation must be applied at each stage in the
    software process.
  • Has the following principal objectives/goals:
  • The discovery of defects in a system
  • The assessment of whether or not the system is
    usable in an operational situation.
  • Verification and validation should establish
    confidence that the software is fit for purpose
  • This does NOT mean completely free of defects
  • Rather, it must be good enough for its intended
    use and the type of use will determine the degree
    of confidence that is needed

4
Testing, verification and validation and the
software process
5
Software Quality
  • Not excellence
  • Extent to which software satisfies its
    specifications
  • Software Quality Assurance (SQA)
  • Goes far beyond verification and validation
  • Managerial independence between the development
    group and the SQA group

6
Static and dynamic verification
  • Software inspections: concerned with analysis of
    the static system representation to discover
    problems (static verification)
  • May be supplemented by tool-based document and code
    analysis
  • Nonexecution-based testing
  • Software (or program) testing: concerned with
    exercising and observing product behaviour
    (dynamic verification)
  • The system is executed with test data and its
    operational behaviour is observed (see the sketch
    below)
  • Execution-based testing
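To make the contrast concrete, here is a minimal sketch (the `average` function is invented for illustration; mypy is named only as a typical static-analysis tool):

```python
def average(values: list[float]) -> float:
    # Static verification: a type checker such as mypy can report, without
    # running the code, that a caller passing a single float instead of a
    # list violates the declared interface.
    return sum(values) / len(values)

# Dynamic verification: the code is executed with test data and its
# behaviour is observed.  The empty-list case fails at run time with
# ZeroDivisionError and is observed directly rather than reasoned out
# from the program text.
assert average([2.0, 4.0]) == 3.0
try:
    average([])
except ZeroDivisionError:
    print("dynamic test revealed a defect: empty input is not handled")
```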

7
Static verification and dynamic validation
8
Defective Software: Causes of Errors
  • Specification may be wrong
  • Specification may be a physical impossibility
  • Faulty program design
  • Program incorrect.

9
Defective Software: Types of Errors
  • Algorithmic error
  • Computation and precision error
  • Documentation error
  • Capacity error or boundary error
  • Timing and coordination error
  • Throughput or performance error
  • Recovery error
  • Hardware and system software error
  • Standards and procedure errors

10
Program testing
  • Can reveal the presence of errors NOT their
    absence
  • A successful test is a test which discovers one
    or more errors
  • The only validation technique for non-functional
    requirements
  • Should be used in conjunction with static
    verification to provide full verification and
    validation coverage

11
Testing and debugging
  • Defect testing and debugging are distinct
    processes
  • Verification and validation is concerned with
    establishing the existence of defects in a
    program
  • Debugging is concerned with locating and
    repairing these errors
  • Debugging involves formulating hypotheses about
    program behaviour, then testing these hypotheses
    to find the system error

12
Types of program testing
  • Defect testing
  • Tests designed to discover system defects.
  • A successful defect test is one which reveals the
    presence of defects in a system.
  • Statistical testing
  • Tests designed to reflect the frequency of user
    inputs; used for reliability estimation (see the
    sketch below).
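As a hedged sketch of the idea: test inputs can be drawn in proportion to how often users exercise each kind of input (the input classes and their frequencies below are invented for illustration):

```python
import random

# Hypothetical operational profile: input classes and the relative
# frequency with which users are assumed to submit them.
operational_profile = {
    "valid_order":   0.70,
    "order_update":  0.20,
    "invalid_order": 0.10,
}

def draw_test_inputs(n, profile, seed=1):
    """Draw n input classes with probability proportional to their frequency."""
    rng = random.Random(seed)
    classes = list(profile)
    weights = [profile[c] for c in classes]
    return rng.choices(classes, weights=weights, k=n)

# Running the system on such a sample and counting failures gives the raw
# data for a reliability estimate (e.g., failures per 1000 demands).
sample = draw_test_inputs(1000, operational_profile)
print({c: sample.count(c) for c in operational_profile})
```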

13
The debugging process
14
Nonexecution-based Testing
  • Underlying principles
  • We should not review our own work
  • Group synergy
  • Walkthroughs and Inspections
  • 4-6 members, chaired by SQA
  • Preparation: lists of items
  • Inspection
  • Up to 2 hours
  • Detect, don't correct
  • Document-driven, not participant-driven
  • Verbalization leads to fault finding
  • Performance appraisal

15
Execution-Based Testing
  • Definitions
  • Failure (incorrect behavior)
  • Fault (NOT bug)
  • Error (mistake made by programmer)
  • A nonsensical statement: "Testing is a
    demonstration that faults are not present"
  • Execution-based testing: the process of inferring
    certain behavioral properties of the product
    based, in part, on the results of executing the
    product in a known environment with selected
    inputs.
  • Inference
  • Known environment
  • Selected inputs

16
What should be tested?
  • Utility
  • Does it meet the user's needs?
  • Ease of use
  • Useful functions
  • Cost-effectiveness
  • Reliability
  • Frequency and criticality of failure
  • Mean time between failures
  • Mean time to repair
  • Mean time and cost to repair the results of
    failure (see the sketch below)
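For illustration only, the reliability figures above might be computed from a recorded failure log; the numbers below are invented:

```python
# Hypothetical failure log: hours of operation preceding each failure,
# and hours spent repairing each one.
hours_before_failure = [120.0, 95.0, 210.0, 75.0]
hours_to_repair      = [2.0, 1.5, 4.0, 2.5]

mtbf = sum(hours_before_failure) / len(hours_before_failure)   # mean time between failures
mttr = sum(hours_to_repair) / len(hours_to_repair)             # mean time to repair

print(f"MTBF = {mtbf:.1f} h, MTTR = {mttr:.1f} h")   # MTBF = 125.0 h, MTTR = 2.5 h
```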

17
What should be tested? (cont'd)
  • Robustness
  • Range of operating conditions
  • Possibility of unacceptable results with valid
    input
  • Effect of invalid input
  • Performance
  • Extent to which space and time constraints are
    met
  • Real-time software

18
The testing process
  • Component testing
  • Testing of individual program components
  • Usually the responsibility of the component
    developer (except sometimes for critical systems)
  • Tests are derived from the developer's experience
  • Integration testing
  • Testing of groups of components integrated to
    create a system or sub-system
  • The responsibility of an independent testing team
  • Tests are based on a system specification

19
Testing phases
20
Defect testing
  • The goal of defect testing is to discover defects
    in programs
  • A successful defect test is a test which causes a
    program to behave in an anomalous way
  • Tests show the presence, not the absence, of defects

21
Testing priorities
  • Only exhaustive testing can show a program is
    free from defects. However, exhaustive testing
    is impossible
  • Tests should exercise a system's capabilities
    rather than its components
  • Testing old capabilities is more important than
    testing new capabilities
  • Testing typical situations is more important than
    boundary value cases

22
Test data and test cases
  • Test data: inputs which have been devised to
    test the system
  • Test cases: inputs to test the system and the
    predicted outputs from these inputs if the
    system operates according to its specification
    (see the example below)
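A small sketch of the distinction (the `compute_discount` function and its supposed specification are invented for the example):

```python
# Assumed specification: orders of 100 units or more earn a 10% discount
# on the order total; smaller orders earn none.
def compute_discount(quantity, unit_price):
    total = quantity * unit_price
    return total * 0.10 if quantity >= 100 else 0.0

# Test data are just the chosen inputs; a test case pairs those inputs
# with the output predicted from the specification.
test_cases = [
    # (quantity, unit_price, expected_discount)
    (99,  2.0,  0.0),   # just below the discount threshold
    (100, 2.0, 20.0),   # exactly at the threshold
    (250, 1.0, 25.0),   # well above the threshold
]

for qty, price, expected in test_cases:
    actual = compute_discount(qty, price)
    assert abs(actual - expected) < 1e-9, f"({qty}, {price}): expected {expected}, got {actual}"
print("all test cases passed")
```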

23
The defect testing process
24
Black-box testing
  • An approach to testing where the program is
    considered as a black box
  • The program test cases are based on the system
    specification (see the example below)
  • Test planning can begin early in the software
    process
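As an invented example, black-box test cases can be derived from the specification alone, e.g. one case per equivalence partition plus the boundary values (the `classify_exam_score` specification below is illustrative, not from the slides):

```python
# Assumed specification: scores 0-49 are "fail", 50-100 are "pass",
# and anything outside 0-100 must be rejected with ValueError.
def classify_exam_score(score):
    if score < 0 or score > 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# Black-box cases chosen from the specification only, without reading the code:
cases = [(-1, ValueError), (0, "fail"), (49, "fail"),
         (50, "pass"), (100, "pass"), (101, ValueError)]

for score, expected in cases:
    try:
        result = classify_exam_score(score)
    except ValueError:
        result = ValueError
    assert result == expected, f"score {score}: expected {expected}, got {result}"
print("black-box test cases passed")
```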

25
Black-box testing
26
Structural testing
  • Sometimes called white-box testing
  • Derivation of test cases according to program
    structure. Knowledge of the program is used to
    identify additional test cases
  • Objective is to exercise all program statements,
    not all path combinations (see the sketch below)
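A minimal white-box sketch: test data are chosen by looking at the code so that every statement runs at least once (the `absolute_max` function is invented for the example):

```python
def absolute_max(a, b):
    # Two branches: white-box test data must drive execution down both.
    if abs(a) >= abs(b):
        return abs(a)
    else:
        return abs(b)

# Statement coverage needs one input where |a| >= |b| and one where
# |a| < |b|; it does not require exercising every path combination.
assert absolute_max(-5, 3) == 5   # takes the 'if' branch
assert absolute_max(2, -7) == 7   # takes the 'else' branch
print("every statement has been executed")
```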

27
White-box testing
28
Integration testing
  • Tests complete systems or subsystems composed of
    integrated components
  • Integration testing should be black-box testing
    with tests derived from the specification
  • Main difficulty is localising errors
  • Incremental integration testing reduces this
    problem

29
Incremental integration testing
30
Approaches to integration testing
  • Top-down testing
  • Start with the high-level system and integrate
    from the top down, replacing individual components
    with stubs where appropriate (see the sketch below)
  • Bottom-up testing
  • Integrate individual components in levels until
    the complete system is created
  • In practice, most integration involves a
    combination of these strategies
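A minimal sketch of the top-down case: a high-level component is integrated and tested while a lower-level component it depends on is replaced by a stub (all names here are illustrative):

```python
# High-level component under test; the real pricing component it calls
# has not been integrated yet.
def create_invoice(order, price_lookup):
    total = sum(price_lookup(item) * qty for item, qty in order.items())
    return {"items": order, "total": total}

# Stub standing in for the missing pricing component: it returns canned
# values so the high-level logic can be exercised first.
def price_lookup_stub(item):
    return {"widget": 2.0, "gadget": 5.0}.get(item, 0.0)

invoice = create_invoice({"widget": 3, "gadget": 1}, price_lookup_stub)
assert invoice["total"] == 11.0
print("top-down integration test passed using a stub")
```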

31
Top-down testing
32
Bottom-up testing
33
Testing approaches
  • Architectural validation
  • Top-down integration testing is better at
    discovering errors in the system architecture
  • System demonstration
  • Top-down integration testing allows a limited
    demonstration at an early stage in the
    development
  • Test implementation
  • Often easier with bottom-up integration testing
  • Test observation
  • Problems with both approaches. Extra code may be
    required to observe tests

34
Interface testing
  • Takes place when modules or sub-systems are
    integrated to create larger systems
  • Objectives are to detect faults due to interface
    errors or invalid assumptions about interfaces
    (see the sketch below)
  • Particularly important for object-oriented
    development as objects are defined by their
    interfaces
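A small sketch of the kind of fault interface testing targets: one module makes an invalid assumption about another's interface, here about the unit of a parameter (both functions are invented for the example):

```python
# Provider module: expects the timeout in milliseconds.
def connect(host, timeout_ms):
    if timeout_ms < 100:
        raise ValueError("timeout shorter than 100 ms is not allowed")
    return f"connected to {host} with a {timeout_ms} ms timeout"

# Calling module: wrongly assumes the timeout is given in seconds.
def open_session(host):
    return connect(host, timeout_ms=30)   # interface fault: 30 s passed as 30 ms

# Interface test: exercising the two modules together exposes the invalid
# assumption that testing each module in isolation would miss.
try:
    open_session("db.example.org")
    print("no interface fault observed")
except ValueError as exc:
    print("interface fault revealed:", exc)
```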

35
Stress testing
  • Exercises the system beyond its maximum design
    load; stressing the system often causes defects
    to come to light
  • Stressing the system also tests its failure
    behaviour: systems should not fail
    catastrophically, and stress testing checks for
    unacceptable loss of service or data (see the
    sketch below)
  • Particularly relevant to distributed systems,
    which can exhibit severe degradation as a
    network becomes overloaded
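A minimal stress-test sketch: push the system well past an assumed design load and check that it sheds the excess without losing or corrupting the work it accepted (the queue and the 1,000-request design load are invented):

```python
import queue

DESIGN_LOAD = 1000                       # assumed maximum design load
requests = queue.Queue(maxsize=DESIGN_LOAD)

accepted, rejected = 0, 0
for i in range(3 * DESIGN_LOAD):         # offer three times the design load
    try:
        requests.put_nowait(i)
        accepted += 1
    except queue.Full:
        rejected += 1                    # overload must be refused, not lost silently

# Failure-behaviour check: degraded service is acceptable, loss of
# accepted requests or data is not.
assert accepted == DESIGN_LOAD
assert rejected == 2 * DESIGN_LOAD
assert requests.qsize() == accepted
print(f"accepted={accepted}, rejected={rejected}: no accepted request was lost")
```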

36
When Can Testing and Validation Stop?
  • Only when the product has been irrevocably
    retired
  • Because it is a whole life-cycle process,
    testing and validation must be applied at each
    stage in the software process, including the
    maintenance/evolution phase.