CSE9020 / 1 - PowerPoint PPT Presentation

1
Software Testing and Quality Assurance
With thanks to Shonali Krishnaswamy and Sylvia Tucker
2
Reading and References
  • Software Testing - A Craftsman's Approach
    (1995), by Paul C. Jorgensen
  • The Complete Guide to Software Testing (1988),
    by Bill Hetzel

3
Presentation Outline
  • Introduction
  • Terminology
  • Levels of Testing
  • Types of Testing
  • Defect Fixing Strategies
  • Test Process
  • Bottom line

4
Testing and Quality
  • Hetzel: "Testing is the process of establishing
    confidence that a program or system does what it
    is supposed to do"
  • Quality &lt;=&gt; Meets requirements
  • Hetzel proposes that Quality is not tangible

5
Testing and Quality
  • The purpose of testing is to make quality
    visible (and hence tangible?)
  • Testing is the measurement of software quality
    (including inspection as a testing technique)

6
Terminology
  • Test Plan: A document describing the testing
    approach
  • Description of the system to be tested
  • Objectives of testing

7
Terminology - Test Plan Contents
  • Project team
  • Test environment
  • Resource requirements
  • Test Schedule

8
Terminology
  • Test Procedure: A document defining the steps
    required to carry out some part of the test plan
    - that is, execute a set of test cases
  • Test case name, objective
  • Description of input required
  • Description of expected output

9
Terminology - Test Procedure
  • Some aspects that need attention:
  • Commands or Macros to run the test case
  • Location and format of the actual output
  • Assumptions, Conditions

10
Terminology
  • Test Case: A specific set of test data and the
    corresponding expected results for a particular
    test objective
  • Invalid and valid test cases
  • Test Script: The code written to implement the
    test case
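A test script can be sketched in a few lines of Python. The function under test, `discount`, and its 10%-over-100 rule are illustrative assumptions, not part of the original slides; the point is that the script pairs named test data with its expected result:

```python
# Hypothetical function under test: applies a 10% discount to
# order totals strictly greater than 100.
def discount(total):
    return total * 0.9 if total > 100 else total

# A test case: name, objective, input data, and expected output together.
test_case = {
    "name": "discount-above-threshold",
    "objective": "verify the 10% discount applies above the 100 boundary",
    "input": 110,
    "expected": 99.0,
}

# The test script executes the case and compares actual to expected.
actual = discount(test_case["input"])
assert abs(actual - test_case["expected"]) < 1e-9, f"{test_case['name']} failed"
print(f"{test_case['name']}: PASS")
```

Keeping the comparison tolerance explicit (rather than exact float equality) is itself a small testing decision worth recording in the script.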

11
Levels of Testing
  • Unit Testing
  • Testing of individual components of a system
    (sometimes called Basic Functionality Testing)

12
Levels of Testing
  • System Testing
  • Testing in the wider scope - an integrated group
    of components and a matching sequence of
    operations

13
Levels of Testing
  • Acceptance Testing
  • Testing to determine if a system is ready for
    implementation
  • (or: will the user commit to using this?)

14
Types of Testing - Functional
  • For each component, test to confirm that the
    functional requirements have been satisfied.
  • A necessary part of all testing

15
Black Box Testing
  • The components being examined are treated as
    closed - not available to be inspected
  • Like when you buy a watch or a DVD player - the
    internals are a black box

16
Glass Box Testing
  • As opposed to black box testing
  • For each component, the internals are examined,
    and then a decision is made as to the best way of
    making up a detailed test specification

17
Boundary Value Analysis
  • Guidelines
  • Break up the system in to components that have to
    be tested
  • Determine the boundary values for these
    components
  • Design test cases based on the boundary values
  • Limitation Works well with independent variables
    (or components) with well defined boundaries
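The guidelines above can be sketched in Python. The five-value scheme (min, min+1, nominal, max-1, max) follows the classic boundary value analysis recipe; the function name and the percentage-field example are illustrative assumptions:

```python
def boundary_values(lo, hi):
    """Return the classic five boundary test values for an
    integer variable whose valid range is [lo, hi]."""
    nominal = (lo + hi) // 2  # a representative mid-range value
    return [lo, lo + 1, nominal, hi - 1, hi]

# Example: a percentage field that accepts 0..100.
print(boundary_values(0, 100))  # [0, 1, 50, 99, 100]
```

For several independent variables, each is driven through its five values while the others stay at nominal, which is exactly where the stated limitation bites: dependent variables need combined cases instead.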

18
Equivalence Classes
  • Motivation: completeness with minimal or no
    redundancy
  • The equivalence classes partition a set into
    disjoint subsets
  • Guidelines:
  • Identify equivalence classes
  • Design test cases for each equivalence class
  • Example: a variable can take only values from 0
    to 9 inclusive
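For the 0-to-9 example above, the input domain partitions into three classes: below 0 (invalid), 0..9 (valid), and above 9 (invalid). A minimal sketch in Python (the `classify` helper and class labels are assumptions made for illustration):

```python
def classify(x):
    """Partition the input domain of a variable that must be in 0..9."""
    if x < 0:
        return "invalid-low"
    if x > 9:
        return "invalid-high"
    return "valid"

# One representative test case per equivalence class gives
# completeness without redundancy:
cases = {-1: "invalid-low", 5: "valid", 10: "invalid-high"}
for value, expected in cases.items():
    assert classify(value) == expected
print("all equivalence-class cases pass")
```

Three cases cover the whole domain; testing 3, 4, and 5 as separate cases would add redundancy without adding coverage.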

19
Decision Tables
  • Condition/Action rules are presented in tabular
    form
  • Test cases: conditions are inputs, actions are
    outputs

                            1  2  3  4  5
    Fixed rate account      T  T  F  F  F
    Variable rate account   F  F  T  T  F
    Consumption < 100 KWH   T  F  T  F  -
    Consumption > 100 KWH   F  T  F  T  -
    Minimum monthly charge  X
    Schedule A billing         X  X
    Schedule B billing               X
    Other treatment                     X
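The decision table can be expressed directly as data, with each rule (column) yielding one test case. This is a sketch, not code from the slides; `None` stands for a "don't care" condition in rule 5, where the account type alone determines the action:

```python
# Each rule: (fixed rate, variable rate, consumption < 100 KWH) -> action.
RULES = [
    ((True,  False, True),  "Minimum monthly charge"),
    ((True,  False, False), "Schedule A billing"),
    ((False, True,  True),  "Schedule A billing"),
    ((False, True,  False), "Schedule B billing"),
    ((False, False, None),  "Other treatment"),  # consumption is don't-care
]

def billing_action(fixed, variable, under_100kwh):
    """Conditions are the inputs; the matched action is the expected output."""
    inputs = (fixed, variable, under_100kwh)
    for conditions, action in RULES:
        if all(c is None or c == i for c, i in zip(conditions, inputs)):
            return action
    raise ValueError("no rule matches this combination of conditions")

print(billing_action(True, False, True))    # Minimum monthly charge
print(billing_action(False, False, False))  # Other treatment
```

Walking every column of the table through `billing_action` gives a complete functional test of the billing logic.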

20
Automated vs. Manual Testing
  • Manual
  • Every test case is executed manually
  • Comparison of actual and expected is done
    manually

21
Automated vs. Manual Testing
  • Manual
  • Suitable for interactive tests
  • Tedious and time consuming
  • Simpler

22
Automated vs. Manual Testing
  • Automated
  • A test harness runs the test scripts
  • Analysis of the results is automated and the
    results are logged
  • Testing effort is reduced and may be more
    efficient in the longer term
  • Note: the actual test scripts are not generated
    automatically
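A test harness can be sketched in a few lines of Python. The suite of `(name, function, args, expected)` tuples and the toy functions under test are illustrative assumptions; the point is that execution, comparison, and logging all happen automatically:

```python
def run_harness(test_cases):
    """Run each test script, compare actual and expected output
    automatically, and log a PASS/FAIL verdict per case."""
    results = []
    for name, func, args, expected in test_cases:
        actual = func(*args)
        results.append((name, "PASS" if actual == expected else "FAIL"))
    return results

# A tiny suite: the test scripts themselves are still written by hand.
suite = [
    ("add-small", lambda a, b: a + b, (1, 2), 3),
    ("add-zero",  lambda a, b: a + b, (0, 0), 0),
]
for name, verdict in run_harness(suite):
    print(f"{name}: {verdict}")
```

Because the whole suite reruns at the cost of one command, this is where the later point about regression testing pays off.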

23
Defect Fixing Strategies
  • Document Defects
  • full and complete information so the defect can
    be reproduced
  • Pass it back to the developer
  • Fix
  • Document the fix
  • Pass it back to the tester
  • Re-test, run regression tests (this is where
    automated testing can be effective)

24
Test Process
  • Develop a test plan (and agree on it)
  • Specify the test procedure
  • Perform boundary value analysis, equivalence
    class partitioning, etc.

25
Test Process
  • Design and implement test cases
  • Run the tests
  • Analyse and document results
  • Fix Defects, Run the tests, compare the results,
    and agree that the fixes have worked

26
Testing Web Sites
  • Web sites are essentially client/server
    applications
  • Consider:
  • Interaction between .html pages
  • internet connections
  • firewalls
  • applications that run in web pages (client)
  • applets
  • javascript
  • .net approach

27
Testing Web Sites
  • applications that run on the server
  • cgi scripts
  • database interfaces
  • logging applications
  • dynamic page generators
  • variety of servers, browsers (versions)
  • expected loads on the server, expected
    performance on client, and server
  • investigate web testing tools
  • http://www.softwareqatest.com/qatweb1.html

28
Testing Web Sites
  • Who is the target audience?
  • Browsers, connection speed
  • intranet (fast) or internet (may be sloooooow!)
  • Usability guidelines
  • Pages should be 3-5 screens max., otherwise
    provide internal links within the page
  • Use a consistent layout, navigation
  • Make pages browser-independent, or generate for a
    specific browser
  • Include page owner, revision date, contact link
  • NO dead-end pages!

29
Testing Web Sites
  • On-going
  • Performance Testing
  • Link Testing
  • Security Testing
  • HTML validation (browsers change!)
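Link testing in particular is easy to automate. A minimal sketch using only the Python standard library (the sample HTML and `example.com` base URL are placeholders; a real checker would also fetch each resolved URL and record its HTTP status):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collect the href target of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

parser = LinkCollector()
parser.feed('<a href="/about.html">About</a> <a href="contact.html">Contact</a>')

# Resolve relative links against the page they appeared on.
base = "http://www.example.com/index.html"
print([urljoin(base, link) for link in parser.links])
```

Run on a schedule, a checker like this catches the dead links that creep in as a site is maintained, which is why link testing is listed as an on-going activity.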

30
In Summary...
  • Why?
  • Who?
  • When?
  • What?
  • How?

31
Bottom Line
  • Testing is expensive
  • Testing is necessary
  • You are never going to test everything
  • use Risk Analysis to determine where testing
    should be focused

32
Bottom Line
  • Your Case Study Project is not complete until
    your client has completed the Acceptance Testing
    to their satisfaction and signed off the project!

33
Bottom Line
  • Most essential: make sure that both the system
    and the group members maintain version control
  • (link Software Testing and Quality Assurance with
    Risks and Ethics)

34
More Maintenance
  • Pilot Entry
  • Left inside main tyre almost needs replacement
  • Mechanic Entry
  • Almost replaced left inside main tyre
  • Pilot Entry
  • Test flight OK, except autoland very rough
  • Mechanic entry
  • Autoland not installed on this aircraft