Software Testing Strategies and Techniques

1
Software Testing Strategies and Techniques
2
But First
  • Where have we been in this class
  • gathering requirements
  • defining the work to be done
  • estimating time and effort
  • scheduling work
  • analyzing risks
  • measuring our work
  • Where are we going next
  • testing our product
  • managing change

3
Starter Questions
  • What is the purpose of testing software?
  • Can we be sure the code is 100% correct?
  • How much "correctness" should we shoot for?
  • What needs to be tested?
  • How do we test those parts/aspects?
  • Given a finite schedule, what should a tester
    spend their time on?
  • Given a finite budget, what can a manager do to
    improve testing effectiveness?

4
Testing Terms
  • Verification - is the product correct?
  • Validation - is it the correct product?

5
Laws of Testing
  • The best person to test your code is someone
    else.
  • A good test is one that finds an error.
  • Testing cannot prove the absence of errors.
  • Complete test coverage is impossible, so
    concentrate on problem areas.
  • It costs a lot less to remove bugs early.

6
Testing Stages
  • Unit Testing
  • modules of code
  • Integration Testing
  • design
  • Validation Testing
  • requirements
  • System Testing
  • system engineering

7
Unit Testing Techniques
  • White Box Testing
  • testing a module of code based on the source code
  • example - basis path testing
  • Black Box Testing
  • testing a module based just on what it is
    supposed to do
  • example - boundary value analysis

8
Code Coverage
  • Types of "Code Coverage" (contrasted in the sketch after this list)
  • Statement coverage
  • each line of code has been tested
  • Conditional coverage
  • each true/false decision has been tested
  • Path coverage
  • each unique path through the code has been tested
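
A minimal C sketch (the function and the tests are invented for
illustration, not taken from the slides) of why the coverage types
differ: a single call with a non-NULL pointer executes every line,
yet the false branch of the if is never taken.

    int reset(int *p)
    {
        if (p != NULL)      /* conditional: has a true and a false branch */
            *p = 0;
        return *p;          /* dereferences p unconditionally */
    }

    /* reset(&x) alone reaches every statement (100% statement coverage).
       Conditional coverage also requires a call with p == NULL, and that
       extra test is exactly the one that exposes the crash at return *p. */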

9
Level of Coverage
  • Coverage Types are Apples and Oranges
  • 100% statement coverage ≠ 100% path coverage
  • 50% path coverage is probably
    better than 90% statement coverage
  • 100% path coverage is usually impractical
  • N if-statements create up to 2^N paths (illustrated below)
  • loops are worse
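
To make the 2^N figure concrete, here is a deliberately simple,
invented function: each independent if-statement doubles the number
of paths, so three of them already allow up to 2^3 = 8 paths, even
though one test covers every statement.

    int clamp_sum(int a, int b, int c)
    {
        if (a < 0) a = 0;   /* 2 ways through       */
        if (b < 0) b = 0;   /* x 2 = 4 ways through */
        if (c < 0) c = 0;   /* x 2 = 8 ways through */
        return a + b + c;
    }

    /* clamp_sum(-1, -1, -1) gives 100% statement coverage on its own,
       but exercising every path needs 8 tests; wrap the ifs in a loop
       and the number of paths is no longer even bounded. */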

10
Basis Path Testing
  • Objective is to test each conditional statement
  • as both true and false
  • Draw a Flow Graph
  • Determine the Cyclomatic Complexity
  • count the regions, or
  • CC = E - N + 2
  • Max number of tests = CC
  • Derive a basis set of independent paths
  • Generate data to drive each path

11
Example
void delete(int val, struct node **head)
{
    struct node *temp, *prev;
    prev = NULL;  temp = *head;
    while (temp != NULL) {                  /* 1 */
        if (temp->value == val)             /* 2 */
            break;                          /* 3 */
        prev = temp;                        /* 4 */
        temp = temp->next;
    }
    if (prev == NULL)                       /* 5 */
        *head = (*head)->next;              /* 6 */
    else
        prev->next = prev->next->next;      /* 7 */
}                                           /* 8 */
12
Example Continued
  • Complexity = 4
  • Basis Paths
  • 1-2-4-1-5-7-8
  • 1-2-3-5-7-8
  • 1-2-3-5-6-8
  • 1-5-6-8
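
As a cross-check on the complexity figure above (using one consistent
way to draw the flow graph for this code): the graph has N = 8 nodes
(1 through 8) and E = 10 edges (1-2, 2-3, 2-4, 4-1, 1-5, 3-5, 5-6,
5-7, 6-8, 7-8), so CC = E - N + 2 = 10 - 8 + 2 = 4, matching the four
basis paths listed.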

13
Example Continued
void delete(int val, struct node **head)
{
    struct node *temp, *prev;
    prev = NULL;  temp = *head;
    while (temp != NULL) {                  /* 1 */
        if (temp->value == val)             /* 2 */
            break;                          /* 3 */
        prev = temp;                        /* 4 */
        temp = temp->next;
    }
    if (prev == NULL)                       /* 5 */
        *head = (*head)->next;              /* 6 */
    else
        prev->next = prev->next->next;      /* 7 */
}                                           /* 8 */
Path 1-2-4-1-5-7-8:
  1 to 2: non-empty list;  2 to 4: deleting a non-head node;  5 to 7: deleting a non-head node
Path 1-2-3-5-7-8:
  1, 2, 3: deleting the head;  5 to 7: deleting a non-head node
Path 1-2-3-5-6-8:
  1, 2, 3: deleting the head;  5 to 6: deleting the head
Path 1-5-6-8:
  1 to 5: empty list;  5 to 6: deleting the head
14
Generating Paths
  • If a path is not possible, then re-generate the
    list of basis paths
  • An algorithm to generate only feasible paths is
    impossible (provably undecidable)

15
Boundary Value Analysis
  • Errors usually happen at the ends, not in the
    middle
  • Example (see the test driver sketch after this list)
  • delete the head, size = 1
  • delete the head, size > 1
  • delete from the end of the list, size > 1
  • delete when size = 0
  • delete when size ≠ 0, value not in the list
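
A minimal test driver (a sketch only: the struct layout, the
make_list() helper, and main() are assumptions made here for
illustration, not part of the slides) that turns the boundary cases
above into concrete calls to delete(). With the delete() shown
earlier, the last two cases end up dereferencing a NULL pointer,
which is exactly the kind of boundary error this technique is meant
to surface.

    #include <stdlib.h>

    struct node { int value; struct node *next; };
    void delete(int val, struct node **head);      /* routine under test */

    /* helper (invented here): build a list from an array, front to back */
    static struct node *make_list(const int *vals, int n)
    {
        struct node *head = NULL;
        while (n-- > 0) {
            struct node *p = malloc(sizeof *p);
            p->value = vals[n];
            p->next = head;
            head = p;
        }
        return head;
    }

    int main(void)
    {
        int one[] = { 5 }, three[] = { 1, 2, 3 };
        struct node *list;

        list = make_list(one, 1);    delete(5, &list);   /* delete the head, size = 1    */
        list = make_list(three, 3);  delete(1, &list);   /* delete the head, size > 1    */
        list = make_list(three, 3);  delete(3, &list);   /* delete the end, size > 1     */
        list = NULL;                 delete(9, &list);   /* delete when size = 0         */
        list = make_list(three, 3);  delete(9, &list);   /* size != 0, value not in list */
        return 0;
    }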

16
Example
  • Create some test cases to verify the new "add
    alarm" feature.

17
Homework Results
12:00am  12:00pm  11:59pm
  1. 12:00am
  2. 12:01am
  3. 11:59am
  4. 12:00pm
  5. 12:01pm
  6. 1:00pm
  7. 12:59pm

Most common error
Why?  11:59 → 11:59am   12:00 → 12:00pm   12:01 → 12:01pm   13:00 → 1:00pm
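
One way to see why the noon boundary is the usual stumbling block: in
a 24-hour-to-12-hour conversion (a sketch with an invented function
name, not code from the homework), the am/pm suffix flips at hour 12
while the displayed hour only wraps at hour 13.

    #include <stdio.h>

    /* hour is 0-23, minute is 0-59; prints the 12-hour form */
    void print_12h(int hour, int minute)
    {
        const char *suffix = (hour < 12) ? "am" : "pm";
        int h12 = hour % 12;
        if (h12 == 0)
            h12 = 12;        /* 0:xx -> 12:xxam and 12:xx -> 12:xxpm */
        printf("%d:%02d%s\n", h12, minute, suffix);
    }

    /* print_12h(11,59) -> 11:59am, print_12h(12,0) -> 12:00pm,
       print_12h(12,1) -> 12:01pm, print_12h(13,0) -> 1:00pm */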
18
Integration Testing
Objective is to test the interfaces between
modules
  • Bottom-Up
  • requires lots of drivers
  • gives real answers early
  • big errors are not found until the end
  • Top-Down
  • requires lots of stubs
  • gives fake answers until the very end
  • big errors found early
  • Sandwich
  • Depth-First

(Diagram: module call hierarchy with top-level module A and lower-level modules B through G)
Integration Testing necessitates Regression
Testing
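
A tiny illustration of the stub/driver vocabulary above (the module
names are invented for this sketch): top-down integration plugs in a
stub that fakes a missing lower-level module's answer, while
bottom-up integration relies on throw-away drivers that call modules
which have no real caller yet.

    /* stub: stands in for a tax module that is not written yet
       (top-down integration), so it returns a canned, fake answer */
    int tax_for(int cents) { (void)cents; return 0; }

    /* higher-level module that depends on tax_for() */
    int total_for(int cents) { return cents + tax_for(cents); }

    /* driver: throw-away caller that exercises the modules in place of
       the real top of the program (the role drivers play in bottom-up
       integration) */
    int main(void)
    {
        return total_for(1000) == 1000 ? 0 : 1;  /* passes only while the stub fakes 0 tax */
    }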
19
When do we stop unit and integration testing?
  • Answer: when the cost of finding bugs becomes
    greater than the cost of removing bugs.

(Chart: number of bugs found per day, plotted against time)
20
Validation Testing
Objective is to test the Conformity to
Requirements
  • Alpha Testing
  • Beta Testing

21
Usability Testing
  • Interface Design Heuristics
  • Cognitive Walkthroughs

22
System Testing
  • Recovery Testing
  • Security Testing
  • Stress Testing
  • Performance Testing

23
Types of Testing Tools
  • Regression Testing
  • retest to find side-effects
  • Database Testing
  • test the queries
  • build tests based on tables and fields
  • load test the database
  • Load Testing
  • simulate a lot of users - stress test
  • Functional Testing
  • record a user's actions
  • try lots of random user actions
  • build test cases based on Use Case Scenarios
  • Testing Web Applications
  • Unit Testing

24
Example - load testing
http://www.gamcom.com/testingdeliveryexample.htm
25
Example - functional testing - WinRunner
  • To create a test, Mercury WinRunner simply
    records a typical business process by emulating
    user actions, such as ordering an item or opening
    a vendor account. During recording, you can
    directly edit generated scripts to meet the most
    complex test requirements.
  • Next, testers can add checkpoints, which compare
    expected and actual outcomes from the test run.
    Mercury WinRunner offers a variety of
    checkpoints, including text, GUI, bitmap, and Web
    links. It can also verify database values to
    ensure transaction accuracy and database
    integrity, highlighting records that have been
    updated, modified, deleted, and inserted.
  • With a few mouse clicks, Mercury WinRunner's
    DataDriver Wizard lets you convert a recorded
    business process into a data-driven test that
    reflects the unique, real-life actions of
    multiple users. For further test enhancement, the
    Function Generator presents a quick and reliable
    way to program tests, while the Virtual Object
    Wizard enables you to teach Mercury WinRunner to
    recognize, record, and replay any unknown or
    custom object.
  • As Mercury WinRunner executes tests, it operates
    the application automatically, as though a real
    user were performing each step in the business
    process. If test execution takes place after
    hours or in the absence of a QA engineer, Mercury
    WinRunner's Recovery Manager and Exception
    Handling mechanism automatically troubleshoot
    unexpected events, errors, and application
    crashes to ensure smooth test completion.

26
Example - unit testing java code
  • Teams that have tried manual creation of unit
    tests will find that the automation provided by
    Agitator makes developer testing practical for
    the first time. Organizations that have deployed
    automated solutions for requirements management,
    configuration management, system, load, and
    performance testing will find that Agitator is
    the perfect complement to deliver improved
    software quality and reduced overall software
    lifecycle costs.
  • It is said that an effort is not worth doing if
    it cannot be measured. Developer Testing is no
    different. For successful application of
    Developer Testing, a total of 20-40% of the time
    dedicated to the project will be spent on the
    creation and execution of unit tests. It is
    therefore critical that teams can evaluate that
    they are allocating resources in the best
    possible way. The Agitar Management Dashboard is
    the first of its kind, created specifically to
    monitor and manage Developer Testing efforts.

27
Rest of Semester
  • Nov 6 - Configuration Management
  • Nov 13 - Software Quality Assurance
  • Nov 20 - Quality Assurance Programs
  • Nov 27 - Student Presentations