Title: Testing Plan
1. Testing Plan
2. The Goal of Test Planning
- The IEEE Standard 829-1998:
- To prescribe the scope, approach, resources, and schedule of the testing activities. To identify the items being tested, the features to be tested, the testing tasks to be performed, the personnel responsible for each task, and the risks associated with the plan.
3. The Goal of Test Planning
- To establish the list of tasks which, if performed, will identify all of the requirements that have not been met in the software. The main work product is the test plan.
4. Test Methodology
- Test Plan
- Test Design
- Test Implementation
- Results Analysis and Documentation
5. Test Plan
- The test plan is a by-product of the detailed planning process that's undertaken to create it. It's the planning process that matters, not the resulting document.
6. (cont.) Test Plan
- The test plan documents the overall approach to the test. In many ways, the test plan serves as a summary of the test activities that will be performed.
- It shows how the tests will be organized, and outlines all of the tester's needs that must be met in order to properly carry out the test.
- The test plan should be inspected by members of the engineering team and a senior manager.
7. (cont.) Test Plan
- The initial test plan is abstract, and the final test plan is concrete (fuzzy to focused).
- The initial test plan contains high-level ideas about testing the system without getting into the details of exact test cases.
- The most important test cases come from the requirements of the system.
8. Steps for the Test Plan
- Plan the test phases
- Define the test strategy
- Plan the resource requirements
- Distribute tester assignments
- Plan the test schedule
9. Test Phase
- Based on the proposed development model, decide whether unique phases, or stages, of testing should be performed over the course of the project.
- In the code-and-fix model, there's probably only one test phase: test until someone yells "stop."
- In the waterfall and spiral models, there can be several test phases, from examining the product spec to acceptance testing.
10. (cont.) Test Phase
- Each phase must have criteria defined for it that objectively and absolutely declare when the phase is over and the next one has begun.
11. Test Strategy
- The test strategy describes the approach that the test team will use to test the software, both overall and in each phase.
- When you are presented with a product spec:
- You'll need to decide whether it is better to use black-box or white-box testing.
- If you mix them, when will you apply each, and to which parts?
- Test manually or use tools? If using tools, develop the tools or purchase them?
12. Resource Requirement
- The process of deciding what's necessary to accomplish the testing strategy.
- Everything that could possibly be used for testing over the course of the project needs to be considered.
13. (cont.) Resource Requirement
- Examples:
- People: how many, what experience, what expertise? Should they be full-time, part-time, contract, or student?
- Equipment: computers, test hardware, printers, tools.
- Office and lab space: how big? How will they be arranged?
- Software: word processors, databases, custom tools. What will be purchased? What needs to be written?
- Outsourcing companies: will they be used? What criteria? Cost?
14. Tester Assignments
- Break out the individual tester assignments.
- The inter-group responsibilities discussed earlier dealt with what functional group (management, test, programmers, and so on) is responsible for what high-level task.
- Identify the testers responsible for each area of the software and for each testable feature.
15. Example of Tester Assignments
16. Test Schedule
- The test schedule takes all the information presented so far and maps it onto the overall project schedule.
- This is a critical stage, because a few highly desired features that were thought to be easy to design and code may turn out to be very time-consuming to test.
- Gantt chart (tools: Microsoft Project).
17. (cont.) Test Schedule
- An important consideration in test planning is that the amount of test work typically is not distributed evenly over the entire product development cycle.
- Some testing occurs early, in the form of spec and code reviews, tool development, etc., but the number of testing tasks, the number of people, and the amount of time spent testing often increase over the course of the project.
18. Example of Test Schedule

    Testing Task          Date
    Test Plan Complete    05/03/2008
    Test Cases Complete   01/06/2008
    Test Pass 1           15/06/2008 to 01/08/2008
    Test Pass 2           15/08/2008 to 01/10/2008
    Test Pass 3           15/10/2008 to 15/11/2008
19. Test Design
- Definition from IEEE 829:
- The test design specification refines the test approach (defined in the test plan) and identifies the features to be covered by the design and its associated tests. It also identifies the test cases and test procedures, if any, required to accomplish the testing, and specifies the feature pass/fail criteria.
20. (cont.) Test Design
- Based on the IEEE 829 standard, a test design spec should include:
- Identifiers
- Features to be tested
- Approach
- Test case identification
- Pass/fail criteria
21. Identifiers
- A unique identifier that can be used to reference and locate the test design spec.
- The spec should also reference the overall test plan and contain pointers to any other plans or specs that it references.
22. Features to be tested
- A description of the software features covered by the test design spec.
- Examples: the addition function of a calculator, font size selection and display in WordPad, and video card configuration testing of QuickTime.
23. Approach
- A description of the general approach that will be used to test the features.
- It should expand on the approach, if any, listed in the test plan, describe the techniques to be used, and explain how the results will be verified.
- Example: A testing tool will be developed to sequentially load and save pre-built data files of various sizes. The number of data files, their sizes, and the data they contain will be determined through black-box techniques and supplemented with white-box examples from the programmers. A pass or fail will be determined by comparing the saved file bit-for-bit against the original using a file-compare tool (see the sketch below).
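- As a rough sketch of that example approach in Python: the product.load and product.save calls are hypothetical stand-ins for however the tool would actually drive the software under test.

    import filecmp
    import tempfile
    from pathlib import Path

    def round_trip_passes(product, original: Path) -> bool:
        """Load a pre-built data file, save it back out, and compare the
        saved copy bit-for-bit against the original."""
        saved = Path(tempfile.mkdtemp()) / "saved.dat"
        product.load(original)   # hypothetical driver API for the product under test
        product.save(saved)
        # shallow=False forces a byte-by-byte comparison rather than a stat() check
        return filecmp.cmp(original, saved, shallow=False)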
24. Test case identification
- A high-level description of, and references to, the specific test cases that will be used to check the feature.
- It should list the selected equivalence partitions and provide references to the test cases and test procedures used to run them.
25. (cont.) Test case identification
- Example (see the sketch after this list):
- Check the highest possible value: Test case ID 15326
- Check the lowest possible value: Test case ID 15327
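- A minimal sketch of how those two boundary cases might be automated, assuming a 32-bit input range and a trivial add function, neither of which the slides specify:

    import pytest

    def add(a, b):
        return a + b           # stand-in for the feature under test

    MAX_VALUE = 2**31 - 1      # assumed input range; the slides do not give one
    MIN_VALUE = -2**31

    # Each case carries the test case ID from the design spec, so a result
    # can be traced back to the spec that called for it (the ID shows up
    # in the generated test name).
    @pytest.mark.parametrize("case_id, operand", [
        ("15326", MAX_VALUE),  # check the highest possible value
        ("15327", MIN_VALUE),  # check the lowest possible value
    ])
    def test_boundary_values(case_id, operand):
        assert add(operand, 0) == operand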
26. (cont.) Test case identification
- It's important that the actual test case values aren't defined in this section.
- For someone reviewing the test design spec for proper test coverage, a description of the equivalence partitions is much more useful than the specific values themselves.
27. Pass/Fail Criteria
- Describes exactly what constitutes a pass and a fail of the tested feature.
- What is acceptable and what is not?
- Simple: a pass is when all the test cases run without finding a bug.
- Fuzzy: a failure is when 10 percent or more of the test cases fail (see the sketch below).
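- A tiny sketch of evaluating such a fuzzy criterion; the 10 percent threshold follows the slide, and results is just a list of booleans standing in for real test outcomes.

    def verdict(results, fail_threshold=0.10):
        """Fuzzy criterion: the feature fails when the fraction of failing
        test cases reaches the threshold."""
        failed = sum(1 for passed in results if not passed)
        return "FAIL" if failed / len(results) >= fail_threshold else "PASS"

    print(verdict([True] * 95 + [False] * 5))    # PASS: 5% of cases failed
    print(verdict([True] * 85 + [False] * 15))   # FAIL: 15% of cases failed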
28. Test Case Planning: Why?
- Organization
- Even a small software project may have thousands of test cases.
- The test cases may have been created by several testers over the course of several months or years.
- Proper planning will organize them so that all the testers and other project team members can review and use them effectively.
29. (cont.) Test Case Planning: Why?
- Repeatability
- The same test cases can be used several times: to look for new bugs and to make sure old ones got fixed.
- Without test case planning, it would be impossible to know which test cases were last run and exactly how they were run, so that the exact tests can be repeated.
30. (cont.) Test Case Planning: Why?
- Tracking (see the sketch after this list)
- How many test cases did you plan to run?
- How many did you run on the last software release?
- How many passed and how many failed?
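- Those three questions reduce to simple bookkeeping once each run is recorded; a sketch with made-up case IDs and outcomes:

    from collections import Counter

    planned = {"15326", "15327", "15328", "15329"}
    # One (case_id, outcome) record per executed test case -- illustrative data
    executed = [("15326", "pass"), ("15327", "fail"), ("15328", "pass")]

    tally = Counter(outcome for _, outcome in executed)
    run = {case_id for case_id, _ in executed}
    print(f"planned: {len(planned)}  run: {len(run)}  "
          f"passed: {tally['pass']}  failed: {tally['fail']}  "
          f"not run: {len(planned - run)}")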
31. (cont.) Test Case Planning: Why?
- Proof of testing (or not testing)
- In high-risk industries, the software test team must prove that it did run the tests that it planned to run.
- It could actually be illegal, and dangerous, to release software in which even a few test cases were skipped.
- Proper test case planning and tracking provides a means for proving what was tested.
32. Test Case - Definition
- IEEE 829:
- The test case specification documents the actual values used for input along with the anticipated outputs. A test case also identifies any constraints on the test procedures resulting from use of that specific test case.
33. Test Case Specs
- The IEEE 829 standard lists the important information that should be included in the test case specs:
- Identifiers
- Test item
- Input specification
- Output specification
- Environmental needs
- Special procedural requirements
- Intercase dependencies
34. (cont.) Test Case Specs
- Identifiers
- A unique identifier that is referenced by the test design specs and the test procedure specs.
- Test item
- Describes the detailed feature, code module, etc. that's being tested.
- Provides references to the product specs or other design docs on which the test case was based.
35. (cont.) Test Case Specs
- Input specification
- Lists all the inputs or conditions given to the software to execute the test case.
- If you are testing a file-based product, it would be the name of the file and a description of its contents.
- Output specification
- The results you expect from executing the test case.
- Did all the contents of the file load as expected?
36. (cont.) Test Case Specs
- Environmental needs
- The hardware, software, test tools, facilities, staff, etc. that are necessary to run the test case.
- Special procedural requirements
- Describes anything unusual that must be done to perform the test.
- Example: testing a nuclear power plant?
37. (cont.) Test Case Specs
- Intercase dependencies
- If the test case depends on another test case or
might be affected by another, that information
should be included here.
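- Taken together, the fields above map naturally onto a record type; a minimal sketch, with all names and the sample values invented for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class TestCaseSpec:
        """The IEEE 829 test case spec fields listed on the preceding slides."""
        identifier: str                 # unique ID referenced by design/procedure specs
        test_item: str                  # feature or code module being tested
        input_spec: str                 # inputs/conditions given to the software
        output_spec: str                # results expected from executing the case
        environmental_needs: str = ""   # hardware, software, tools, facilities, staff
        special_procedures: str = ""    # anything unusual needed to perform the test
        intercase_dependencies: list = field(default_factory=list)

    case = TestCaseSpec(
        identifier="WP0001",
        test_item="WordPad printing",
        input_spec="One-page document, Canon BJC-700, B/W mode, Text option",
        output_spec="Document prints completely with the expected layout",
    )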
38. (cont.) Test Case Specs
- Example: test cases presented in the form of a matrix or table (see the sketch after the table):

    Test Case ID   Printer Mfg   Model         Mode    Option
    WP0001         Canon         BJC-700       B/W     Text
    WP0002         HP            LaserJet IV   Color   Auto
    ...
    WP00010        HP            LaserJet IV   High    Draft
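- Such matrices are often generated as a cross-product of the configuration variables; a sketch, assuming (the slide doesn't say) that the table covers every printer/mode/option combination:

    import itertools

    printers = [("Canon", "BJC-700"), ("HP", "LaserJet IV")]
    modes = ["B/W", "Color", "High"]
    options = ["Text", "Auto", "Draft"]

    # Every printer/mode/option combination becomes one row of the matrix.
    for n, ((mfg, model), mode, option) in enumerate(
            itertools.product(printers, modes, options), start=1):
        print(f"WP{n:04d}  {mfg:<6} {model:<12} {mode:<6} {option}")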
39. Test Procedure / Implementation
- Execute the test cases.
- IEEE 829 defines the test procedure specification: it identifies all the steps required to operate the system and exercise the specified test cases in order to implement the associated test design.
40. Test Procedures
- How to perform the test cases.
- Information that needs to be defined:
- Identifiers
- Purpose
- Special requirements
- Procedure steps
41. Test Procedure
- Identifiers
- A unique identifier that ties the test procedure to the associated test cases and test design.
- Purpose
- The purpose of the procedure and references to the test cases that it will execute.
- Special requirements
- Other procedures, special testing skills, or special equipment needed to run the procedure.
42. Test Procedure
- Procedure steps
- A detailed description of how the tests are to be run:
- Log: tells how and by what method the results and observations will be recorded.
- Setup: how to prepare for the test.
- Start: the steps used to start the test.
- Procedure: the steps used to run the test.
- Measure: how the results are to be determined.
43. (cont.) Procedure Steps
- Shut down: the steps for suspending the test for unexpected reasons.
- Restart: how to pick up the test at a certain point if there's a failure or after shutting down.
- Stop: the steps for an orderly halt to the test.
- Wrap up: how to restore the environment to its pre-test condition.
- Contingencies: what to do if things don't go as planned (an automated analogy is sketched after this list).
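- These steps are written for manual execution, but they map loosely onto automated test fixtures; a minimal unittest sketch of the analogy, with all names and log targets invented:

    import logging
    import unittest

    logging.basicConfig(filename="testlog.txt", level=logging.INFO)  # Log step

    class AdditionProcedure(unittest.TestCase):
        def setUp(self):
            # Setup/Start: prepare the environment and launch the product
            logging.info("setup: launching the product under test")

        def test_addition_cases(self):
            # Procedure: run the test cases; Measure: assert on the results
            self.assertEqual(1 + 1, 2)

        def tearDown(self):
            # Stop/Wrap up: orderly halt, restore the pre-test environment
            logging.info("wrap up: restoring the environment")

    if __name__ == "__main__":
        unittest.main()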
44. Example of test procedure
- Identifier: WinCalcProc98.1872
- Purpose: This procedure describes the steps necessary to execute the Addition function test cases through ...
- Special Requirements: No special requirements (individual test cases).
- Procedure Steps:
- Log: The tester will use WordPad with the Testlog template ...
- Setup: The tester must install a clean copy of ...
- Start: 1) Boot up Windows 98. 2) Click the Start button. 3) ...
- Procedure: For each test case identified above, enter the test input data using the keyboard ...
- Measure: ...
45. Result Analysis and Documentation
- Fundamental principles for reporting a bug:
- Report bugs as soon as possible.
- Effectively describe the bugs:
- Minimal, singular, obvious, general, etc.
- Be non-judgmental in reporting bugs:
- Professionally written against the product, not the person; state only facts.
- Follow up on your bug reports:
- Make sure each is reported properly and given the attention it needs to be addressed. Get them fixed.
46. Not all bugs are created equally
- A bug that corrupts a user's data is more severe than one that's a simple misspelling.
- But what if the data corruption can occur only in an instance so rare that no user is ever likely to see it, while the misspelling causes every user to have problems installing the software?
- Which one is more important to fix?
47. How to classify bugs?
- General concepts used:
- Severity
- Indicates how bad the bug is: the likelihood and the degree of impact when the user encounters the bug.
- Priority
- Indicates how much emphasis should be placed on fixing the bug and the urgency of making the fix.
48. Example
- Severity
- 1. System crash, data loss, data corruption, security breach.
- 2. Operational error, wrong results, loss of functionality.
- 3. Minor problem, misspelling, UI layout, rare occurrence.
- 4. Suggestion.
- Priority
- 1. Immediate fix; blocks further testing; very visible.
- 2. Must fix before the product is released.
- 3. Should fix when time permits.
- 4. Would like to fix, but the product can be released as is.
- Data corruption: Severity 1, Priority 3. Misspelling: Severity 3, Priority 2 (sketched below).
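- A small sketch of the two scales as code, showing how the slide's closing examples vary severity and priority independently; the level names are invented for illustration.

    from enum import IntEnum

    class Severity(IntEnum):
        CRASH_OR_DATA_LOSS = 1   # system crash, data loss, corruption, security breach
        FUNCTIONAL_ERROR = 2     # operational error, wrong results, lost functionality
        MINOR = 3                # misspelling, UI layout, rare occurrence
        SUGGESTION = 4

    class Priority(IntEnum):
        IMMEDIATE = 1            # blocks further testing, very visible
        BEFORE_RELEASE = 2       # must fix before the product ships
        WHEN_TIME_PERMITS = 3
        NICE_TO_HAVE = 4         # product can be released as is

    bugs = {
        "rare data corruption": (Severity.CRASH_OR_DATA_LOSS, Priority.WHEN_TIME_PERMITS),
        "misspelling in the installer": (Severity.MINOR, Priority.BEFORE_RELEASE),
    }
    for name, (sev, pri) in bugs.items():
        print(f"{name}: severity {sev.value}, priority {pri.value}")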