Title: CS 432 Object-Oriented Analysis and Design
1 CS 432 Object-Oriented Analysis and Design
2 Software Testing
Testing is the process of exercising a program
with the specific intent of finding errors prior
to delivery to the end user.
You can't test in quality. If it's not there
before you begin testing, it won't be there when
you're finished testing.
3 Testing
- Testing is a process of identifying defects
- Develop test cases and test data
- A test case is a formal description of
  - A starting state
  - One or more events to which the software must respond
  - The expected response or ending state
- Test data is a set of starting states and events used to test a module, group of modules, or entire system (a test case sketch follows)
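For illustration, here is a minimal sketch of a test case in Java for a hypothetical BankAccount class; the class and its methods are assumed for this example, not taken from the slides. The starting state, the event, and the expected ending state are marked explicitly.

// A test case made concrete: starting state, event, expected response.
public class WithdrawTestCase {
    public static void main(String[] args) {
        // Starting state: an account holding a balance of 100.
        BankAccount account = new BankAccount(100);
        // Event to which the software must respond: a withdrawal of 30.
        account.withdraw(30);
        // Expected response / ending state: a balance of 70.
        if (account.getBalance() == 70) {
            System.out.println("PASS");
        } else {
            System.err.println("FAIL: expected 70, got " + account.getBalance());
        }
    }
}

// Hypothetical class under test, included so the sketch is self-contained.
class BankAccount {
    private int balance;
    BankAccount(int balance) { this.balance = balance; }
    void withdraw(int amount) { balance -= amount; }
    int getBalance() { return balance; }
}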
4 Testing discipline activities
5 Testing is Use Case Driven
- Terms associated with testing
  - Verification: are we building the system right?
  - Validation: are we building the right product?
- Testing the Use Case Model is a validation task that determines whether the customer's needs are being met by the application.
- Testing of the remaining models is a verification task that determines whether the application correctly satisfies the requirements specified in the Use Case Model.
7 Testing Workflow
- Defines the primary activities required to develop the Test Model
- Develop Test Plan
  - Outline the testing strategies to be used, including model tests
  - Estimate the resources required to complete the testing activities
  - Schedule the testing as part of the overall project plan
8 Testing Workflow
- Design Model Tests
  - Determine strategies for testing a model's correctness
  - Determine strategies for testing a model's completeness
  - Determine strategies for testing a model's consistency
- Design Test
  - Identify and describe test cases for each build
  - Identify and structure test procedures, specifying how to perform or carry out the test cases that test the executable application components and their integration
9 Testing Workflow
- Implement Test: automate test procedures by creating test components, i.e., executable components that automate all or part of the test procedures, if possible
- Perform Tests: execute the test cases as part of an iteration release or implementation build, including
  - Unit testing: the white-box technique in which programmers verify that the software they implemented satisfies its design requirements (see Section 6.3)
  - Regression testing: verifies that changes to one part of an application do not break other previously working parts/components of the application (a sketch follows the list of test types below)
10 Testing Workflow
  - Integration testing: verifies the communication and cooperation among major system components that have been previously unit tested
  - User acceptance testing: tests whether the users can actually use the system and what difficulties they encounter when doing so (often a Cognitive Psychology issue)
  - Stress testing: ensures the system can handle significant loads, e.g., 5,000 simultaneous users
  - Performance testing: ensures the system meets performance objectives, e.g., response within 3 seconds
- Evaluate Test: evaluate the results of the tests and the testing strategy.
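As a sketch of the regression testing described above, suppose a hypothetical leap-year routine once misclassified years divisible by 100; keeping the input that exposed the defect as a permanent test ensures later changes cannot silently reintroduce it. The example is illustrative, not from the slides.

// Regression-test sketch: pins down a previously fixed defect.
public class LeapYearRegressionTest {
    static boolean isLeapYear(int year) {
        // Fixed logic: divisible by 4, except centuries not divisible by 400.
        return (year % 4 == 0) && (year % 100 != 0 || year % 400 == 0);
    }

    public static void main(String[] args) {
        // 1900 is the input that exposed the original defect; keep it forever.
        check(isLeapYear(1900), false);
        check(isLeapYear(2000), true);
        check(isLeapYear(2024), true);
        check(isLeapYear(2023), false);
    }

    static void check(boolean actual, boolean expected) {
        System.out.println((actual == expected ? "PASS" : "REGRESSION")
                + ": got " + actual + ", expected " + expected);
    }
}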
11 Testing Categories
- Black-box testing techniques use only functional knowledge of the entity being tested, e.g., knowledge of use cases or interfaces only
- White-box testing techniques use knowledge of the internal control-structure logic of the entity being tested (the sketch below contrasts the two)
- Behavioral diagrams capture some aspect of the internal behavior of how an entity realizes the functionality associated with a use case
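The following minimal sketch contrasts the two categories on a hypothetical max method (assumed for illustration): the black-box test is derived from the interface contract alone, while the white-box tests are chosen to exercise each branch of the internal control structure.

public class MaxTests {
    // Entity under test: its internal control structure has two branches.
    static int max(int a, int b) {
        if (a >= b) return a;   // branch 1
        else return b;          // branch 2
    }

    public static void main(String[] args) {
        // Black-box: derived from the interface contract alone.
        System.out.println(max(3, 7) == 7 ? "PASS" : "FAIL");

        // White-box: one test per internal branch, plus the boundary.
        System.out.println(max(9, 2) == 9 ? "PASS" : "FAIL"); // branch 1
        System.out.println(max(2, 9) == 9 ? "PASS" : "FAIL"); // branch 2
        System.out.println(max(5, 5) == 5 ? "PASS" : "FAIL"); // boundary a == b
    }
}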
12 Models: How to Test
- In general, develop use cases that address these issues
- Correctness: tests both the syntactic and semantic accuracy of a model
  - From a syntactic perspective, the model must be tested to verify that the various UML elements used within the model are used correctly.
  - From a semantic perspective, the elements within the model must accurately correspond to the reality being represented by the model.
13 Models: How to Test
- Completeness: tests whether a model is complete by verifying whether any required elements are missing from the model. This is typically accomplished by determining whether the model can appropriately handle scenarios developed for this purpose.
- Consistency: tests whether the elements in a model are used in a conflicting way. It also verifies that the elements of a model don't contradict the elements of other models on which it depends.
14 Use Case Model
- The testing of the Use Case Model must consider the following issues
  - Correctness: does each use case accurately represent a requirement?
  - Completeness: do the use cases represent all of the functionality needed for a satisfactory product?
  - Consistency: generally this requires examining extension and included use cases to ensure that their relation to other use cases is consistent. Naturally, it's possible to declare two use cases that contradict each other, but this is rare in practice.
15 Analysis Model Testing
- Testing the analysis model amounts to determining whether the application being modeled correctly interprets the application domain
  - Correctness: the descriptions of the domain concepts are accurate; the algorithms will produce the expected results; the concepts and algorithms cover the use cases.
  - Completeness: the concepts are sufficient to cover the scope of the content specified; sufficient detail is given to describe concepts to the required depth; experts agree with the attributes and behaviors assigned to each class.
  - Consistency: model elements should be consistent with the business's definitions and meanings. Where there are multiple ways to represent a concept or action, those ways should be modeled equivalently.
16 Design Model Testing
- Testing the design model is conceptually similar to testing the analysis model, but also requires addressing the following issues
  - Correctness: each class accurately implements the semantics of an interface; classes corresponding to interfaces must implement the interface.
  - Completeness: classes are defined for each interface in the architecture; preconditions for method use are appropriately specified; post-conditions and error conditions are specified.
  - Consistency: the behaviors in the interface of each class provide either a single way to accomplish a task or, if there are multiple ways, they provide the same behavior but with different preconditions.
17 Design Model Testing
- Testing an object-oriented design presents some additional challenges not typically found in the testing of procedural programming designs
  - Interfaces: classes implement interfaces, and it's necessary to ensure that each class correctly implements the functionality required by its interface.
  - Inheritance: the use of inheritance also introduces coupling problems in which a change to one class results in a change to all of its subclasses. This requires ensuring that the change that is now being inherited is appropriate for all of the subclasses. Delegation can often alleviate many of the issues associated with inheritance.
  - Delegation: the delegation of tasks to other objects, though similar to the use of modules in procedural programming languages, also presents its own set of issues. In a class, the delegation is encapsulated behind a public interface, hence it's necessary to ensure that the implementation of the interface is still correct when the class to which the functionality has been delegated is changed (see the sketch below).
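As a sketch of the delegation point, consider the classic stack-over-list arrangement (an illustrative example, not from the slides): Stack delegates storage to a List instead of inheriting from it, so a change to the delegated class requires re-verifying only the implementation hidden behind Stack's public interface.

import java.util.ArrayList;
import java.util.List;

// Delegation sketch: Stack delegates to a List rather than inheriting
// from it, so List changes stay hidden behind Stack's public interface.
public class Stack<T> {
    // The delegate is encapsulated; clients never see it.
    private final List<T> elements = new ArrayList<T>();

    public void push(T item) {
        elements.add(item);
    }

    public T pop() {
        if (elements.isEmpty()) throw new IllegalStateException("empty stack");
        return elements.remove(elements.size() - 1);
    }
}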
18 Implementation Model Testing
- Testing the implementation model focuses on verifying that the implemented code satisfies the design in the design model (and the requirements of the use-case model)
  - Correctness: the executable components of the system correctly build (e.g., compile and deploy) and adhere to the design model.
  - Completeness: the executable components provide all of the functionality specified in the design model.
  - Consistency: the executable components of the system are appropriately integrated in a way that provides the functionality specified by the use-case requirements.
19 Testing Framework
- The object-oriented concepts you have learned in this class can easily be used to develop a simple testing framework for any application you are building.
- The execution of such tests is then performed automatically every night as part of the most recent build process.
20 Example Test Method
[Slide shows an example test method as an image; not transcribed.]
21 Example Test Handler
- Create a new class that handles the common testing behaviors
- This can be done with Java or C#, but not C++
- Specifies that the execution of a given message should result in a returned value equal to a given object (see the sketch below)
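A minimal sketch of such a handler in Java, assuming reflection is used to send the given message; the names TestHandler and assertReturns, and the logging details, are illustrative assumptions rather than the slide's actual code.

import java.lang.reflect.Method;

// Sketch of a test handler: a class holding the common testing behavior
// that concrete test classes inherit. assertReturns sends the given
// message to the target via reflection and logs an error if the returned
// value is not equal to the expected object.
public class TestHandler {
    protected void assertReturns(Object target, String methodName,
                                 Object[] parameters, Object expected) {
        try {
            // Derive parameter types from the arguments. (This matches
            // methods declared with wrapper types; a fuller framework
            // would also try the corresponding primitive types.)
            Class<?>[] types = new Class<?>[parameters.length];
            for (int i = 0; i < parameters.length; i++) {
                types[i] = parameters[i].getClass();
            }
            Method method = target.getClass().getMethod(methodName, types);
            Object actual = method.invoke(target, parameters);
            if (expected == null ? actual != null : !expected.equals(actual)) {
                log("FAIL: " + methodName + " returned " + actual
                        + ", expected " + expected);
            }
        } catch (Exception e) {
            log("ERROR invoking " + methodName + ": " + e);
        }
    }

    protected void log(String message) {
        // A real framework would append to a log file; stderr stands in here.
        System.err.println(message);
    }
}

A catalog test class could then extend TestHandler, and the verifyTest() example on the next slide would end by calling something like assertReturns(this, "find", parameters, account).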
22 verifyTest() - AccountCatalog

public void verifyTest() {
    // Get a newly created unique account id.
    int id = nextId();
    // Create a new account object with this id.
    Account account = new Account(id);
    // Save the newly created account object.
    save(account);
    // Build the parameter array for the find message.
    Object[] parameters = new Object[1];
    parameters[0] = new Integer(id);
    // Assert that the execution of the find method
    // on the current AccountCatalog object (this)
    // with the single id parameter should return
    // a value equal to the account object. If not,
    // assert will log an appropriate error message
    // to the log file.
}
23 General Testing Methodologies
24 Figure 13-3: Test types and detected defects
25 Unit Testing
- The process of testing individual methods, classes, or components before they are integrated with other software
- Two methods for isolated testing of units (see the sketch below)
  - Driver: simulates the behavior of a method that sends a message to the method being tested
  - Stub: simulates the behavior of a method that has not yet been written
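A minimal sketch with hypothetical names: OrderProcessorDriver plays the role of the driver that sends messages to the unit under test, and TaxServiceStub simulates a tax-lookup method that has not yet been written.

// Unit under test: computes an order total using a tax lookup that is
// not implemented yet, so a stub supplies it.
class OrderProcessor {
    private final TaxService taxService;
    OrderProcessor(TaxService taxService) { this.taxService = taxService; }

    double total(double subtotal) {
        return subtotal + subtotal * taxService.rateFor("default");
    }
}

interface TaxService {
    double rateFor(String region);
}

// Stub: simulates the not-yet-written tax service with a canned answer.
class TaxServiceStub implements TaxService {
    public double rateFor(String region) { return 0.25; }
}

// Driver: simulates the caller, feeding test input and checking the result.
public class OrderProcessorDriver {
    public static void main(String[] args) {
        OrderProcessor processor = new OrderProcessor(new TaxServiceStub());
        double total = processor.total(100.0);
        System.out.println(total == 125.0 ? "PASS" : "FAIL: got " + total);
    }
}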
26 Unit Testing
[Diagram: the software engineer designs test cases, applies them to the module to be tested, and examines the results.]
27 Unit Testing
[Diagram: test cases for the module to be tested target its interface, local data structures, boundary conditions, independent paths, and error handling paths.]
28 Unit Test Environment
[Diagram: a driver applies test cases to the module through its interface while stubs stand in for the modules it calls; the tests cover local data structures, boundary conditions, independent paths, and error handling paths, and the results are collected.]
29 Test Cases
- Test cases should uncover errors such as
  - Comparison of different data types
  - Incorrect logical operators or precedence
  - Expectation of equality when precision error makes equality unlikely (see the sketch below)
  - Incorrect comparison of variables
  - Improper or nonexistent loop termination
  - Failure to exit when divergent iteration is encountered
  - Improperly modified loop variables
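As a sketch of the precision-error item above (illustrative, not from the slides): floating-point arithmetic rarely yields exact equality, so tests should compare within a tolerance instead of expecting equality.

public class PrecisionExample {
    public static void main(String[] args) {
        double sum = 0.1 + 0.2;

        // The error a test case should uncover: expecting exact equality.
        System.out.println(sum == 0.3);                 // false: sum is 0.30000000000000004

        // Safer comparison within a small tolerance.
        System.out.println(Math.abs(sum - 0.3) < 1e-9); // true
    }
}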
30 Integration Testing
- Evaluates the behavior of a group of methods or classes
- Identifies interface incompatibility, unexpected parameter values or state interaction, and run-time exceptions (see the sketch below)
- System test
  - Integration test of the behavior of an entire system or independent subsystem
- Build and smoke test
  - System test performed daily or several times a week
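A minimal integration-test sketch with hypothetical classes: in contrast to the isolated unit tests above, two real classes are exercised together so that their interface, parameter values, and state interaction are checked.

// Two previously unit-tested classes exercised together.
class Inventory {
    private int stock = 5;
    boolean reserve(int quantity) {
        if (quantity > stock) return false;
        stock -= quantity;
        return true;
    }
    int stock() { return stock; }
}

class OrderService {
    private final Inventory inventory;
    OrderService(Inventory inventory) { this.inventory = inventory; }
    String placeOrder(int quantity) {
        return inventory.reserve(quantity) ? "CONFIRMED" : "REJECTED";
    }
}

public class OrderIntegrationTest {
    public static void main(String[] args) {
        Inventory inventory = new Inventory();
        OrderService service = new OrderService(inventory);

        // Cooperation across the interface, including the state left behind.
        System.out.println(service.placeOrder(3).equals("CONFIRMED") ? "PASS" : "FAIL");
        System.out.println(inventory.stock() == 2 ? "PASS" : "FAIL");
        // An unexpected parameter value crossing the interface.
        System.out.println(service.placeOrder(10).equals("REJECTED") ? "PASS" : "FAIL");
    }
}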
31 Integration Testing Strategies
- Options: the "big bang" approach or an incremental construction strategy
32 Top-Down Integration
[Diagram: a module hierarchy with A at the top, B, F, and G beneath it, and C, D, and E at lower levels.]
- The top module is tested with stubs
- Stubs are replaced one at a time, "depth first"
- As new modules are integrated, some subset of tests is re-run
- Main disadvantage: the need for stubs and the attendant testing difficulties that can be associated with them. Testing major control functions early helps with this.
33 Bottom-Up Integration
[Diagram: the same module hierarchy integrated from the bottom up, with low-level worker modules grouped into a cluster.]
- Worker modules are grouped into builds (clusters) and integrated
- Drivers are replaced one at a time, "depth first"
- Major disadvantage: the program as an entity does not exist until the last module is added.
34 Usability Testing
- Determines whether a method, class, subsystem, or system meets user requirements
- Performance test
  - Determines whether a system or subsystem can meet time-based performance criteria (see the sketch below)
  - Response time: specifies the desired or maximum allowable time limit for software responses to queries and updates
  - Throughput: specifies the desired or minimum number of queries and transactions that must be processed per minute or hour
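A minimal response-time check in Java, assuming a hypothetical runQuery operation and the 3-second objective used as an example on slide 10:

public class ResponseTimeTest {
    // Hypothetical operation whose response time is being measured;
    // the sleep stands in for real query work.
    static void runQuery() {
        try { Thread.sleep(120); } catch (InterruptedException e) { }
    }

    public static void main(String[] args) {
        final long limitMillis = 3000;  // performance objective: respond within 3 s
        long start = System.nanoTime();
        runQuery();
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
        System.out.println(elapsedMillis <= limitMillis
                ? "PASS: responded in " + elapsedMillis + " ms"
                : "FAIL: took " + elapsedMillis + " ms, limit " + limitMillis + " ms");
    }
}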
35 User Acceptance Testing
- Determines whether the system fulfills user requirements
- Involves the end users
- Acceptance testing is a very formal activity in most development projects
36 Object-Oriented Testing
- Begins by evaluating the correctness and consistency of the OOA and OOD models
- Testing strategy changes
  - The concept of the unit broadens due to encapsulation
  - Integration focuses on classes and their execution across a thread or in the context of a usage scenario
  - Validation uses conventional black-box methods
- Test case design draws on conventional methods, but also encompasses special features
37 Who Tests Software?
- Programmers
  - Unit testing
  - Testing buddies can test other programmers' code
- Users
  - Usability and acceptance testing
  - Volunteers are frequently used to test beta versions
- Quality assurance personnel
  - All testing types except unit and acceptance
  - Develop test plans and identify needed changes
38 Debugging: A Diagnostic Process
39 The Debugging Process
[Diagram: execution of test cases yields results; debugging relates suspected causes to identified causes, producing corrections, regression tests, and new test cases.]
40 Debugging Effort
[Diagram: debugging effort divides into the time required to diagnose the symptom and determine the cause, and the time required to correct the error and conduct regression tests.]
41 Symptoms and Causes
- The symptom and the cause may be geographically separated
- The symptom may disappear when another problem is fixed
- The cause may be due to a combination of non-errors
- The cause may be due to a system or compiler error
- The cause may be due to assumptions that everyone believes
- The symptom may be intermittent
42 Consequences of Bugs
- Damage ranges from mild, annoying, and disturbing through serious, extreme, and catastrophic to infectious, depending on the bug type
- Bug categories: function-related bugs, system-related bugs, data bugs, coding bugs, design bugs, documentation bugs, standards violations, etc.
43 Debugging Techniques
- Brute force / testing: run-time traces, output statements
- Backtracking: start where the error is found and work backwards
- Induction: cause elimination, binary partitioning
- Deduction
44 Debugging: Final Thoughts
1. Don't run off half-cocked; think about the symptom you're seeing.
2. Use tools (e.g., a dynamic debugger) to gain more insight.
3. If at an impasse, get help from someone else.
4. Be absolutely sure to conduct regression tests when you do "fix" the bug.
45 Configuration and Change Management
- Controls the complexity associated with testing and supporting a system through multiple development and operational versions
- Integrally related to project management, implementation, testing, and deployment activities
- Change control procedures are typically developed in the first iteration, before development begins
- The need for formal procedures depends on the size and cohesiveness of the project
46 Figure 13-7: Configuration and change management discipline activities
47 Versioning
- Alpha version
  - Test version that is incomplete but ready for some level of rigorous integration or usability testing
- Beta version
  - Test version that is stable enough to be tested by end users for an extended period of time
- Production version
  - System version that is formally distributed to users or made operational for long-term use
- Maintenance release
  - System update that provides bug fixes and small changes to existing features
48 Submitting Change Requests and Error Reports
- Typical change control procedures include
  - Standard change request forms
    - Completed by a user or system owner
  - Review of requests by a change control committee
    - Assess impact on system, security, and budget
  - Extensive planning for design and implementation
- Bug reports are often handled separately because of the need for an immediate fix
49 Figure 13-11: A sample change request form
50 Figure 13-12: A sample change review form
51 Planning and Managing Testing
- Testing activities must be distributed throughout the project
- Unit and integration testing occur whenever software is developed, acquired, or combined with other software
- Usability testing occurs whenever requirements or design decisions need to be evaluated
- User acceptance tests are conducted as a final validation of the requirements, design, and implementation activities
52 Development Order
- Input, process, output (IPO) development
  - Implements input modules first, process modules next, and output modules last
  - Important user interfaces are developed early
- Top-down
  - Implements top-level modules first
  - There is always a working version of the program
- Bottom-up
  - Implements low-level detailed modules first
  - Programmers can be put to work immediately
53 Framework Development
- Foundation classes
  - An object framework that covers most or all of the domain and data access layer classes
  - Reused in many parts of the system and across applications
- Whenever possible, developers choose use cases for early iterations that rely on many foundation classes
- Testing early finds bugs before dependent code is developed
54 Direct Deployment
- Installs a new system, quickly makes it operational, and immediately turns off any overlapping systems
- Advantages
  - Simplicity
- Disadvantages
  - Risk of system unavailability
- Used when a new system is not replacing an old system and/or downtime can be tolerated
55 Direct Deployment and Cutover
56 Parallel Deployment
- Operates both old and new systems for an extended time period
- Advantages
  - Relatively low risk of system failure
- Disadvantages
  - Cost to operate both systems
- Used for mission-critical applications
- Partial parallel deployment can be implemented, with increased risk of undetected errors
57 Figure 13-24: Parallel deployment and operation
58 Phased Deployment
- Installs a new system and makes it operational in a series of steps or phases
- Advantages
  - Reduced risk
- Disadvantages
  - Increased complexity
- Useful when a system is large, complex, and composed of relatively independent subsystems
59 Figure 13-25: Phased deployment with direct cutover and parallel operation
60 Personnel Issues
- New system deployment places significant demands on personnel
- Temporary and contract personnel may be hired to increase manpower, especially during a parallel deployment
- System operators
  - Personnel with experience in hardware or software deployment and configuration
- Employee productivity decreases temporarily with a new system due to the learning curve