Title: Performance Test Planning Part 1
Performance Test Planning Part 1
CS 4803 EPR
Enterprise Computing Performance
Lecture Overview
- Performance Test Planning
- Terms and definitions
- Objectives
- Test scope
- Performance Requirements
- Who develops them?
- Suggested elicitation process
- Project and political realities
- Presentation to Business Analysts
- Test environment
- Review of performance simulation market tools
- Case Studies and White Papers
- Student Presentations 1
Terms and Definitions
- Transaction: A logical business operation
- Key term
- Defined differently everywhere
- Can have a narrow or broad context
- Context generally limited to the middleware and database tiers
Terms and Definitions
- Performance Test: The catch-all term for all performance-related types of testing. Often used in different ways by different people; the term can refer to any type of performance-related test.
- Load Test: A test of a computer system and its applications by running under a full complement (full load) of transactions or users. Most commonly refers to tests that model the varying loads and activities the application is expected to encounter when delivered to real users. Focuses on how much the application can handle.
- Stress Test: Tests an application under a load for a period of time to discover the ability of the application to handle the workload. No standard definition. Most commonly refers to tests that model varying loads and activities under more stressful conditions than the application is expected to encounter when delivered to real users. Subcategories may include:
- Spike testing (short burst of extreme load)
- Peak load testing (load test with maximum concurrent users)
- Volume Test: Any test focused on how much instead of how fast. Commonly related to database testing. The distinction between "volume" and "load" is that a volume test focuses on high volume and does not need to represent "real" usage.
- Performance Bottleneck: An activity or area that slows down progress such that the next activity must wait on the bottleneck to clear. The critical bottleneck is the single bottleneck that will keep the overall process from getting faster, no matter what other bottlenecks are resolved. Also called the Critical Path Bottleneck.
Terms and Definitions
- Workload Distribution: A transaction mix of the functions performed by a user community on a system. There may be more than one user community. For example, during the course of a day on a retail-based website, most users are shopping, some are searching for a specific product, some are finalizing purchases and checking out, while a single administrator may be updating product prices. A workload distribution is based on the percentage of users performing a specific function over a given period of time. Using the above example, a workload distribution could be expressed as a percentage assigned to each of those functions.
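The idea can be sketched in code. The function names and percentages below are hypothetical illustrations echoing the retail example above, not figures from the lecture:

```python
# Sketch: split a target concurrent-user count across a workload
# distribution. Function names and percentages are hypothetical.
def apportion_users(total_users, distribution):
    """Return the number of virtual users to assign to each function.

    `distribution` maps function name -> fraction of the user community.
    """
    assert abs(sum(distribution.values()) - 1.0) < 1e-9, "shares must total 100%"
    return {name: round(total_users * share)
            for name, share in distribution.items()}

# Hypothetical retail-site mix for a 500-user load test.
mix = {"shopping": 0.70, "searching": 0.20, "checkout": 0.09, "admin": 0.01}
print(apportion_users(500, mix))
```

A simulation tool would then drive each group of virtual users through the script for its assigned function.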
Terms and Definitions
- Concurrent Users: Users that are simultaneously accessing the same system. A number associated with this term is often misleading, since many factors come into play.
- Ramp-up/Ramp-down: Simply the time between the first user access and the time the target load is achieved, as illustrated below.
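As a small illustration of a ramp-up (the linear spacing and the numbers here are assumptions, not from the lecture), the start offset of each virtual user can be computed like this:

```python
# Sketch: start times for a linear ramp-up. The target load of
# `target_users` is reached `ramp_seconds` after the first user access,
# with virtual users starting at evenly spaced offsets.
def ramp_up_schedule(target_users, ramp_seconds):
    """Return each virtual user's start offset in seconds."""
    if target_users < 2:
        return [0.0]
    step = ramp_seconds / (target_users - 1)
    return [i * step for i in range(target_users)]

# Hypothetical: reach 11 concurrent users 300 s after the first access.
schedule = ramp_up_schedule(11, 300)
```

Ramp-down is the mirror image: the time from the target load back down to the last user exiting.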
Terms and Definitions
- Scalability: The expansion capability of a system or architecture. The term by itself implies a positive capability.
- Vertical Scaling: Adding system resources
- Horizontal Scaling: Adding nodes or autonomous systems
- Linear Scalability: The holy grail of system architecture; not fully achievable
Terms and Definitions
- Longevity or Stability Testing: Testing designed to validate that performance degradation over time is not realized.
- Performance Degradation: The condition a system suffers when system resource levels are bled away over time.
- Think Time: The average amount of time a user spends between actions. For example, the amount of time a user spends filling out a form before clicking the Save button.
- End User Experience: Performance as perceived by a human user. For example, the amount of elapsed time between a button click on Submit Order and the display of the Thank You page.
- Response Time: The amount of elapsed time between a server request and the server response. For example, the elapsed-time delta between an HTTP request header timestamp and the corresponding HTTP response header timestamp.
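A client-side measurement of response time can be sketched as follows. This is a minimal illustration, not how simulation tools do it internally; `fake_server_call` is a stand-in for any real request:

```python
import time

# Sketch: measure elapsed time between issuing a request and receiving
# its response. `send_request` stands in for any client call (HTTP, RPC,
# etc.); here a placeholder simulates server processing time.
def timed_request(send_request):
    """Return (response, elapsed_seconds) for a single request."""
    start = time.perf_counter()
    response = send_request()
    elapsed = time.perf_counter() - start
    return response, elapsed

def fake_server_call():
    time.sleep(0.05)          # simulated server processing time
    return "HTTP/1.1 200 OK"

response, elapsed = timed_request(fake_server_call)
```

Note this measures response time as defined above, not end-user experience, which would also include browser rendering time.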
Terms and Definitions
- Virtual User: A simulated, computer-generated user. Virtual users are a concept used by performance simulation tools to generate realistic system transactions on the SUT.
- SUT: System Under Test; a.k.a. AUT, Application Under Test.
- TPS: Transactions Per Second. This metric is often used to measure the performance of components without user interaction.
- Batch Processes: Also termed back-end processes. Any system process or program that is triggered by a non-human-initiated event, such as a system clock or computational trigger.
- Business Process: a.k.a. Scenario. Lacks a standard definition. Typically refers to a suite of human-initiated business transactions (units of work), grouped together to complete a specific task.
- Test Case: Shares a one-to-one or one-to-many relationship with a Business Process.
- Performance Baseline: a.k.a. performance profiling. A technique that simulates a single user under steady/controlled conditions, used to establish a best-case benchmark.
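TPS is simply completed transactions divided by the measurement window. A minimal sketch (the completion timestamps below are invented for illustration):

```python
# Sketch: compute average transactions per second (TPS) from a list of
# transaction completion timestamps, in seconds. Data is hypothetical.
def transactions_per_second(timestamps):
    """Average TPS over the span from first to last completion."""
    if len(timestamps) < 2:
        return 0.0
    window = max(timestamps) - min(timestamps)
    return len(timestamps) / window if window > 0 else 0.0

# 10 transactions completed over a 4-second window.
stamps = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.4, 3.8, 4.0]
print(transactions_per_second(stamps))
```

In practice, a tool would report TPS per interval rather than one average, so spikes and dips remain visible.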
Performance Plan Sections
- The next 7 slides describe the key Performance
Test Plan sections
Test Objectives
- Clearly state your high-level test objectives from a business perspective. For example:
The objective of this document is to be a single point of reference for all LP S2S Performance Validation planning. It will serve as a communication tool and reference between the performance validation team and the managers of the project/system undergoing performance validation. This test plan will serve the following specific purposes: to ensure that the new System-System design is able to handle the projected LP volume levels, to ensure that current LP Classic, LP Inside and LP.com browser application performance is not impacted by the new Pricing service, and to verify that the established LP performance has not been negatively impacted by the introduction of the System-System design.
Scope
- It is imperative to clearly delineate what is in and what is specifically out of scope in the context of the test. For example:
- This test plan encompasses the following LP functional areas. All other functional areas are explicitly not in scope for this test.
- LoanProspector.com
- Volume
- Direct Entry
- Loan Search
- Import
- LP Post Close
- File Format (TBD)
Performance Requirements
- Non-functional, performance-specific requirements must be gathered, clearly documented and agreed upon before further test planning. Some examples of non-ambiguous, testable performance requirements:
- 80% of all page responses shall be < 5 seconds
- 100% of all page responses shall be < 8 seconds
- XYZ Batch Process shall complete in 8 hours
- ABC Batch Process shall yield 2,500 TPS under nominal load
- AUT shall support 500 concurrent users under nominal load
- AUT shall support 1,000 concurrent users under peak load
- All performance requirements shall be met with a 1-year data volume load
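Requirements phrased this way can be checked mechanically against measured response times, which is what makes them testable. A minimal sketch (the sample measurements are invented):

```python
# Sketch: verify a percentile-style requirement such as
# "80% of all page responses shall be < 5 seconds" against a sample
# of measured response times. The sample data here is invented.
def meets_requirement(response_times, threshold_seconds, required_fraction):
    """True if at least `required_fraction` of responses beat the threshold."""
    within = sum(1 for t in response_times if t < threshold_seconds)
    return within / len(response_times) >= required_fraction

samples = [1.2, 2.8, 3.1, 4.9, 4.2, 6.5, 2.2, 3.7, 7.9, 4.4]
print(meets_requirement(samples, 5.0, 0.80))   # 80% shall be < 5 s
print(meets_requirement(samples, 8.0, 1.00))   # 100% shall be < 8 s
```

A vague requirement like "the site shall be fast" offers no such check, which is why ambiguity blocks the Exit Criteria discussed next.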
Performance Requirements
- Elicit requirements from Business Analysts
- Ask the right questions
- Education issue: lack of experience in the business community
- Conduct a training class to teach them
- Without unambiguous, explicit requirements, no Exit Criteria can be met!
- Some BAs just don't want to bother with, or be responsible for, something new
- Some examples:
- Please define the user types (actors) for this application/component.
- Nominally, how many concurrent users are anticipated for this release? Peak? Please break down per application component if applicable.
- Nominally, how many concurrent users are anticipated six months into production? Twelve months?
- What type of user experience do you require? Response times? Please be specific.
- How many total users will the application support? (total user base)
- What is the maximum database size for the target system?
- How much bandwidth will the application be allotted? (WAN, LAN, etc.)
Test Environment
- Production Architecture
- If it doesn't exist, develop it
Test Environment
- Test Environment
- A mirror or, most likely, a subset of Production
Tools and Market
- A review of performance test tools and the tools job market
Performance Test Tools
- Simulation tools (LoadRunner, SilkPerformer, QALoad, Rational/VU, etc.)
- Packet capture/sniffing tools (Ethereal, TracePlus, Network Packet Analyzer, EtherPeek, etc.)
- Monitoring tools (SiteScope, Topaz, FarSight Web, SilkCentral, Vantage, etc.)
- Diagnostic tools (DeepDiagnostics, OneSight Web, PerformaSure, etc.)
Test Simulation Market
Simulation - tools
- 6 Leading Tools
- LoadRunner (Mercury)
- SilkPerformer (Segue)
- QALoad (CompuWare)
- Rational Robot/VU (IBM)
- Empirix
- RadView
Simulation - market
- 2004: Estimated at over $850M market
- Mercury now has > 80% of the world performance tool market
Performance Test - market
Performance Testing Jobs
Performance Jobs - Modeling
Performance Jobs - Combo
Performance Jobs - Simulation