1
Performance Analysis of Software Architectures
  • Paola Inverardi

UNIVERSITÀ DEGLI STUDI DELL'AQUILA
Area Informatica, Facoltà di SS.MM.NN.
http://saladin.dm.univaq.it
2
Joint work with
  • Simonetta Balsamo, Università di Venezia
  • A group of students over the years: Mangano, Russo,
    Aquilani, Andolfi

3
Goal
  • Quantitative analysis of SA descriptions.
  • Introduce the ability to measure architectural
    choices.
  • Why? And how?

4
Why?
  • To validate SA design choices with respect to
    performance indices
  • To compare alternative SA designs.

Produce feedback at the design level
5
How?
  • Introduce quantitative models early in the life
    cycle
  • Evaluate performance indices

Add non-functional requirements to maintain the
expected performance
6
Outline of the Talk
  • Software Architectures
  • Performance Evaluation
  • Approaches
  • Our recipe
  • Conclusions
  • References
  • Advertising

7
Software Architectures
  • High level system description in terms of
    subsystems (components) and the way they
    interact (connectors)
  • Static description: Topology
  • Dynamic description: Behavior

8
Topology
9
Behavior
Finite State Automata
MSC
10
Static and Dynamic Views
(a) FSA, (b) Topology, (c) MSC
11
Quality Attributes and SA
  • Qualities discernible by observing the system
    execution: performance, security, availability,
    functionality, usability
  • Qualities not discernible at run time:
    modifiability, portability, reusability,
    integrability, testability.

12
Quality Attributes at run time
  • Performance refers to the responsiveness of the
    system. It is often a function of how much
    communication and interaction there is between
    components of the system. It is clearly an
    architectural issue. (communications usually take
    longer than computations)

13
How to measure Performance
  • Arrival rates and distributions of service
    requests, processing times, queue sizes, latency,
    and throughput (the rate at which requests are
    serviced)
  • Simulate by building a stochastic queueing model
    of the system based upon anticipated workload
    scenarios (a small simulation sketch follows)
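As an illustration (not part of the original slides), the sketch below simulates one FIFO service center under an assumed workload scenario; the arrival and service rates are made-up values and the model is the textbook M/M/1 queue, not the author's tool.

import random

def simulate_center(arrival_rate, service_rate, n_requests=100_000, seed=42):
    """Toy discrete-event simulation of one FIFO service center with
    Poisson arrivals and exponential service times."""
    rng = random.Random(seed)
    arrival = 0.0          # time of the current arrival
    server_free_at = 0.0   # time at which the server becomes idle again
    total_latency = 0.0
    last_completion = 0.0
    for _ in range(n_requests):
        arrival += rng.expovariate(arrival_rate)       # next request arrives
        start = max(arrival, server_free_at)           # wait if server is busy
        completion = start + rng.expovariate(service_rate)
        server_free_at = completion
        total_latency += completion - arrival          # per-request latency
        last_completion = completion
    return total_latency / n_requests, n_requests / last_completion

# Assumed workload scenario: 8 requests/s arriving, 10 requests/s served.
latency, throughput = simulate_center(arrival_rate=8.0, service_rate=10.0)
print(f"mean latency = {latency:.3f} s, throughput = {throughput:.2f} req/s")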

14
Software Architectures Quality Attributes
  • Static: can be measured statically (portability,
    scalability, reusability, ...)
  • Dynamic: can be measured by observing the
    SA behavior (performance, availability, ...)

15
Software Architectures and Performance
  • Quoting from the WOSP 2000 panel introduction on
    Performance of SA:

The quantitative analysis of a SA allows for the
early detection of potential performance problems.
Early detection of potential performance problems
allows alternative software designs, meaning
designing a software system and analyzing its
performance before the system is implemented.
16
Software Architecture
Level of abstraction
Dynamic model
Lack of information
How do we measure?
How do we interpret the measures?
17
Performance Evaluation
  • Quantitative analysis of systems based on
    models and methods, both deterministic and
    stochastic

Evaluating the performance of a system means making
a quantitative analysis to derive a set of
(performance) indices, obtained either as mean or
probabilistic figures:
probabilistic distribution/mean of response times,
waiting times, queue lengths, delays, resource
utilization, throughput, ...
18
PE Models and Techniques
  • Models are primarily stochastic and can be solved
    by either analytic or simulation techniques.
  • Analytic techniques can be exact (e.g.
    numerical), approximate, or bounding
  • Simulation techniques: more general but
    expensive

19
Queueing Network Models
  • Service centers: service time, buffer space with
    scheduling policy, number of servers
  • Customers: number for closed models, arrival
    process for open models
  • Network topology: models how service centers are
    interconnected and how customers move among them
    (a small sketch follows)
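As a purely illustrative addition (names and fields are hypothetical, not from the slides), these three ingredients can be captured in a small data structure:

from __future__ import annotations
from dataclasses import dataclass

@dataclass
class ServiceCenter:
    name: str
    mean_service_time: float        # seconds per customer
    servers: int = 1                # number of servers
    buffer_size: int | None = None  # None means an infinite-capacity buffer
    scheduling: str = "FCFS"        # scheduling policy of the buffer

@dataclass
class QueueingNetwork:
    centers: list[ServiceCenter]
    routing: dict[tuple[str, str], float]  # (from, to) -> routing probability
    arrival_rate: float | None = None      # open model: external arrival rate
    population: int | None = None          # closed model: fixed customer count

# Example topology: customers flow from a "lexer" center to a "parser" center.
net = QueueingNetwork(
    centers=[ServiceCenter("lexer", 0.01), ServiceCenter("parser", 0.02)],
    routing={("lexer", "parser"): 1.0},
    arrival_rate=20.0,
)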

20
Queueing networks with finite capacity queues
  • Queueing network models to represent:
  • sharing of resources with finite capacity queues
  • population constraints
  • synchronization constraints
  • finite capacity of the queue:
  • n: number of customers in the service center
  • B: finite capacity
  • blocking ⇒ dependence
  • deadlock
  • Solution methods: exact vs. approximate vs.
    simulation
  • various blocking types:
  • different behaviors of customer arrivals at a
    full node and of servers' activity

21
Analytical solutions for Q.N. with finite
capacity queues
  • Network model parameters:
  • M: number of nodes
  • N: number of customers
  • µi: service rate of node i
  • Service time distribution: M, G, PHn, GE
  • P = [pij]: routing matrix
  • Bi: finite capacity of node i
  • Queue-length probability distribution ⇒
  • continuous-time homogeneous Markov chain
  • S = (S1, S2, ..., SM): network state
  • State space E, transition rate matrix Q
  • Steady-state probabilities p(S)

Other average performance indices can be derived
from p and depend on the blocking type. The exact
solution soon becomes numerically intractable;
product-form solutions exist only in special cases,
otherwise approximate analysis is used.
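To make the Markov-chain step concrete, here is an added sketch (not from the slides) that builds the generator matrix Q of a tiny closed two-node cyclic network in which node 2 has finite capacity, under a deliberately simplified blocking rule, and solves p Q = 0 numerically; even here the state space grows quickly with N and M, which is exactly the intractability noted above.

import numpy as np

# Closed cyclic network with 2 nodes and N customers; node 2 has capacity B2.
# State = number of customers at node 1 (the others are queued at node 2).
# Simplified blocking rule: node 1 stops completing services while node 2 is full.
N, B2 = 4, 2
mu1, mu2 = 3.0, 2.0          # service rates (assumed values)

states = list(range(N + 1))
Q = np.zeros((len(states), len(states)))
for n1 in states:
    n2 = N - n1
    if n1 > 0 and n2 < B2:   # node 1 completes a service, customer joins node 2
        Q[n1, n1 - 1] += mu1
    if n2 > 0:               # node 2 completes a service, customer returns to node 1
        Q[n1, n1 + 1] += mu2
    Q[n1, n1] = -Q[n1].sum() # diagonal = minus the total outgoing rate

# Solve p Q = 0 together with sum(p) = 1 (append the normalization equation).
A = np.vstack([Q.T, np.ones(len(states))])
b = np.zeros(len(states) + 1)
b[-1] = 1.0
p, *_ = np.linalg.lstsq(A, b, rcond=None)
print(dict(zip(states, np.round(p, 4))))   # steady-state distribution of n1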
22
Queueing Network Models
  • QN modelling is a top-down process. The
    underlying philosophy is to begin by identifying
    the principal components of the system and the
    ways they interact, then supply any details that
    prove to be necessary.

(ref. Lazowska et al., Quantitative System
Performance, Prentice Hall,
http://www.cs.washington.edu/homes/lazowska/qsp/)
23
QNM creation
  • Definition: definition of service centers, their
    number, classes of customers, and topology
  • Parameterization: define the alternatives under
    study, e.g. by selecting arrival processes and
    service rates
  • Evaluation: obtain a quantitative description of
    system behavior, computing performance indices
    like resource utilization, system throughput, and
    customer response time (a small sketch follows)
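For the Evaluation step, a minimal added example (parameter values are made up) computes the indices mentioned above in closed form for one M/M/1 service center:

def mm1_indices(arrival_rate: float, mean_service_time: float) -> dict:
    """Textbook M/M/1 indices for a single open service center."""
    utilization = arrival_rate * mean_service_time
    if utilization >= 1.0:
        raise ValueError("unstable center: utilization >= 1")
    return {
        "utilization": utilization,
        "throughput": arrival_rate,              # equals the arrival rate when stable
        "mean_queue_length": utilization / (1.0 - utilization),
        "mean_response_time": mean_service_time / (1.0 - utilization),
    }

# One parameterization alternative under study (assumed values).
print(mm1_indices(arrival_rate=8.0, mean_service_time=0.1))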

24
Approaches
  • Software Performance: the whole system life cycle
    is available; the design is used to incrementally
    produce a QNM of the software system.
  • Software Specification: the system behavioral
    specification is available and is modeled by
    Stochastic Petri Nets or Stochastic Process
    Algebras

25
Software Performance
  • Performance analysis integrated in the software
    life cycle.
  • Assumes a number of software artifacts is
    available, from requirements specifications (Use
    Cases) to deployment diagrams
  • QNM models:
  • topology obtained from the information on the
    physical architecture
  • information on software components is used to
    define the model workload

References under SP, a (UML-based) survey in BS01
26
Software Specification
  • Identify a precise software stage: system design
    specification
  • Formal behavioral specification: Stochastic Petri
    Nets, Stochastically Timed Process Algebras
  • Behavioral and performance analysis in a single
    model

References under SS
27
Our Approach
  • Not SP: we want to evaluate the performance of
    the SA description. We do not assume to have an
    implementation.
  • Not SS: a single model is nice, but feedback is
    too difficult. The performance model is too far
    from the component/connector description.

28
(Process diagram: SA description (CHAM, FSP, UML,
WRIGHT, ...) → Behavioral model (dynamic
descriptions: FSM, MSCs, ...) → Algorithm →
Performance model (favorite model: QNM, SPN, SPA,
...) → Performance evaluation (solution method:
symbolic, approximation, simulation, ...) →
Results and interpretation, with feedback to the SA
description.)
29
Brief history of our work in SA and PE 1/2
  • Formal description of SA via CHAM
  • Behavioral analysis of the SA
  • Algebraic analysis and finite state modeling
  • Validation and quantitative analysis based on
    FSTM
  • global system behavior
  • Queueing Network Model
  • Feedback at the design level
  • Capacity planning and case studies

- S. Balsamo, P. Inverardi, C. Mangano, "An
  Approach to Performance Evaluation of Software
  Architectures", in IEEE Proc. WOSP'98.
- S. Balsamo, P. Inverardi, C. Mangano, L. Russo,
  "Performance Evaluation of Software
  Architectures", in IEEE Proc. IWSSD-98.
30
Brief history of our work in SA and PE 2/2
  • Specification of SA via Message Sequence Charts
    - UML
  • Event ordering. Event sequence. Trace of events.
  • Communication types, concurrency and
    non-determinism
  • Trace analysis and model structure identification
  • Quantitative analysis based on extended QN model
  • Scenarios for model parameterization
  • Feedback at the design level

- F. Andolfi, F. Aquilani, S. Balsamo, P.
  Inverardi, "Deriving Performance Models of
  Software Architectures from Message Sequence
  Charts", in Proc. IEEE WOSP 2000.
- F. Andolfi, F. Aquilani, S. Balsamo, P. Inverardi,
  "On using Queueing Network Models with finite
  capacity queues for Software Architectures
  performance prediction", in Proc. QNET2000.
31
Framework of performance analysis of SA at the
design level
  • Description of SA via LTS - independent of ADL
  • Algorithm to derive the performance model
    structure
  • Add info on the communication types, state
    annotation
  • Identify scenarios for model parameterization
  • Performance model based on extended Queueing
    Network models
  • Analytical solution (symbolic) for simple models,
    approximation or simulation for complex models
  • Result interpretation at the software design level

F. Aquilani, S. Balsamo, P. Inverardi,
"Performance Analysis at the software
architecture design level", TR-SAL-32, Technical
Report, Saladin Project, 2000, to appear in
Performance Evaluation.
32
Usual Example: The Multiphase Compiler
Multiphase compiler: concurrent architecture, optimized architecture
Software Architecture
(Figure: components LEXER, PARSER, SEMANTOR, OPTIMIZER and CODE_GEN,
with data flows labelled characters, tokens, phrases, cor. phrases and TEXT.)
Synchronous communication
Queueing Network Model with BAS blocking
(Figure: QNM with service centers labelled L, P, S, O, G and finite
buffers B1 between them.)
Acyclic topology
Solution: approximate analysis
33
Sequential Architecture
Software Architecture
Same number of components,
strongly sequentialized. No concurrency.
(Figure: text, lexer, parser, semantor, optimizer and codegen arranged
in a single sequential pipeline.)
QNM: one single service center (sc1)
34
Parameterization and Evaluation
  • Specify parameters (e.g. arrival rate and mean
    service time of each center). We keep them
    symbolic.
  • Meaning of the parameters: e.g. service time =
    execution time of a component, arrival rate =
    activation of concurrent instances of component
    execution.
  • Parameter instantiations identify potential
    implementation scenarios.
  • In the compiler example, 3 scenarios, playing
    with the mean service times of the concurrent
    model (a symbolic sketch follows).
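Keeping the parameters symbolic can be mimicked with a computer-algebra tool; the sketch below is an added illustration (using sympy and the textbook M/M/1 response-time formula, with made-up scenario values), not the evaluation actually used in the papers.

import sympy as sp

lam, s = sp.symbols("lambda s", positive=True)  # arrival rate, mean service time

rho = lam * s                        # symbolic utilization of one service center
response_time = s / (1 - rho)        # symbolic mean response time (M/M/1)
print(response_time)                 # stays symbolic in lambda and s

# Instantiating the symbols yields one concrete implementation scenario.
scenario = {lam: sp.Rational(8), s: sp.Rational(1, 10)}
print(response_time.subs(scenario))  # 1/2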

35
How we provide Feedback
  • Throughput of the 2 compiler SAs:
  • the concurrent SA performs 5 times better than
    the sequential SA

(Scenario in which the mean service times of the
nodes have the same order of magnitude.)
  • Enrich performance requirements in the
    subsequent development steps:
  • a global performance requirement can be broken
    down into requirements on single components (a
    small sketch follows)
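One simple way to picture the break-down (an added, naive illustration; the component names and budget are hypothetical) is to split an end-to-end response-time budget in proportion to each component's estimated demand:

def split_budget(global_budget: float, demands: dict) -> dict:
    """Split an end-to-end response-time requirement among components in
    proportion to their estimated service demands (a naive heuristic)."""
    total = sum(demands.values())
    return {name: global_budget * d / total for name, d in demands.items()}

# Hypothetical per-request demands (seconds) for the compiler components.
print(split_budget(0.5, {"lexer": 0.01, "parser": 0.04, "semantor": 0.03,
                         "optimizer": 0.05, "codegen": 0.02}))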

36
(Process diagram: SA → Dynamic description (Labeled
Transition System, Message Sequence Charts; state
annotation, communication type) → Algorithm →
Performance Model (QNM; scenarios for
parameterization) → Performance Evaluation →
Results and interpretation, with feedback: choice of
SA, new requirements on components and connectors.)
37
Performance Analysis at the SA design level 1/2
  • SA specification: Labeled Transition System
    ⟨S, L, →, s, P⟩
  • S: set of states, L: set of labels (communication
    types)
  • s: initial state, P: set of state labels
  • →: transition relation in (P × L × P)
  • SA components: communicating concurrent
    subsystems
  • At the SA level: consider interaction activities
    among components
  • Parallel composition of communicating components
  • P: set of SA component and connector states
    described by the LTS (a minimal encoding sketch
    follows)
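A minimal encoding of such an LTS (an added illustration; the toy states and labels are hypothetical):

from __future__ import annotations
from dataclasses import dataclass

@dataclass(frozen=True)
class LTS:
    states: frozenset[str]                        # S: set of states
    labels: frozenset[str]                        # L: labels (communication types)
    transitions: frozenset[tuple[str, str, str]]  # (state, label, state)
    initial: str                                  # s: initial state
    state_labels: frozenset[str]                  # P: set of state labels

# Toy LTS: two components exchanging one synchronous message.
lts = LTS(
    states=frozenset({"idle", "sent"}),
    labels=frozenset({"sync"}),
    transitions=frozenset({("idle", "sync", "sent")}),
    initial="idle",
    state_labels=frozenset({"idle", "sent"}),
)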

38
Performance Analysis at the SA design level 2/2
  • First model the maximum level of concurrency
    (each component as an autonomous server)
  • Then (algorithm) derive a simple structure of the
    QNM by analyzing the true level of concurrency
    and the communication type

39
Algorithm
  • LTS visit to derive interaction sets formed by
    interaction pairs (IP): (p1, p2) = flow of
    data from p1 to p2
  • model connecting elements with buffers
  • mark non-deterministic IPs
  • examine the sets of IPs to generate the service
    centers and topology of the QNM (a rough sketch
    follows)
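A rough sketch of this visit (an added illustration, assuming, only for the example, that each transition label can be mapped to a (sender, receiver) pair; the non-determinism rule is deliberately simplified):

from collections import defaultdict

def derive_interaction_pairs(transitions, label_to_pair):
    """Visit the LTS transitions and collect interaction pairs (IPs);
    IPs leaving the same source state are marked as non-deterministic."""
    outgoing = defaultdict(list)
    for (src, label, _dst) in transitions:
        outgoing[src].append(label_to_pair[label])

    interaction_pairs, nondeterministic = set(), set()
    for pairs in outgoing.values():
        interaction_pairs.update(pairs)
        if len(pairs) > 1:               # alternative interactions from one state
            nondeterministic.update(pairs)
    return interaction_pairs, nondeterministic

transitions = {("s0", "chars", "s1"), ("s1", "tokens", "s2")}
label_to_pair = {"chars": ("lexer", "parser"), "tokens": ("parser", "semantor")}
print(derive_interaction_pairs(transitions, label_to_pair))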

40
SA description MSCs - From MSCs to QNMs
  • UML as ADL
  • a model of all possible system behaviours
  • state diagrams for manageable processes
  • implicit parallel notation for composite
    processes P1 | P2 | ... | Pn
  • no explicit representation due to state explosion
  • Sequence diagrams/MSCs to describe component
    interactions
  • MSCs with state information and iteration blocks;
    components are the object elements
  • QNM with blocking, BAS mechanism

41
MSCs requirements
  • It is always possible to synthesize an FSM out of
    a set of MSCs:
  • all refer to the same initial system
    configuration
  • representative of major system behaviors
  • each system component appears in (at least) one
    MSC
  • MSCs contain info about the state of components
  • other technical conditions

42
Extracting from MSCs info about:
  • communication among components, i.e. which
    components interact
  • communication types, i.e. synchronous/asynchronous
  • concurrency, i.e. which components can proceed
    concurrently
  • non-determinism, i.e. which components do proceed
    non-deterministically

43
How do we do that?
  • MSC encoding ⇒ from an MSC we derive the trace
    (a set of regular languages)
  • We analyze traces to identify the kind of
    communications (1-to-2, 2-to-1, concurrent,
    non-deterministic); we build Interaction Pairs to
    record this information
  • We use IPs to build the QNM topology

44
Interaction Pairs and QNM
  • (P1, P2)s ⇒ service center representing a
    unique service, P1 followed by P2, expressing
    sequentiality (P1 and P2 are not concurrent)
  • (P1, P2)a ⇒ service center with infinite
    buffer, implicitly modelling the communication
    channel of the transition P1 → P2 in the QNM
  • (P1, P2)s, (P1, P3)s ND ⇒ multi-customer
    service center
  • synchronous communication among concurrent
    components ⇒ distinct service centers; the
    receiver component gets a zero-capacity buffer
    with BAS policy in the sender component (a small
    mapping sketch follows)
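The four rules above can be read as a small case analysis; the sketch below is an added, deliberately simplified rendering (the returned dictionaries are only descriptive placeholders, not the actual QNM construction):

def ip_to_qnm_element(ip, comm_type, nondeterministic=False, concurrent=False):
    """Map one interaction pair to a (description of a) QNM element,
    loosely following the four rules on the slide."""
    p1, p2 = ip
    if comm_type == "sync" and concurrent:
        # Synchronous communication among concurrent components:
        # distinct service centers, zero-capacity buffer, BAS blocking.
        return {"kind": "service centers", "centers": [p1, p2],
                "buffer": 0, "blocking": "BAS"}
    if comm_type == "sync" and nondeterministic:
        return {"kind": "multi-customer service center", "components": [p1, p2]}
    if comm_type == "sync":
        # (P1, P2)s: one center offering the service "P1 followed by P2".
        return {"kind": "service center", "service": f"{p1};{p2}"}
    # (P1, P2)a: center with an infinite buffer modelling the channel P1 -> P2.
    return {"kind": "service center", "channel": (p1, p2), "buffer": None}

print(ip_to_qnm_element(("parser", "semantor"), "sync"))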

45
Example
  • Compressing Proxy system
  • purpose: improve the performance of Unix-based
    World Wide Web browsers over slow networks by an
    HTTP server that compresses and uncompresses data
    to and from the network
  • Software Architecture

(Figure: components Filter, Pseudo Filter Adapter,
gzip, Filter.)
Synchronous communication. Queueing Network Model
with BAS blocking; exact analysis of the underlying
Markov chain.
(Figure: QNM with service centers AD and GZIP and
finite buffers BAD1 and BGZIP1.)
46
MSC to trace
S(Cfu,AD) S(AD,Gzip) S(Gzip,AD) S(AD,CFd) N
47
Trace Analysis
  • Trace 1: S(Px,Py)c1 S(Pk,Pz)c2 S(Pi,Pj)c3 S(Ps,Pt)c4
  • Trace 2: S(Px,Py)c1 S(Pk,Pz)c2 S(Ps,Pt)c4 S(Pi,Pj)c3
  • ⇒ Pi and Ps are concurrent

(Figure: the common prefix S(Px,Py)c1 S(Pk,Pz)c2
branches into the two orders S(Pi,Pj)c3 S(Ps,Pt)c4
and S(Ps,Pt)c4 S(Pi,Pj)c3.)
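The comparison can be illustrated with a tiny added check (not from the slides): two events are treated as concurrent when the two traces contain them in both relative orders.

def concurrent_events(trace_a, trace_b):
    """Return the event pairs that occur in both relative orders across
    the two traces (a simplified notion of concurrency detection)."""
    def before(trace):
        # all ordered pairs (x, y) with x occurring before y in the trace
        return {(x, y) for i, x in enumerate(trace) for y in trace[i + 1:]}
    swapped = {(y, x) for (x, y) in before(trace_b)}
    return {frozenset(pair) for pair in before(trace_a) & swapped}

trace1 = ["S(Px,Py)c1", "S(Pk,Pz)c2", "S(Pi,Pj)c3", "S(Ps,Pt)c4"]
trace2 = ["S(Px,Py)c1", "S(Pk,Pz)c2", "S(Ps,Pt)c4", "S(Pi,Pj)c3"]
print(concurrent_events(trace1, trace2))   # {frozenset({'S(Pi,Pj)c3', 'S(Ps,Pt)c4'})}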
48
Conclusion
  • Derivation of the performance model from the
    dynamic view of SA
  • Finite (incomplete) representation of the SA
    behavior, i.e. LTS (MSC)
  • Analysis of LTS (MSCs) to extract the pieces of
    information relevant to the PM
  • Performance evaluation at the SA level of
    abstraction
  • Feedback on the design process
  • Case studies
  • Integration of architectural design tools and
    performance tools

49
My opinion
  • Still an active area of research, with very high
    industrial interest and research interest (see
    the key action of the new IST European program
    call)
  • PM models close to the SA description. Symbolic
    evaluation!
  • Feedback: make explicit the extra info to help in
    refining the design steps
  • Experiment!

50
ADVERTISING
ROME 22-26 JULY 2002
ISSTA and WOSP Together!
51
Selected Bibliography
  • GENERAL SA
  • Shaw, M., Garlan, D., Software Architecture:
    Perspectives on an Emerging Discipline, Prentice
    Hall, 1996.
  • Bass, L., Clements, P., Kazman, R., Software
    Architecture in Practice, Addison Wesley, 1998.
  • Hofmeister, C., Nord, R., Soni, D., Applied
    Software Architecture, Addison Wesley, October
    1999.
  • http://www.sei.cmu.edu
  • Survey
  • S. Balsamo, M. Simeoni, "On Transforming UML
    Models into Performance Models", Workshop on
    Transformations in UML, ETAPS 2001, Genova,
    Italy, April 7th, 2001.
  • Software Specification
  • G. Balbo, G. Conte and M. A. Marsan, Performance
    Models of Multiprocessor Systems, Series in
    Computer Systems, The MIT Press, 1986.
  • R. Pooley and P. King, "Using UML to Derive
    Stochastic Process Algebra Models", Proceedings
    15th UK Performance Engineering Workshop, 1999.
  • P. King and R. Pooley, "Derivation of Petri Net
    Performance Models from UML Specifications",
    Proceedings 11th Int. Conf. on Tools and
    Techniques for Computer Performance Evaluation,
    Illinois, 2000.
  • M. Bernardo and R. Gorrieri, "Extended Markovian
    Process Algebra", in Proc. CONCUR '96, LNCS
    No. 1119, Springer-Verlag, 1996, 315-330.
  • M. Bernardo, P. Ciancarini and L. Donatiello,
    "ÆMPA: A Process Algebraic Description Language
    for the Performance Analysis of Software
    Architectures", in WOSP 2000.
  • Our Approach
  • S. Balsamo, P. Inverardi, C. Mangano, "An Approach
    to Performance Evaluation of Software
    Architectures", in IEEE Proc. WOSP'98.
  • S. Balsamo, P. Inverardi, C. Mangano, L. Russo,
    "Performance Evaluation of Software
    Architectures", in IEEE Proc. IWSSD-98.
  • F. Andolfi, F. Aquilani, S. Balsamo, P. Inverardi,
    "Deriving Performance Models of Software
    Architectures from Message Sequence Charts", in
    WOSP 2000.
  • F. Andolfi, F. Aquilani, S. Balsamo, P. Inverardi,
    "On Using Queueing Network Models with Finite
    Capacity Queues for Software Architectures
    Performance Prediction", in Proc. QNET2000.
  • F. Aquilani, S. Balsamo, P. Inverardi,
    "Performance Analysis at the Software
    Architecture Design Level", TR-SAL-32, Technical
    Report, Saladin Project, 2000, to appear in
    Performance Evaluation.

52
Selected bibliography
  • Software Performance
  • H. Gomaa and D. Menasce, "A Method for Design and
    Performance Modeling of Client-Server Systems",
    IEEE Transactions on Software Engineering, 2000.
  • H. Gomaa and D. Menasce, "Design and Performance
    Modeling of Component Interconnection Patterns
    for Distributed Software Architectures", in
    WOSP 2000.
  • V. Cortellessa, R. Mirandola, "Deriving a
    Queueing Network Based Performance Model from UML
    Diagrams", in WOSP 2000.
  • D. C. Petriu, X. Wang, "From UML Description of
    High-Level Software Architecture to LQN
    Performance Models", in AGTIVE'99, LNCS 1779,
    Springer-Verlag, 2000.
  • D. C. Petriu, C. Shousha, A. Jalnapurkar,
    "Architecture-Based Performance Analysis Applied
    to a Telecommunication System", in IEEE Trans. on
    Software Engineering, 2000.
  • R. Pooley, "Software Engineering and Performance:
    A Roadmap", in The Future of Software
    Engineering, A. Finkelstein (Ed.), 22nd ICSE, 2000.
  • R. Pooley and P. King, "The Unified Modeling
    Language and Performance Engineering", IEE
    Proceedings - Software, 146, 1 (February 1999).
  • C. U. Smith, Performance Engineering of
    Software Systems, Addison-Wesley Publishing
    Company, 1990.
  • C. U. Smith and L. G. Williams, "Software
    Performance Engineering: A Case Study Including
    Performance Comparison with Design Alternatives",
    IEEE Trans. on Software Engineering, Vol. 19, No.
    7, 720-741, July 1993.
  • C. U. Smith and L. G. Williams, "Performance
    Evaluation of Software Architectures", in WOSP
    1998.
  • M. Woodside, C. Hrischuk, B. Selic, S. Bayarov,
    "A Wideband Approach to Integrating Performance
    Prediction into a Software Design Environment",
    in WOSP 1998.
  • M. Woodside, "Software Performance Evaluation by
    Models", in Performance Evaluation (G. Haring, C.
    Lindemann, M. Reiser, Eds.), LNCS 1769, 283-304,
    2000.