1
Model Driven Techniques for Evaluating QoS of Middleware Configurations
Arvind S. Krishna, Emre Turkay, Andy Gokhale, Douglas C. Schmidt
Institute for Software Integrated Systems (ISIS), Vanderbilt University, Nashville, TN 37203
IEEE Real-Time and Embedded Technology and Applications Symposium (RTAS 2005), San Francisco, California
2
Presentation Summary
  • Component middleware technologies
  • Focus on business logic
  • Automate the plumbing code to configure & deploy middleware
  • Components encapsulate business logic
  • Difficulty in provisioning & deploying
  • Error-prone task of handcrafting XML
  • Model-Driven Development (MDD) & generative technologies
  • Focus is on
  • Modeling: system composition technique
  • Validating: correct-by-construction
  • Generating: deployment & configuration info for multiple layers of middleware
  • Supports configuring, provisioning, & deploying Quality of Service (QoS)-enabled middleware

This presentation addresses key configuration & QoS evaluation challenges of middleware for DRE applications
3
Motivating DRE Application
  • Robot Assembly Application
  • Human Machine Interface (HMI) Component: human accepts/rejects the watch
  • Management Work Instructions (MWI) Component: decides what action to perform on the watch, e.g., set the appropriate time
  • Watch Setting Manager (WSM) Component: executes the action on every watch
  • Pallet Conveyor Manager (PCM) Component: assembly line that moves watches from source to destination
  • Robot Manager Component: robotic arm that moves the watches
  • Goal
  • Increase the number of items processed by minimizing end-to-end latency

4
Robot Assembly Challenges (1/2)
  • Configuration Challenges
  • Map component-level feature requirements to middleware configurations
  • WSM component interacts with the HMI & Pallet Conveyor Manager components
  • Configuring component properties
  • Configuring package properties
  • Configuring underlying middleware

(Figure: configuration hooks exposed by the underlying middleware)
  • Hook for the request demuxing strategy
  • Hook for the marshaling strategy
  • Hook for the event demuxing strategy
  • Hook for the concurrency strategy
  • Hook for the connection management strategy
  • Hook for the underlying transport strategy
5
Robot Assembly Challenges (2/2)
  • Configuration Evaluation Challenges
  • How do we make sure the chosen middleware configurations achieve the overall goal of the system, i.e., minimizing the end-to-end latency of the overall system?
  • What configurations of the middleware hosting the HMI & WSM components lead to the best end-to-end latency?

6
Research Challenges
Ensuring syntactically & semantically valid middleware configurations
Understanding consequences of deployment
decisions on overall QoS
Alleviating accidental complexities in
evaluating/ benchmarking QoS
www.dre.vanderbilt.edu/cosmic
7
Generic Modeling Environment (GME)
www.isis.vanderbilt.edu/Projects/gme/default.htm
  • Tool Developer (Metamodeler)
  • GME is used to develop a domain-specific graphical modeling environment
  • Defines the syntax, static semantics, & visualization of the environment
  • Semantics are implemented via interpreters
  • Application Developer (Modeler)
  • Uses a specific modeling environment (created by the metamodeler w/GME) to build applications
  • The interpreter produces something useful from the models
  • e.g., code, simulations, configurations

8
Resolving Configuration Challenges (1/2)
  • Context
  • Different middleware implementations provide different mechanisms for configuring the middleware
  • CIAO provides service configuration options (e.g., via svc.conf files) to tune middleware performance
  • www.dre.vanderbilt.edu/CIAO.html
  • Problem
  • This approach is error-prone since developers:
  • Need to know the syntax
  • Need to remember the names of strategies
  • Need to know which strategies are compatible (an example hand-written configuration is sketched below)
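
To make the problem concrete, the following is a minimal sketch of the kind of service configuration file (svc.conf) a developer would otherwise write by hand. The directives shown are representative TAO/CIAO service-configurator options; the exact factory names, option flags, and legal values depend on the TAO/CIAO release, so treat this purely as an illustration.

# Hand-written svc.conf sketch (representative options only; names & values vary by release)
static Resource_Factory        "-ORBReactorType select_mt"
static Server_Strategy_Factory "-ORBConcurrency thread-per-connection"
static Client_Strategy_Factory "-ORBClientConnectionHandler RW"

Misspelling an option, choosing a non-existent strategy name, or combining strategies that do not work together is typically only discovered at run time, which is exactly the class of errors the OCML approach on the next slide catches at model construction time.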

9
Resolving Configuration Challenges (2/2)
  • Solution
  • Developed a domain-specific modeling language for
    TAO/CIAO called Options Configuration Modeling
    Language (OCML)
  • OCML is used by
  • Middleware developer to design the configuration
    model
  • Application developer to configure the middleware
    for a specific application
  • OCML metamodel is platform-independent
  • OCML models are platform-specific
  • Generates a Wizard to set configuration options
    and provides documentation for each option
  • OCML ensures syntactic & semantic validity of middleware configurations
  • Errors are detected at model construction time

10
Resolving Evaluation Challenges (1/3)
  • Context
  • Component integrators must make appropriate
    deployment decisions, including identifying the
    entities (e.g., CPUs) of the target environment
    where the packages will be deployed

(Figure: candidate deployment of the Pallet Conveyor Manager, Human Machine Interface, Watch Setting Manager, and Robot Manager components onto target nodes)
Problem: How do we ensure that a particular deployment & configuration meets the QoS requirements?
  • How do we simulate background load for benchmarking? (a rough sketch follows below)
  • How do we measure & monitor QoS for a given deployment?
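
As a rough illustration of what a background load generator does (BGML synthesizes this kind of scaffolding automatically, as shown on the following slides), here is a generic, self-contained C++ sketch. It is not the tool-generated code: invoke_accept_work_order() is a hypothetical stand-in for the real remote invocation on a component, and the request rate is an arbitrary example value.

// Illustrative sketch only (not BGML-generated code): drive a fixed-rate
// background load against a hypothetical operation while a benchmark runs.
#include <atomic>
#include <chrono>
#include <thread>

// Hypothetical stand-in for the real remote call on the component under load.
void invoke_accept_work_order () { /* remote invocation would go here */ }

std::atomic<bool> stop_load {false};

void background_load (int calls_per_second)
{
  const auto period = std::chrono::microseconds (1000000 / calls_per_second);
  while (!stop_load.load ())
  {
    invoke_accept_work_order ();           // generate one unit of background load
    std::this_thread::sleep_for (period);  // pace the request rate
  }
}

int main ()
{
  std::thread loader (background_load, 100);  // 100 calls/sec of background load
  // ... run the foreground latency benchmark here ...
  stop_load.store (true);
  loader.join ();
  return 0;
}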
11
Resolving Evaluation Challenges (2/3)
  • Solution
  • Provide a model-driven tool-suite to empirically evaluate & refine configurations to maximize application QoS
  • BGML Workflow
  • End-user composes the scenario in the BGML modeling paradigm
  • Associates QoS properties with this scenario, such as latency, throughput, or jitter
  • Synthesizes the appropriate test code to run the experiment & measure the QoS
  • Feeds metrics back into the models to verify whether the system meets the appropriate QoS at design time
  • The tool enables synthesis of all the scaffolding code required to set up, run, & tear down the experiment
  • Using BGML it is possible to synthesize
  • Benchmarking code
  • Component implementation code
  • Build files & component IDL files

12
Resolving Evaluation Challenges (3/3)
template <typename T>
void Benchmark_AcceptWorkOrderResponse<T>::svc (void)
{
  ACE_Sample_History history (5000);
  ACE_hrtime_t test_start = ACE_OS::gethrtime ();
  ACE_UINT32 gsf = ACE_High_Res_Timer::global_scale_factor ();

  for (int i = 0; i < 5000; ++i)
  {
    ACE_hrtime_t start = ACE_OS::gethrtime ();
    (void) this->remote_ref_->AcceptWorkOrderResponse (arg0, arg1);
    ACE_CHECK;
    ACE_hrtime_t now = ACE_OS::gethrtime ();
    history.sample (now - start);
  }
}
  • BGML allows composition of the target interaction scenario & auto-generates the benchmarking code
  • Each configuration option can then be tested to identify the configuration that maximizes the QoS for the scenario (see the post-processing sketch below)
  • These empirically refined configurations can be reused across applications in similar/same application domains
  • These configurations can be viewed as Configuration & Customization (CC) patterns
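
The generated benchmark above records each call's latency into an ACE_Sample_History. Below is a minimal, ACE-independent sketch of how the recorded samples for each candidate configuration could then be summarized and compared; the configuration labels and numbers are purely hypothetical, patterned after the G/H/I/J labels used in the results slides.

// Generic post-processing sketch (not the tool-generated code): summarize the
// recorded latency samples per configuration and pick the lowest mean latency.
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <map>
#include <numeric>
#include <string>
#include <vector>

using Samples = std::vector<std::uint64_t>;  // per-call latencies in nanoseconds

double mean_ns (const Samples &s)
{
  return std::accumulate (s.begin (), s.end (), 0.0) / s.size ();
}

int main ()
{
  // One entry per candidate middleware configuration (hypothetical data).
  std::map<std::string, Samples> runs = {
    {"G1,H1,I2,J2", {410, 415, 405}},
    {"G2,H1,I2,J2", {470, 468, 472}},
  };

  auto best = std::min_element (runs.begin (), runs.end (),
    [] (const auto &a, const auto &b)
    { return mean_ns (a.second) < mean_ns (b.second); });

  std::cout << "Lowest mean latency: " << best->first
            << " (" << mean_ns (best->second) << " ns)\n";
  return 0;
}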

13
Need for Tool Integration (MDD Process) (1/2)
  • Problem
  • Using each tool in isolation does not provide
    complete information
  • OCML does not know about performance
  • BGML does not know what the configuration is
  • Context
  • OCML tool resolves accidental complexity in
    configuring components
  • BGML tool resolves accidental complexity in
    evaluating QoS

OCML → correct configuration
BGML → measures critical-path latency
14
Need for Tool Integration (MDD Process) (2/2)
  • Solution → MDD Process
  • MDD process leveraging PICML, OCML, & BGML
  • PICML → model the interaction scenario, deployment, & component configuration
  • OCML → model the middleware hosting the individual components
  • BGML → capture evaluation concerns

(Figure: feedback loop in which candidate configurations are evaluated and the least-latency configuration is selected)
  • Apply MDD process to DRE application scenario to
    answer
  • How does Middleware Configuration affect QoS?
  • How do Deployment decisions affect QoS?

15
MDD Process (1/3)
  • Step 1: PICML Tool
  • PICML is used to generate deployment plan information

(Figure: mapping of components to virtual nodes & process collocation)
  • Step 2: Middleware Configuration
  • OCML models are associated with the implementation artifacts
  • OCML provides a wizard with documentation to configure the artifacts
  • Configures the middleware that hosts the executors, a.k.a. servants in CORBA 2.0

(Figure: OCML wizard showing the artifact, the option selection, and the documentation pane)
16
MDD Process (2/3)
  • Step 2 → Choosing Configurations
  • How best to configure the middleware hosting the HMI and WSM components to minimize end-to-end latency?
  • Component roles
  • Display component: pure client
  • Watch Manager component: peer role, does not need concurrency
  • For each component (e.g., Display), narrow down the selected configurations
  • Fixed part: determined a priori
  • Dynamic part: cannot be determined without testing (see the enumeration sketch below)

(Figure: configuration space for the HMI & WSM components)
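
Because the dynamic part of the configuration space can only be settled empirically, the combinations to benchmark are simply enumerated. Below is an illustrative, self-contained C++ sketch of that enumeration; the option names G/H/I/J mirror the labels used in the results slides, but the concrete value sets here are placeholders (the real ones come from the OCML model of the middleware options).

// Illustrative enumeration of the dynamic part of the configuration space.
#include <iostream>
#include <string>
#include <vector>

int main ()
{
  // Placeholder value sets; the real options/values come from the OCML model.
  const std::vector<std::string> G = {"G1", "G2"};  // e.g., concurrency strategy
  const std::vector<std::string> H = {"H1", "H2"};  // e.g., demuxing strategy
  const std::vector<std::string> I = {"I1", "I2"};  // e.g., connection strategy
  const std::vector<std::string> J = {"J1", "J2"};  // e.g., transport strategy

  int total = 0;
  for (const auto &g : G)
    for (const auto &h : H)
      for (const auto &i : I)
        for (const auto &j : J)
        {
          // Each combination drives one run of the generated benchmark code.
          std::cout << g << "," << h << "," << i << "," << j << "\n";
          ++total;
        }
  std::cout << total << " combinations to benchmark\n";
  return 0;
}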
  • Step 3 → Capturing QoS Concerns
  • Profile: generate multiple work-orders exchanged between the Watch Setting Manager component and the human (HMI) for acceptance/rejection
  • Use timers to measure end-to-end critical-path latency in the scenario
  • The same code can be used to evaluate different combinations of configurations

17
MDD Process (3/3)
(Figure: generated load generator for the accept operation; time-stamps the send & receive)
Solution
  • Workspace & glue generation
  • Create a workspace and projects to generate build files for the scenario
  • To enact a scenario, this process automates generation of:
  • Deployment plan: XML deployment information
  • svc.conf: configuration for each component implementation
  • Benchmark code: source code for executing the benchmarks
  • IDL & CIDL files
  • Build files: MPC files (www.ociweb.com); a sketch follows below

(Figure: generated workspace containing the RobotManager, WatchSettingManager, PalletteConveyorManager, HumanMachineInterface, and ManagementWorkInstructions projects and their artifacts)
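
To give a flavor of the generated build files, the following is a hand-written sketch of an MPC workspace (.mwc) and one project (.mpc) for the scenario. The file names, project name, and the base project it inherits from are assumptions for illustration, not the exact output of the tool-suite.

// robot_assembly.mwc -- sketch of an MPC workspace listing the generated projects
workspace {
  RobotManager
  WatchSettingManager
  PalletteConveyorManager
  HumanMachineInterface
  ManagementWorkInstructions
}

// RobotManager.mpc -- sketch of one project; the 'ciao_executor' base is assumed
project(RobotManager_exec) : ciao_executor {
  Source_Files {
    RobotManager_exec.cpp
  }
}

MPC consumes files like these to generate platform-specific build artifacts (e.g., GNU makefiles or Visual Studio projects).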
18
Experimental Results / Highlights (1/3)
Automation / Code Generation
DRE Experimental Scenario | Total Files/Lines of Code Required | Automated by MDD Process
Robot Assembly            | 65 files (includes IDL/CIDL)       | 60 files auto-generated (script files not generated yet)
BasicSP                   | 54 files (includes IDL/CIDL)       | 49 files auto-generated
  • Experiment Execution
  • In total, we conducted 64 experiments for different combinations of the Human Machine Interface & Watch Setting Manager components
  • The latency measures were tabulated to look for
    the configuration that minimized latency
  • Corresponding end-to-end measures were also
    checked

Automated execution of experiments: scripts used to set up & tear down the experiments
19
Experimental Results / Highlights (2/3)
  • Observations
  • Similar configurations affected QoS similarly
  • In both cases we observed that (G1, H1, I2, J2) minimized latency the most
  • Both cases showed that G is the most important configuration option
  • The penalty for not setting G to G1 is 4 µsecs in BasicSP & 60 µsecs in RobotAssembly
  • The other options are not important, i.e., setting them or leaving them at their defaults leads to the same behavior
  • The figure shows a visualization of the configuration space
  • Circles represent points in the configuration space
  • Edges represent the performance degradation incurred by moving from one point to another

Defining operating regions enables setting the more important configuration options while allowing flexibility in the others
20
Experimental Results / Highlights (3/3)
  • How does the platform affect QoS?
  • Provides feedback on the deployment plan, i.e., the component-to-node mappings
  • BasicSP scenario
  • Tried two combinations, as shown in the table
  • Process
  • No changes required from the earlier experiment; capture the same end-to-end latency
  • Change the component-to-node mapping to re-generate the deployment plan
  • Observe & tabulate latency changes
  • In real-time systems, component placement is decided a priori; software is tied to the hardware
  • During failure
  • It is important to decide where to place components to ensure QoS
  • This process aids in making this decision

21
Concluding Remarks
  • The MDD process provides a flexible, model-based approach for evaluating the QoS of middleware configurations
  • Auto-generates most of the code required to run the experiments
  • OCML does not automatically generate the configuration space
  • The scripts for automatically evaluating different configurations were not generated
  • Feedback to the planner allows refinement of configurations during the testing phase
  • Our future work
  • Emulab/ns-style script generation for easy simulation
  • Strategies for interfacing with higher-level performance monitoring tools
  • Identifying patterns in configurations to allow mapping features directly onto middleware configurations

22
Downloading the Middleware Tools
  • Beta & stable releases can be accessed from http://www.dre.vanderbilt.edu/Download.html

OCML & BGML are part of the CoSMIC MDD tool suite
  • http://www.dre.vanderbilt.edu/cosmic