
Transcript and Presenter's Notes

Title: Performance Estimation of Distributed Real-time Embedded Systems by Discrete Event Simulations


1
Performance Estimation of Distributed Real-time
Embedded Systems by Discrete Event Simulations
  • Gabor Madl1, Nikil Dutt1, Sherif Abdelwahed2
  • 1 University of California, Irvine
  • 2 Mississippi State University
  • {gabe, dutt}@ics.uci.edu, sherif@ece.msstate.edu

2
Outline
  • Motivation
    • Distributed Real-time Embedded (DRE) systems
    • Need to combine various analysis methods for performance estimation
  • Modeling DRE systems
  • Formal performance estimation
    • Branching intervals, race conditions
    • Event order tree
  • Real-time Verification
    • On-the-fly construction of the event order tree
  • Evaluation
    • Comparison with random simulations
    • Comparison with timed automata model checking

3
DRE Systems
  • Distributed real-time embedded (DRE) systems are often reactive and event-driven
    • Better latency than in synchronous/time-triggered systems
    • Easier to implement, no need for global synchronization
  • Computations are driven by events, which results in a complex model
    • Asynchrony, concurrency, race conditions
    • Hard to predict all behaviors
  • Performance estimation is a key challenge
    • Task execution times
    • Communication delays
    • Degree of parallelism
    • Throughput

4
Static Analysis Methods
  • Classical scheduling theory [Liu, Layland 1973]
  • SymTA/S [Henia et al. 2005]
    • Application of heterogeneous scheduling methods for component-based systems
  • Modular Performance Analysis [Wandeler et al. 2006]
    • Application of real-time calculus for performance estimation
  • Synchronous languages [Benveniste et al. 2003]
    • Alternative approach for defining real-time constraints in globally synchronous systems with deterministic concurrency
  • Giotto [Henzinger et al. 2003]
    • Time-triggered language for schedulability analysis using the concept of logical execution time

5
Simulations
  • Several variations used by industry (e.g. RTL languages, SystemC, etc.)
  • Various abstraction levels trade off performance and accuracy
  • Ptolemy II [Lee et al. 2001]
    • Complex framework for the heterogeneous modeling of embedded systems
    • Focuses on non-deterministic systems
  • Simulation-based performance estimation method for MPSoCs [Lahiri et al. 2001]
    • Execution traces represented as symbolic graphs
    • Analysis is based on simulations of the symbolic model

6
Model Checking
  • Task timed automata [Ericsson et al. 1999]
    • Approach for deciding non-preemptive schedulability of asynchronous event-driven task graphs
  • Task-level analysis of DRE systems [Madl et al. 2004, 2005, 2006]
    • Extends task timed automata with event channels and schedulers to model distributed real-time embedded systems
  • Thread-level analysis of real-time middleware using timed automata [Subramonian et al. 2006]
    • Detailed analysis of middleware services
    • Low-level abstraction

7
Evaluation of Existing Methods
  • Static analysis methods
    • Often assume independent, periodic tasks
    • Cannot capture dynamic effects such as race conditions
  • Simulations
    • Can show the presence of an error, but never its absence
    • Ad-hoc, hard to measure coverage
    • Limited design space exploration
  • Model checking
    • State space explosion problem
    • No partial results
    • Time consuming and costly
  • Each method has its advantages and disadvantages

8
Need to Combine Methods
  • Providing formal guarantees on real-time properties is often infeasible in distributed asynchronous systems
  • Combine simulations and model checking to increase the coverage of simulations
  • Need to capture dynamic effects for accurate performance estimation
    • Race conditions, non-deterministic execution times, arbitration, etc.
  • Need to capture execution intervals in continuous time, as non-WCET deadline misses are possible in distributed asynchronous systems

9
Model-based Design and Analysis
  • The domain-specific model (DSM) captures key properties of the design
  • The DSM is mapped to a formal executable model that drives the real-time verification and performance estimation
  • The formal executable model can be mapped to heterogeneous models of computation (MoC)
  • The same formal model is used for both performance estimation and real-time verification
  • Parameters are obtained by simulations

10
Model for Real-time Systems
  • DRE = (T, M, C, TR, D)
    • T is the set of tasks
    • M is the set of machines
    • C is the set of channels, C ⊆ T
    • TR is the set of timers, TR ⊆ T
    • D is the task dependency relationship, D ⊆ T × T
    • machine(tk): T → M assigns each task to a machine
  • A task's execution interval is given by [bcet, wcet]
  • Channels are special tasks that model FIFO buffers and delays
  • Deadlines are defined for tasks
  • Timers trigger the execution of tasks periodically
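
A minimal sketch of these model elements as Python data structures (the names and field layout are illustrative assumptions, not the paper's or the DREAM tool's actual API):

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Set, Tuple

@dataclass(frozen=True)            # frozen so tasks can be used as dict/set keys
class Task:
    name: str
    bcet: float                    # best-case execution time
    wcet: float                    # worst-case execution time
    priority: int = 0              # used by the fixed-priority scheduler
    deadline: Optional[float] = None
    is_channel: bool = False       # channels are special tasks (FIFO buffers, delays)

@dataclass(frozen=True)
class Timer:
    period: float
    target: Task                   # task whose execution is triggered periodically

@dataclass
class DREModel:
    tasks: List[Task]                     # T (the channels C are a subset of T)
    machines: List[str]                   # M
    timers: List[Timer]                   # TR
    dependencies: Set[Tuple[Task, Task]]  # D ⊆ T × T, as (producer, consumer) pairs
    machine_of: Dict[Task, str]           # machine: T → M
```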

11
Events
  • The DRE MoC is a continuous-time extension of discrete event systems (DES)
    • Can be mapped to timed automata
  • Event labels are time-stamped values from the domain of non-negative real numbers
  • Global time
    • During simulation, jump to the event with the smallest timestamp
  • A task's execution time is defined as a constraint between the timestamps of its input and output events
  • Tasks, channels, and timers compose using events
    • Event flow follows dependencies
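
A hedged sketch of the resulting simulation step: pending events sit in a priority queue ordered by timestamp, and the simulator always jumps to the smallest one. This reuses the illustrative DREModel structures above and is not the paper's implementation:

```python
import heapq
import itertools

def simulate(model, horizon):
    """Minimal discrete event loop: always jump to the earliest pending event."""
    seq = itertools.count()    # tie-breaker so heap entries always compare
    queue = []                 # entries: (timestamp, seq, label, task)
    for timer in model.timers:
        heapq.heappush(queue, (timer.period, next(seq), "timer", timer.target))
    trace = []
    while queue:
        time, _, label, task = heapq.heappop(queue)   # smallest timestamp first
        if time > horizon:
            break
        trace.append((time, label, task.name))
        if label == "timer":   # re-arm the periodic timer (illustrative lookup)
            t = next(tm for tm in model.timers if tm.target is task)
            heapq.heappush(queue, (time + t.period, next(seq), "timer", task))
        # The output timestamp is constrained to the task's [bcet, wcet] interval;
        # a single run picks one point in it (here: the worst case).
        finish = time + task.wcet
        for producer, consumer in model.dependencies:
            if producer is task:
                heapq.heappush(queue, (finish, next(seq), "input", consumer))
    return trace
```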

12
Task States, Schedulers
  • Abstract model for scheduling
  • A fixed-priority scheduler composes with tasks using events
    • The scheduler triggers the transition from the wait to the run state
    • Channels are not triggered by the scheduler
  • The scheduling policy is defined as priorities between transitions
  • Non-preemptive scheduling
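
A sketch of how such a non-preemptive fixed-priority scheduler could be expressed; the waiting/running bookkeeping is an illustrative assumption, not the paper's scheduler model:

```python
def dispatch(machine, waiting, running):
    """Non-preemptive fixed-priority scheduling: when a machine is idle,
    trigger the wait -> run transition for its highest-priority waiting task.
    waiting: machine -> list of tasks in the wait state
    running: machine -> currently running task, or None
    """
    if running.get(machine) is None and waiting.get(machine):
        task = max(waiting[machine], key=lambda t: t.priority)
        waiting[machine].remove(task)
        running[machine] = task    # emit the wait -> run scheduler event
        return task
    return None                    # a running task is never preempted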

13
Performance Estimation Problem
  • A run or execution trace of the DRE model is the chronological sequence of events that occur in the model
  • In the DRE MoC, two execution traces are equivalent if they contain the same events and the chronological order of the events is the same
  • We define the end-to-end computation time between two events as the maximum possible difference between those events along all possible runs of the model
  • The DRE model implies a partial ordering on events
  • Enumerate equivalent execution traces to obtain the end-to-end computation time
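
These two definitions translate directly into code; a sketch over the (timestamp, label, task) traces produced by the loop above, assuming for simplicity that each event occurs at most once per run:

```python
def equivalent(trace_a, trace_b):
    """Traces are equivalent if they contain the same events in the same
    chronological order; the exact timestamps may differ."""
    strip = lambda tr: [(label, task) for _, label, task in tr]
    return strip(trace_a) == strip(trace_b)

def end_to_end(runs, start_event, end_event):
    """Maximum difference between two events' timestamps over all runs.
    start_event / end_event are (label, task) pairs; assumes each event
    occurs at most once per run."""
    worst = 0.0
    for trace in runs:
        stamp = {(label, task): t for t, label, task in trace}
        worst = max(worst, stamp[end_event] - stamp[start_event])
    return worst
```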

14
Branching Intervals
  • In the examples below, total orderings are as follows:
    • iA, oA, iB, oB, iC, oC
    • iA, iC, oC, oA, iB, oB
    • iA, oA, iB, iC, oB, oC
  • Scheduler events are omitted to keep the example simple
  • More total orderings are possible
  • A total ordering of events implies time constraints on the execution intervals of tasks
    • In the example shown on the left, A's execution time has to be less than 3, otherwise oB will not happen before iC
  • Whenever the order may change, the event order tree has to branch to capture these cases

(Figure: Branching Interval)
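
To make the constraint concrete: suppose task A starts at time 0 with execution interval [1, 5] and event iC is fixed at time 3 (interval values invented for illustration, the slide's figure is not in the transcript). Both orders of oA and iC are then feasible, so the tree must branch. A sketch of the check:

```python
def must_branch(out_earliest, out_latest, other_event_time):
    """A task's output event can fall anywhere in [out_earliest, out_latest].
    If another event's timestamp lies strictly inside that interval, both
    orderings are feasible and the event order tree has to branch."""
    return out_earliest < other_event_time < out_latest

# A starts at 0 with execution interval [1, 5]; iC occurs at time 3.
# oA can precede or follow iC, so this is a branching interval; only the
# sub-tree where A finishes before time 3 has oB occurring before iC.
assert must_branch(0 + 1, 0 + 5, 3)
```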
15
Race Conditions
  • If two tasks receive input events at the same time from tasks that are assigned to different machines, race conditions may be present
    • In the example shown on the right, there can be a race condition between task_C and task_D
    • task_B and task_E are assigned to different machines
  • Race conditions can affect the total ordering of events
  • This problem may result in priority inversion

(Figure: Race Condition)
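
The detection rule from the first bullet can be sketched directly: flag any pair of simultaneous input events whose producers run on different machines (illustrative code over the structures defined earlier):

```python
def race_candidates(pending, machine_of):
    """pending: list of (timestamp, producer, consumer) input events.
    Returns consumer pairs that may race: their input events carry the same
    timestamp but come from producers assigned to different machines."""
    races = []
    for i, (t1, src1, dst1) in enumerate(pending):
        for t2, src2, dst2 in pending[i + 1:]:
            if t1 == t2 and machine_of[src1] != machine_of[src2]:
                races.append((dst1, dst2))   # e.g. (task_C, task_D) on the slide
    return races
```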
16
Event Order Tree
  • Combinations of branching intervals and race conditions are interesting corner cases
  • Extremely hard to find by simulations

(Figure: event order tree annotated with branching points and race conditions)
17
Real-time Verification by DES
  • The event order tree provides a way for the real-time verification of a large class of DRE models
  • The proposed method is not guaranteed to terminate
    • Restriction: the model has to return to the initial state
    • Otherwise use finite-horizon analysis, or use timed automata
  • If the restriction above is satisfied, then the event order tree is repeatable: it repeats itself from all the leaves
    • There is a finite number of nodes in the tree
    • Only one simulation is needed per class of equivalent execution traces
    • Timers generate events at discrete time steps
    • Each task generates a finite number of output events for each input event
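
A sketch of how the repeatability restriction truncates the exploration: once a path returns the model to its initial state, its sub-tree is identical to the root's and need not be expanded again (node.state, node.depth and node.expand() are hypothetical names):

```python
def explore(node, initial_state, visit):
    """Walk the event order tree, cutting off repeatable leaves."""
    visit(node)
    if node.depth > 0 and node.state == initial_state:
        return                      # repeatable leaf: the tree repeats from here
    for child in node.expand():     # one child per feasible next event order
        explore(child, initial_state, visit)
```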

18
On-the-fly Detection of Branching
  • Both the performance estimation and the real-time verification are based on the event order tree
  • The event order tree can be extremely large, so there is a need to construct it on the fly
  • We utilize a DFS-based exploration of the tree, as sketched below
    • Based on repetitive simulations
    • Memory use is not significant
  • Each task detects its branching intervals during DES
  • Generate all permutations of branching intervals
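
A hedged sketch of the overall loop consistent with these bullets (simulate_with_choices and update_end_to_end are hypothetical helpers, not DREAM functions): each run records the branching intervals the tasks detected, and untried orderings are pushed for later runs until all permutations are covered.

```python
def analyze(model, horizon):
    """DFS over the event order tree by repetitive simulations; each choice
    vector forces one ordering at every branching point encountered so far."""
    stack = [()]                    # start with no forced orderings
    seen = set()                    # trace equivalence classes already simulated
    worst = {}                      # end-to-end times observed so far
    while stack:
        choices = stack.pop()
        # hypothetical helper: run one DES pass forcing the given orderings,
        # and report the branching intervals detected by the tasks
        trace, branch_points = simulate_with_choices(model, horizon, choices)
        key = tuple((label, task) for _, label, task in trace)
        if key in seen:
            continue                # one simulation per equivalent trace class
        seen.add(key)
        update_end_to_end(worst, trace)          # hypothetical helper
        for point, n_orders in branch_points:
            for alt in range(1, n_orders):       # push untried alternatives
                stack.append(choices + ((point, alt),))
    return worst                    # partial results are available at any time
```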

19
Analysis Algorithm
20
Case Studies Used for Analysis
  • Set of examples based on real-time CORBA avionics applications
    • Software timers, asynchronous event propagation
    • Thread pool policy follows the half-sync/half-async pattern
  • We have assumed non-preemptive scheduling for comparisons with other methods

21
Compared to Model Checking
  • Model checking is tailored to answer yes/no questions
    • Is the execution time less than x?
  • Storing timed states is a major contributor to memory use
    • Checking for previously visited states is inefficient
  • Realistic models are often too complex
    • Exhaustive analysis is infeasible due to the state space explosion problem
  • Compared performance to UPPAAL and the Verimag IF tool
    • Timed automata model checkers are faster in general
    • On larger models they run out of memory
  • The proposed method is CPU-bound: it provides partial results when exhaustive analysis is infeasible (most of the time)
    • Well suited for multicore/distributed implementation

22
Compared to Random Simulations
  • Random simulations often revisit the same total orderings of events; rare events are hard to find
  • The symbolic simulation model is faster than naïve simulations
    • Simulation speed: 30 ms for 100 tasks, 1-5 ms for 5-15 tasks
  • The DES-based method increases coverage over time
  • Focuses on corner cases
    • Branching intervals
    • Race conditions
    • Their combinations
  • Improves on the existing practice of the ad-hoc combination of directed and random simulations for performance estimation

23
DREAM Analysis Framework
Open-source DREAM tool:
  • Automatic timed automata model generation for the UPPAAL and Verimag IF tools
  • Simulation-based model checking
  • Performance estimation using DES
  • Random testing
  • Schedulability optimizations

http://dre.sourceforge.net
24
Work in Progress / Future Work
  • DREAM was integrated into the Scenery project by Fujitsu Labs of America
    • Analyze real-time properties in embedded software and SoCs
  • Composing with other MoCs
    • System-level performance estimation
    • Wireless multimedia case studies
  • Looking for domain-specific modeling languages with large-scale DRE examples (hard to find)
  • Possible optimizations by hierarchical model checking
  • Distributing the model checker algorithm to increase scalability
    • Not the main focus at the moment

25
Questions?