Transcript and Presenter's Notes

Title: General Observations


1
General Observations
  • Importance of Data
    • common glue underlying all tool components
    • precise specification, storage, handling, and management are important
    • database technology? (see the data-record sketch below)
  • Differences in Content
    • prof vs. gprof vs. trace of profiles vs. event traces
  • Incompleteness of Data
    • tool must be able to handle incomplete data
    • needs to know how to get missing / additional information
  • Conflicting / Contradictory Data
    • profiles vs. traces
    • 1st run vs. 2nd run
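As a loose illustration of the "common glue" point above, a minimal Java sketch of one uniform record type that could hold profile, trace, and counter data alike; the names (PerfRecord, Kind, complete) are assumptions made up for this sketch, not an existing design.

// Hypothetical sketch: one uniform record type as the "common glue"
// for heterogeneous performance data (profiles, traces, counters).
import java.util.Map;

public class PerfRecord {
    public enum Kind { PROFILE, TRACE_OF_PROFILES, EVENT_TRACE, HW_COUNTER }

    public final Kind kind;                  // what sort of data this is
    public final String source;              // e.g. "prof", "gprof", a tracer
    public final Map<String, Double> values; // metric name -> measured value
    public final boolean complete;           // tools must cope with missing data

    public PerfRecord(Kind kind, String source,
                      Map<String, Double> values, boolean complete) {
        this.kind = kind;
        this.source = source;
        this.values = values;
        this.complete = complete;
    }
}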

2
Experiments
[Diagram: experiment cycle with prepare / setup, execute / run, control, and analyze phases, all connected to a central DB]
  • Types of experiments (see the cycle sketch below)
    • multi experiments (series of experiments)
    • macro experiments (off-line tool; full program execution)
    • micro experiments (on-line tool; one analysis loop)
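A minimal sketch of the experiment cycle from the diagram above, assuming hypothetical ExperimentDB and Experiment interfaces; it only shows how a macro experiment would pass through prepare, run, and analyze around a shared DB.

// Hypothetical sketch of the experiment cycle on this slide:
// prepare/setup -> execute/run (under control) -> analyze, all via a shared DB.
public class ExperimentCycle {

    interface ExperimentDB {                 // shared store used by all phases
        void store(String key, Object data);
        Object load(String key);
    }

    interface Experiment {
        void prepare(ExperimentDB db);       // setup: instrumentation, input data
        void run(ExperimentDB db);           // controlled execution, raw data into DB
        void analyze(ExperimentDB db);       // turn raw data into results / answers
    }

    // A macro experiment runs the full cycle once; a micro experiment
    // (on-line tool) would iterate run/analyze inside one execution.
    static void macroExperiment(Experiment e, ExperimentDB db) {
        e.prepare(db);
        e.run(db);
        e.analyze(db);
    }
}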

3
Experiment Classification Space
  • Technology (how)
    • measurement
      • profiling
        • sampling, measured, ...
      • traces of profiles
      • event tracing
    • modeling
      • technique
        • analytical, simulation, ...
      • goal
        • extrapolation, detailed analysis, ...
  • Metrics (what)
    • time
    • HW counters, ...
  • Coverage (where / when)
    • location
      • machine
      • process
      • thread
    • program structure (regions)
      • program
      • module / file
      • function
      • basic block
      • statement
    • time
      • complete execution
      • partial
  (A sketch of this classification space follows below.)
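The classification space above could be recorded per experiment as one value per dimension; a sketch using made-up enum names, under the simplifying assumption that each experiment takes exactly one value in each dimension.

// Hypothetical sketch of the classification space as plain Java enums:
// technology (how), metrics (what), coverage (where / when).
public class ExperimentClassification {
    enum Technology { PROFILING_SAMPLED, PROFILING_MEASURED, TRACE_OF_PROFILES,
                      EVENT_TRACING, MODEL_ANALYTICAL, MODEL_SIMULATION }
    enum Metric { TIME, HW_COUNTER }
    enum LocationLevel { MACHINE, PROCESS, THREAD }
    enum RegionLevel { PROGRAM, MODULE, FUNCTION, BASIC_BLOCK, STATEMENT }
    enum TimeCoverage { COMPLETE_EXECUTION, PARTIAL }

    final Technology how;
    final Metric what;
    final LocationLevel whereLocation;
    final RegionLevel whereRegion;
    final TimeCoverage when;

    ExperimentClassification(Technology how, Metric what, LocationLevel loc,
                             RegionLevel region, TimeCoverage when) {
        this.how = how;
        this.what = what;
        this.whereLocation = loc;
        this.whereRegion = region;
        this.when = when;
    }
}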

4
[Diagram: build and run workflow: source repository with repository control system, source code, compilation system, executable, input data, output data]
5
[Diagram: modeling workflow: target config model, aspect model, extrapolation, model analysis, modeling results]
6
(No Transcript)
7
Project / Subtask Description Form
  • Name
  • Description
    • What part of the analysis process does it automate / support?
  • Related / already existing tools / technology
    • uniqueness w.r.t. these technologies
    • reuse
  • What market does it serve?
  • Extent
    • student research project
    • PhD
    • research project
    • ESPRIT RTD project
  • plus duration (person-years)

8
Project / Subtask Description Form (cont.)
  • Input / Output specifications / requirements (external interface)
    • APIs
    • standards which can be used
    • possible standards which should be defined
  • Supported platforms
    • programming languages
    • programming models
    • machines
  • Evaluation of success
    • validation suite
    • benchmark suite
    • metrics

9
Theory (Methodology) ToDo List
  • T1) Properties: relationship structure (tree? graph?); structure-walking algorithms to identify experiments that satisfy queries (see the walking sketch after this list)
  • T2) Refine the "confidence" concept (data model update / refinement); explore uses / association of the quantity of data
  • T3) Experiment description language for expressing queries / answers from experiments, abstracted away from the details of how they are answered
  • T4) Add distributed computing model (client/server, multi-tier, ...)
  • T5) Add hierarchic model (MPI + OpenMP)
  • T6) Add event trace model to the data model
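For T1, a sketch of a tree-shaped property structure with a depth-first walk that collects the experiments needed to satisfy a query; PropertyTree and experimentsFor are hypothetical names, and a graph-shaped structure would additionally need cycle handling.

// Hypothetical sketch for T1: walking a property structure (here a simple
// tree) to find the experiments that can answer a query about a property.
import java.util.ArrayList;
import java.util.List;

public class PropertyTree {
    final String property;                               // e.g. "load imbalance"
    final List<String> experiments = new ArrayList<>();  // experiments answering it
    final List<PropertyTree> children = new ArrayList<>(); // refinements

    PropertyTree(String property) { this.property = property; }

    // Depth-first walk: collect all experiments needed to satisfy a query
    // about the given property or any of its refinements.
    static List<String> experimentsFor(PropertyTree node, String query) {
        List<String> result = new ArrayList<>();
        if (node.property.equals(query)) {
            collect(node, result);
        } else {
            for (PropertyTree child : node.children) {
                result.addAll(experimentsFor(child, query));
            }
        }
        return result;
    }

    private static void collect(PropertyTree node, List<String> out) {
        out.addAll(node.experiments);
        for (PropertyTree child : node.children) collect(child, out);
    }
}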

10
Implementation Projects
  • I1) Tool for checking performance properties with the data access model (implementing properties as Java classes). Validate by hand on a "trivial" program. (See the property sketch after this list.)
  • I2) "Demonstration Prototype"
    • initial implementation of a very simple experiment cycle
    • I2a) static analyzer / constraints / prof/gprof-like measurements
    • I2b) based on message passing (MPI?) tracing
  • I3) "Prototype 2nd Generation"
    • add experiment management, optimization planning
    • validate on "trivial" programs
  • I4) query → answer by experiment using existing tools; main topic: tool interoperability, wrappers, "standard" APIs, ...
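For I1, the slide asks for properties implemented as Java classes; a sketch of what such a property interface and one concrete property could look like. ExperimentData, severity, and confidence are assumed names, not a defined API, and the 10% threshold is an arbitrary example.

// Sketch for I1: a performance property as a Java class, checked
// against a minimal (assumed) data access model.
public interface PerformanceProperty {

    interface ExperimentData {                      // minimal data access model
        double metric(String name, String region);  // e.g. time in a code region
    }

    boolean holds(ExperimentData data);             // does the property apply?
    double severity(ExperimentData data);           // how serious is it?
    double confidence(ExperimentData data);         // how sure, given (incomplete) data?
}

// Example property: a noticeable share of total time spent in MPI calls.
class CommunicationOverhead implements PerformanceProperty {
    public boolean holds(ExperimentData d) {
        return severity(d) > 0.1;                   // arbitrary 10% threshold
    }
    public double severity(ExperimentData d) {
        return d.metric("time", "MPI") / d.metric("time", "main");
    }
    public double confidence(ExperimentData d) {
        return 1.0;                                 // trivial: assumes complete data
    }
}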

11
Implementation Projects (cont.)
  • I5) "Prototype 3nd Generation"
  • I5a) use machine learning techniques
  • I5b) prioritization of constraints
    (predicted data may be replaced by
    monitor data later)
  • I6) User Interface Design
  • I7) user guidance ("wizard") for existing
    performance tools
  • I8) automatic instrumentation wrap existing
    instrumenters (compiler tools)
    providing "standard" instrumentation interface
  • I9) automatic experiment support repeated
    execution of applications with varying
    CPU numbers / input data to determine speedup
    /scalability / stability of results
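For I9, a sketch of an automatic experiment driver that reruns an application with varying CPU counts and derives speedup from measured run times; the runner passed to it is a toy stand-in for actually launching and timing the application.

// Hypothetical sketch for I9: rerun the same application with varying CPU
// counts and compute speedup from the measured run times.
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.IntToDoubleFunction;

public class ScalabilityExperiment {

    // runner maps a CPU count to a measured run time in seconds
    // (a real tool would launch and time the application here).
    static Map<Integer, Double> speedups(int[] cpuCounts, IntToDoubleFunction runner) {
        Map<Integer, Double> result = new LinkedHashMap<>();
        double baseline = -1.0;
        for (int cpus : cpuCounts) {
            double time = runner.applyAsDouble(cpus);
            if (baseline < 0) baseline = time;      // first run is the reference
            result.put(cpus, baseline / time);      // speedup relative to first run
        }
        return result;
    }

    public static void main(String[] args) {
        // Toy runner with ideal scaling, just to show the driver loop.
        Map<Integer, Double> s = speedups(new int[] {1, 2, 4, 8}, cpus -> 100.0 / cpus);
        System.out.println(s);                      // {1=1.0, 2=2.0, 4=4.0, 8=8.0}
    }
}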

12
Implementation Projects (cont.)
  • I10) automatic location of bottlenecks
    • I10a) based on prof/gprof-like measurements
    • I10b) based on message passing (MPI?) tracing
  • I11) automatic comparison of different experiments of the same program: detection of the main differences, summarizing results (see the comparison sketch below)
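For I11, a sketch of the comparison step: given metric maps from two runs of the same program, report the metrics whose relative change exceeds a threshold; the map layout and the threshold are illustrative assumptions.

// Hypothetical sketch for I11: compare two experiments of the same program
// and report the metrics whose values differ the most.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class ExperimentComparison {

    // Returns the metric names whose relative change exceeds the threshold.
    static List<String> mainDifferences(Map<String, Double> run1,
                                        Map<String, Double> run2,
                                        double threshold) {
        List<String> differences = new ArrayList<>();
        for (Map.Entry<String, Double> e : run1.entrySet()) {
            Double other = run2.get(e.getKey());
            if (other == null) continue;            // metric missing in 2nd run
            double rel = Math.abs(e.getValue() - other)
                         / Math.max(Math.abs(e.getValue()), 1e-9);
            if (rel > threshold) differences.add(e.getKey());
        }
        return differences;
    }
}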