Title: GridBench: A Tool for Benchmarking Grids
1. GridBench: A Tool for Benchmarking Grids
- George Tsouloupas, Marios Dikaiakos
- High Performance Computing Lab
- University of Cyprus
- {georget,mdd}@ucy.ac.cy -- http://grid.ucy.ac.cy
2. Overview
- Benchmarking, Challenges and Users
- Related Work
- Our approach to performance evaluation
- GridBench Architecture and Metadata
- The Tool Interface
- Results
- Work in progress
- Conclusion
3. Challenges
- Heterogeneous systems
- Hardware, software and configuration
- Non-exclusive use of resources
- Dynamic environment
- Distinct administrative domains
- Find resources, execute benchmarks, collect and interpret results
- In short: too many variables.
4. Benchmark Users
- End-users
- Need to know the capabilities of resources when running similar codes.
- Developers
- Develop and tune applications
- Compare job submission services, resource allocation policies, scheduling algorithms
5. Benchmark Users (cont'd)
- Architects/Administrators
- Improve system design
- Detect faults and misconfigurations (indicated by unexpected results)
- Compare implementations/systems
- Researchers
- Benchmarks can give insight into how Grids work and perform
- Could provide a better understanding of the nature of Grids in general.
6. Related Work
- Probes: Benchmark Probes for Grid Assessment, Chun et al., 2003
- Grid Benchmarking Research Group (Global Grid Forum): Specification Version 1.0, Wijngaart and Frumkin (CIGB etc.)
- Benchmarks for Grid Computing, Snavely et al., 2003
- GridBench (CrossGRID - WP2)
- Prototype Documentation for GridBench version 1.0, 2003
- Software Requirements Specification for GridBench, version 1.1, 2002
- GridBench: A Tool for Benchmarking the Grid, Tsouloupas and Dikaiakos, 2003
7. Our requirements of a Grid benchmarking tool
- Make it easy to conduct experiments
- Allow the measurement of different aspects of the system's performance (micro-benchmarks, probes, application benchmarks)
- Maintain a history of measurements
- Accommodate retrieval and comparison of results
- Collect monitoring information to help with result interpretation
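The "history of measurements" and "retrieval and comparison" requirements can be sketched as a tiny in-memory archive. All names here are illustrative assumptions; GridBench itself archives results as XML (see the later slides).

```python
# Illustrative sketch of a result archive that records benchmark runs
# and compares resources on their mean metric value. Hypothetical
# names; not the actual GridBench implementation.
from statistics import mean

class ResultArchive:
    def __init__(self):
        self._runs = []          # each run: (benchmark, resource, value)

    def record(self, benchmark, resource, value):
        self._runs.append((benchmark, resource, value))

    def compare(self, benchmark):
        """Mean metric value per resource for one benchmark."""
        per_resource = {}
        for b, r, v in self._runs:
            if b == benchmark:
                per_resource.setdefault(r, []).append(v)
        return {r: mean(vs) for r, vs in per_resource.items()}

archive = ResultArchive()
archive.record("epstream", "ce1.grid.ucy", 1200.0)
archive.record("epstream", "ce1.grid.ucy", 1180.0)
archive.record("epstream", "ce2.grid.ucy", 950.0)
summary = archive.compare("epstream")
print(summary)  # mean bandwidth per Computing Element
```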
8. Grid Infrastructure Architecture
[Diagram: a Virtual Organization spans a Wide Area Network connecting Central Services (VO, Resource Broker, etc.) to multiple Sites; each Site contains Computing Elements and Storage Elements, and each Computing Element manages a set of Worker Nodes.]
9. A layered approach to benchmarking Grids: Infrastructure viewpoint
- Layers: Individual Resources (cluster nodes, Storage Elements), Sites (clusters, SMPs), Grid Constellations (multiple sites, VOs)
- Conjecture: a layered approach provides a more complete perspective on the system under study.
10. A layered approach to benchmarking Grids: Software viewpoint
- Micro-benchmarks -- isolate basic performance characteristics
- Micro-kernel benchmarks -- synthetic codes
- Application benchmarks -- derived from real applications
11. GridBench: A Tool for Benchmarking Grids
- Provides a simple scheme for specifying benchmark executions
- Provides a set of benchmarks to characterize Grids at several levels
- Provides mechanisms for executing benchmarks and collecting the results
- Archives benchmark specifications and results for comparison and interpretation
- Provides simple result management tools
- Provides a user interface for the above
12. Software Architecture Perspective
13. Software Architecture
14. GridBench Meta-data
[UML diagram: a benchmark aggregates components, metrics and monitors; components carry parameters (with valueVectors and constraints), corequisite/prerequisite relations, their own monitors and metrics, and location/resource bindings.]
- GridBench Definition Language (GBDL, XML-based)
- Definitions and results co-exist (in the archive) in the same structure
- Intermediate form that allows for easy transformation to different job description formats
15. GridBench Definition Language example
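The example on this slide was a screenshot; as a stand-in, the shape of a GBDL-like document can be sketched with the standard library. Element names follow the meta-data slide (benchmark, component, parameter, location, resource); the exact GBDL syntax is an assumption, not the published schema.

```python
# Hypothetical GBDL-like document built with ElementTree; element
# names come from the GridBench meta-data slide, the concrete syntax
# is assumed for illustration.
import xml.etree.ElementTree as ET

bench = ET.Element("benchmark", name="epstream")
comp = ET.SubElement(bench, "component", id="0")
param = ET.SubElement(comp, "parameter", name="arraySize")
param.text = "1000000"
loc = ET.SubElement(comp, "location")
ET.SubElement(loc, "resource").text = "ce1.grid.ucy"

gbdl = ET.tostring(bench, encoding="unicode")
print(gbdl)
```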
16. Archival/Publication of Results
- Earlier versions
- Published to the local MDS: easy access by users and schedulers
- Recent versions
- Benchmark results are archived in a native XML database (Apache Xindice)
- Flexibility
- Allows for statistical analysis of results
- Benchmark results are associated with:
- the GBDL definition -- results are meaningless without the specific parameters
- monitoring data -- comprehension/analysis of results is enhanced when combined with monitoring data
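Xindice answers XPath queries over stored XML documents. As a minimal stand-in (not the Xindice API), the same style of query can be run over an in-memory results document with the standard library; the element and metric names are assumptions.

```python
# Stand-in for an XPath query against the XML result archive; the
# document layout and metric names are illustrative assumptions.
import xml.etree.ElementTree as ET

archive = ET.fromstring("""
<results>
  <benchmark name="epstream"><metric name="bandwidth">1200</metric></benchmark>
  <benchmark name="epwhetstone"><metric name="mips">800</metric></benchmark>
</results>
""")

# XPath: all metric values recorded for the epstream benchmark
values = [m.text for m in
          archive.findall(".//benchmark[@name='epstream']/metric")]
print(values)
```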
17. The Tool: The Definition Interface
1. Pick a benchmark
2. Configure it
18. The Generated GBDL
19. The Browsing Interface
[Screenshot callouts: list of benchmark executions, query, metrics from selected executions]
20. Result management tools
[Screenshot callouts: metrics from selected executions (can be used to compare similar metrics); drag & drop]
21. Results from EPStream (screenshots)
- EPStream submitted to three Computing Elements; two EPStream submissions to cluster.ui.sav.sk
- Different colors represent different Worker Nodes
- Measures memory bandwidth in MB/s
22. Results from MPPTest
- MPPTest (blocking) submitted to three Computing Elements
- Three MPPTest submissions to apelatis.grid.ucy.ac.cy using 2 and 4 nodes
23. Results from High Performance Linpack
- Nine HPL executions on cluster.ui.sav.sk using various parameters and numbers of nodes
24. Summary
- Layered approach to Grid performance evaluation
- GridBench Definition Language
- Definition of how and where benchmarks will run
- Automatic generation of job descriptions
- Utility components
- Ease execution and collection of results
- Result management
- GUI tool for running/browsing
- Easy execution on Grid resources
- Initial set of results
25. Conclusion
- The mechanism/meta-data for defining and executing the benchmarks makes it very easy to take measurements.
- XML storage of definitions and results proved rather complicated to query, but quite flexible.
- The tool prototype is in place, being tested, has provided some initial results, and is ready for the next revision.
- Porting benchmarks to the Grid is not as straightforward as anticipated (heterogeneity of resources, configuration, libraries).
- Benchmarks are a great tool for detecting flaws in hardware, software and configuration.
26. Work in Progress
- Complete the GBDL specification, possibly building upon the work of the GGF Job Submission Description Language working group
- Implementation of more benchmarks, focusing on application-based benchmarks (CrossGrid and other applications)
Future Work
- Integration with monitoring tools
- Result interpretation and tools to assist interpretation
27. Acknowledgments
- Funded by [logo]
- Part of [logo]
- In cooperation with [logo]
- Many thanks to Dr. Pedro Trancoso, University of Cyprus.
28. Questions, Comments, Suggestions
http://grid.ucy.ac.cy
Thank you.
29. Additional Slides
30. Translation to JDL/RSL
- XML-based GBDL to job description
- Support for simple jobs can be through the use of simple templates (executable, parameters and locations are transformed to simple RSL/JDL)
- Most benchmarks need special command-line parameter formatting, or parameter files
[Diagram: GBDL -> Param Handler -> Translator -> JDL / RSL / ...]
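The template-based translation described above can be sketched as follows. The GBDL element names and the RSL template are illustrative assumptions modeled on the example slide, not the actual GridBench translator.

```python
# Minimal sketch of a template-based GBDL -> RSL translation, assuming
# a GBDL-like layout with one <component> per subjob. Element names
# and the RSL template are illustrative, not the GridBench code.
import xml.etree.ElementTree as ET

RSL_TEMPLATE = (
    '( (resourceManagerContact="{contact}")\n'
    '  (label="subjob {index}")\n'
    '  (count={count})\n'
    '  (arguments="{args}")\n'
    '  (executable="{exe}") )'
)

def gbdl_to_rsl(gbdl_text):
    root = ET.fromstring(gbdl_text)
    subjobs = []
    for i, comp in enumerate(root.findall("component")):
        subjobs.append(RSL_TEMPLATE.format(
            contact=comp.findtext("location/resource"),
            index=i,
            count=comp.get("count", "1"),
            args=comp.findtext("arguments", default=""),
            exe=comp.findtext("executable"),
        ))
    return "\n".join(subjobs)

gbdl = """
<benchmark name="epstream">
  <component count="2">
    <executable>/bin/myexec</executable>
    <arguments>-n 1000</arguments>
    <location><resource>ce1.grid.ucy</resource></location>
  </component>
</benchmark>
"""
rsl = gbdl_to_rsl(gbdl)
print(rsl)
```

Benchmarks needing special command-line formatting or parameter files would plug into a "Param Handler" step before this template expansion.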
31. Translation Example: GBDL to RSL
- RSL:
  ( (resourceManagerContact="ce1.grid.ucy")
    (label="subjob 0")
    (environment=(GLOBUS_DUROC_SUBJOB_INDEX 0))
    (count=2)
    (arguments="-n 1000")
    (executable="/bin/myexec") )
  ( (resourceManagerContact="ce2.grid.ucy")
    (label="subjob 1")
    (environment=(GLOBUS_DUROC_SUBJOB_INDEX 1))
    (count=2)
    (arguments="-n 1000")
    (executable="/bin/myexec") )
32. The Benchmark Suite
- Micro-benchmarks at the Worker-node level
- EPWhetstone: embarrassingly parallel adaptation of the serial Whetstone benchmark
- EPStream: adaptation of the Stream benchmark
- BlasBench: evaluates serial performance of the BLAS routines
- Micro-benchmarks at the Site level
- Bonnie: storage I/O performance
- MPPTest: MPI performance measurements
- Micro-benchmarks at the VO level
- MPPTest: MPI performance measurements (spanning sites)
- gb_ftb: File Transfer Benchmark
- Micro-kernels at the Site level
- High-Performance Linpack
- Selected kernels from the NAS Parallel Benchmarks
- Micro-kernels at the VO level
- Computationally Intensive Grid Benchmarks
33. The Benchmark Suite (cont'd)
- Application-kernel benchmarks at the Site level
- CrossGrid application kernels
- Application-kernel benchmarks at the VO level
- CrossGrid applications
34. Results from EPWhetstone
- EPWhetstone submitted to two Computing Elements; three EPWhetstone submissions to apelatis.grid.ucy.ac.cy
- Different colors represent different Worker Nodes
- Measures Whetstone MIPS