Title: A METRICS System for Design Process Optimization
1. A METRICS System for Design Process Optimization
- Andrew B. Kahng and Stefanus Mantik
- UCSD CSE and ECE Depts., La Jolla, CA
- UCLA CS Dept., Los Angeles, CA
2. Purpose of METRICS
- Standard infrastructure for the collection and storage of design process information
- Standard list of design metrics and process metrics
- Analyses and reports that are useful for design process optimization

METRICS allows: Collect, Data-Mine, Measure, Diagnose, then Improve
3. METRICS System Architecture
4. XML Example

    <METRICS_LIST>
      <METRIC PID="134" FID="22" TID="47">
        <NAME>TOTAL_WIRELENGTH</NAME>
        <VALUE>14250347</VALUE>
        <TYPE>INTEGER</TYPE>
        <TIMESTAMP>010312220512</TIMESTAMP>
      </METRIC>
      <METRIC PID="134" FID="22" TID="47">
        <NAME>TOTAL_CPU_TIME</NAME>
        <VALUE>2150.28</VALUE>
        <TYPE>DOUBLE</TYPE>
        <TIMESTAMP>010312220514</TIMESTAMP>
      </METRIC>
    </METRICS_LIST>
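For concreteness, the sketch below shows how a transmitter might serialize one such <METRIC> element. The metricXml helper is hypothetical, written for illustration only, and is not part of the METRICS code base.

    #include <iostream>
    #include <string>

    // Hypothetical helper (not from the METRICS code base): formats one
    // <METRIC> element in the style of the XML example above.
    std::string metricXml(int pid, int fid, int tid,
                          const std::string& name, const std::string& value,
                          const std::string& type, const std::string& timestamp) {
        return "<METRIC PID=\"" + std::to_string(pid) +
               "\" FID=\"" + std::to_string(fid) +
               "\" TID=\"" + std::to_string(tid) + "\">"
               "<NAME>" + name + "</NAME>"
               "<VALUE>" + value + "</VALUE>"
               "<TYPE>" + type + "</TYPE>"
               "<TIMESTAMP>" + timestamp + "</TIMESTAMP></METRIC>";
    }

    int main() {
        std::cout << metricXml(134, 22, 47, "TOTAL_WIRELENGTH", "14250347",
                               "INTEGER", "010312220512") << std::endl;
        return 0;
    }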
5. Transmitter Examples

Wrapper-based transmitter (Perl):

    #!/usr/local/bin/perl -w
    # Wraps an unmodified tool: calls the METRICS helper programs around
    # the tool run and scans its output for metrics.
    $TOOL = $0;
    $PID = `initProject`;
    $FID = `initFlow -pid $PID`;
    $TID = `initToolRun -pid $PID -fid $FID`;
    system "sendMetrics TOOL_NAME $TOOL STRING";
    # ... (run the tool, open its log as IN) ...
    while (<IN>) {
        # ... (parse a NAME/VALUE/TYPE triple from the log line) ...
        system "sendMetrics $NAME $VALUE $TYPE";
    }
    system "terminateToolRun";
    system "terminateFlow -pid $PID -fid $FID";
    system "terminateProject -pid $PID";
    exit 0;

API-based transmitter (C++):

    #include "transmitter.h"
    // The tool itself links against the METRICS transmitter library and
    // reports metrics directly through the Transmitter API.
    int main(int argc, char *argv[]) {
        Transmitter MTR;
        MTR.initProject();
        MTR.initFlow();
        MTR.initToolRun();
        MTR.sendMetrics("TOOL_NAME", argv[0], STRING);
        // ...
        MTR.sendMetrics(Name, Value, Type);  // placeholders for real metrics
        // ...
        MTR.terminateToolRun();
        MTR.terminateFlow();
        MTR.terminateProject();
        return 0;
    }
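The API-based example presumes a transmitter.h header; below is a minimal sketch of what it might declare. The class and method names are taken from the example above, while the metric-type enum, the sendMetrics signature, and the private ID fields are assumptions made for illustration.

    // Hypothetical sketch of transmitter.h, inferred from the example
    // above; the real METRICS library's declarations may differ.
    #ifndef TRANSMITTER_H
    #define TRANSMITTER_H

    #include <string>

    enum MetricType { INTEGER, DOUBLE, STRING };  // types seen in the XML example

    class Transmitter {
    public:
        void initProject();      // obtain a project ID (PID) from the server
        void initFlow();         // obtain a flow ID (FID) under that project
        void initToolRun();      // obtain a tool-run ID (TID) under that flow
        // Send one name/value/type metric record tagged with PID/FID/TID.
        void sendMetrics(const std::string& name, const std::string& value,
                         MetricType type);
        void terminateToolRun();
        void terminateFlow();
        void terminateProject();
    private:
        int pid_ = -1, fid_ = -1, tid_ = -1;  // assumed ID bookkeeping
    };

    #endif  // TRANSMITTER_H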
6. Example Reports

[Report plot: tool CPU_TIME vs. NUM_CELLS, with fitted trend CPU_TIME = 12 + 0.027 * NUM_CELLS; correlation 0.93]
7. METRICS Server

[Server diagram: transmitter servlets receive metric requests and store them in Oracle 8i; reporting servlets generate reports, served through the Apache web server]
8. Open Source Architecture
- METRICS components are industry standards
  - e.g., Oracle 8i, Java servlets, XML, Apache web server, PERL/TCL scripts, etc.
- Custom-generated code for wrappers and APIs is publicly available
  - collaboration in development of wrappers and APIs
  - porting to different operating systems
- Code is available at http://vlsicad.cs.ucla.edu/GSRC/METRICS
9. METRICS Standards
- Standard metrics naming across tools
  - same name implies same meaning, independent of tool supplier
  - generic metrics and tool-specific metrics
  - no more ad hoc, incomparable log files
- Standard schema for the metrics database
- Standard middleware for the database interface
- For the complete current lists, see http://vlsicad.cs.ucla.edu/GSRC/METRICS
10. Generic and Specific Tool Metrics

[Table: partial list of the metrics now being collected in Oracle 8i]
11. Flow Metrics
- Tool metrics alone are not enough
  - A design process consists of more than one tool
  - A given tool can be run multiple times
  - Design quality depends on the design flow and methodology (the order of the tools and the iteration within the flow)
- Flow definition (a sketch encoding this follows the list)
  - Directed graph G = (V, E)
  - V = T ∪ {S, F}
  - T = {T1, T2, T3, ..., Tn} (a set of tasks)
  - S = starting node, F = ending node
  - E = {ES1, E11, E12, ..., Exy} (a set of edges)
  - For an edge Exy:
    - x < y ⇒ forward path
    - x = y ⇒ self-loop
    - x > y ⇒ backward path
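As one concrete reading of this definition, here is a minimal C++ sketch (illustrative, not part of the METRICS standard) that numbers the nodes with S = 0 and F = n+1 and classifies each edge Exy by comparing x and y. The edge set is a plausible reconstruction of the example flow on the next slide.

    #include <iostream>
    #include <string>
    #include <vector>

    // Illustrative encoding: node 0 = S, nodes 1..n = tasks T1..Tn, node n+1 = F.
    struct Edge { int x, y; };

    // Classify an edge Exy per the definition above.
    std::string classify(const Edge& e) {
        if (e.x < e.y)  return "forward path";
        if (e.x == e.y) return "self-loop";
        return "backward path";
    }

    int main() {
        // Plausible edge set for the example flow on the next slide (E24 is
        // inferred from the logged task sequence, in which T3 can be skipped).
        std::vector<Edge> edges = {{0,1}, {1,2}, {2,1}, {2,3}, {2,4},
                                   {3,3}, {3,4}, {4,2}, {4,5}};
        for (const Edge& e : edges)
            std::cout << "E" << e.x << e.y << ": " << classify(e) << "\n";
        return 0;
    }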
12. Flow Example

[Flow graph: S → T1 → T2 → T3 → T4 → F, with backward and self-loop edges and one task marked optional]

Task sequence: T1, T2, T1, T2, T3, T3, T3, T4, T2, T1, T2, T4
13. Flow Tracking

Task sequence: T1, T2, T1, T2, T3, T3, T3, T4, T2, T1, T2, T4
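To make the tracking concrete, here is a hedged sketch (not the actual METRICS tracker) that checks whether a logged task sequence is a valid walk through the flow graph from the previous slide, starting at S and ending at F.

    #include <iostream>
    #include <set>
    #include <utility>
    #include <vector>

    // Illustrative consistency check: every consecutive pair in the logged
    // sequence must be an edge of the flow, and the walk must be able to
    // finish at F. Node numbering: 0 = S, 1..4 = T1..T4, 5 = F.
    bool validWalk(const std::set<std::pair<int, int>>& edges,
                   const std::vector<int>& seq) {
        int prev = 0;  // start at S
        for (int t : seq) {
            if (!edges.count({prev, t})) return false;
            prev = t;
        }
        return edges.count({prev, 5}) > 0;  // last task must reach F
    }

    int main() {
        // Edge set reconstructed from the flow example (E24 inferred from
        // the logged sequence, in which T3 can be skipped).
        std::set<std::pair<int, int>> edges = {
            {0,1}, {1,2}, {2,1}, {2,3}, {2,4}, {3,3}, {3,4}, {4,2}, {4,5}};
        std::vector<int> seq = {1, 2, 1, 2, 3, 3, 3, 4, 2, 1, 2, 4};
        std::cout << (validWalk(edges, seq) ? "consistent" : "inconsistent")
                  << " with the flow" << std::endl;
        return 0;
    }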
14. Chip Design Flow Example
- Simple chip design flow
  - T1: synthesis and technology mapping
  - T2: load wireload model (WLM)
  - T3: pre-placement optimization
  - T4: placement
  - T5: post-placement optimization
  - T6: global routing
  - T7: final routing
  - T8: custom WLM generation
15. Optimization of Incremental Multilevel FM Partitioning
- Motivation: incremental netlist partitioning
  - Given an initial partitioning solution, a CPU budget, and instance perturbations (ΔI)
  - Find the number of parts of the incremental partitioning and the number of starts
- Ti = incremental multilevel FM partitioning
  - Self-loop ⇒ multistart
  - n = number of breakups (ΔI = Δ1 + Δ2 + Δ3 + ... + Δn)
16. Flow Optimization Results
- If (27401 < num_edges ≤ 34826) and (143.09 < cpu_time ≤ 165.28) and (perturbation_delta ≤ 0.1) then num_inc_parts = 4 and num_starts = 3
- If (27401 < num_edges ≤ 34826) and (85.27 < cpu_time ≤ 143.09) and (perturbation_delta ≤ 0.1) then num_inc_parts = 2 and num_starts = 1
- ...
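Transcribed into code, such rules become a simple parameter lookup. The sketch below is illustrative only: the struct and function names are invented, and cases covered by the rules elided on the slide fall through to a sentinel default.

    #include <iostream>

    // Illustrative encoding of the two mined rules shown above.
    struct Knobs { int num_inc_parts, num_starts; };

    Knobs chooseKnobs(int num_edges, double cpu_time, double perturbation_delta) {
        if (27401 < num_edges && num_edges <= 34826 && perturbation_delta <= 0.1) {
            if (143.09 < cpu_time && cpu_time <= 165.28) return {4, 3};
            if (85.27 < cpu_time && cpu_time <= 143.09)  return {2, 1};
        }
        return {-1, -1};  // remaining rules are elided on the slide
    }

    int main() {
        Knobs k = chooseKnobs(30000, 150.0, 0.05);
        std::cout << "num_inc_parts=" << k.num_inc_parts
                  << " num_starts=" << k.num_starts << std::endl;  // 4 and 3
        return 0;
    }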
17. Datamining Integration

[Diagram: datamining tool(s) exchange DM requests and result tables with the METRICS database over the inter-/intranet, through a datamining interface that translates requests into SQL queries and returns the resulting tables]
18. Categories of Data for Datamining (see the sketch after this list)
- Design instances and design parameters
  - attributes and metrics of the design instances
  - e.g., number of gates, target clock frequency, number of metal layers, etc.
- CAD tools and invocation options
  - list of tools and user options that are available
  - e.g., tool version, optimism level, timing-driven option, etc.
- Design solutions and result qualities
  - qualities of the solutions obtained from given tools and design instances
  - e.g., number of timing violations, total tool runtime, layout area, etc.
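As an illustration of how these three categories might line up in a single datamining record, here is a hedged C++ sketch; every struct and field name is invented for this example, not taken from the standard METRICS schema.

    #include <string>

    struct DesignInstance {        // design instances and design parameters
        long   num_gates;
        double target_clock_mhz;
        int    num_metal_layers;
    };

    struct ToolInvocation {        // CAD tools and invocation options
        std::string tool_version;
        int         optimism_level;
        bool        timing_driven;
    };

    struct ResultQuality {         // design solutions and result qualities
        int    num_timing_violations;
        double total_runtime_sec;
        double layout_area_um2;
    };

    struct MiningRecord {          // one row handed to the datamining tool
        DesignInstance instance;
        ToolInvocation invocation;
        ResultQuality  quality;
    };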
19. Possible Usage of Datamining
- Three data categories: design instances and design parameters (I), CAD tools and invocation options (T), design solutions and result qualities (Q)
- Given I and T, estimate the expected quality of Q
  - e.g., runtime predictions, wirelength estimations, etc.
- Given I and Q, find the appropriate setting of T
  - e.g., best value for a specific option, etc.
- Given T and Q, identify the subspace of I that is doable for the tool
  - e.g., category of designs that are suitable for the given tools, etc.
20. DM Results: QPlace CPU Time (transcribed into code below)
- If (num_nets ≤ 7332) then CPU_time = 21.9 + 0.0019 num_cells + 0.0005 num_nets + 0.07 num_pads - 0.0002 num_fixed_cells
- If (num_overlap_layers = 0) and (num_cells ≤ 71413) and (TD_routing_option = false) then CPU_time = -15.6 + 0.0888 num_nets - 0.0559 num_cells - 0.0015 num_fixed_cells - num_routing_layers
- ...
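These formulas transcribe directly into a predictor function. The sketch below is illustrative, with two caveats: the slide's rule list is truncated, and the final term of the second rule is printed without a coefficient (taken here as 1).

    #include <iostream>

    // Illustrative transcription of the two mined QPlace CPU-time rules.
    double predictCpuTime(long num_cells, long num_nets, long num_pads,
                          long num_fixed_cells, long num_overlap_layers,
                          long num_routing_layers, bool td_routing) {
        if (num_nets <= 7332)
            return 21.9 + 0.0019 * num_cells + 0.0005 * num_nets
                 + 0.07 * num_pads - 0.0002 * num_fixed_cells;
        if (num_overlap_layers == 0 && num_cells <= 71413 && !td_routing)
            return -15.6 + 0.0888 * num_nets - 0.0559 * num_cells
                 - 0.0015 * num_fixed_cells - num_routing_layers;
        return -1.0;  // remaining rules are elided on the slide
    }

    int main() {
        // First rule applies (7000 nets <= 7332): prints 48.88
        std::cout << predictCpuTime(5000, 7000, 200, 100, 0, 4, false)
                  << std::endl;
        return 0;
    }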
21. Testbed: Metricized Cadence PKS Flow

[Flow diagram: the Cadence PKS flow, beginning with BuildGates synthesis, instrumented so that each step reports to the METRICS system]
22. NELSIS Flow Manager Integration
23. Issues
- Tool interface: each tool has a unique interface
- Security: proprietary and confidential information
- Standardization: flow, terminology, data management, etc.
- Cost of metrics collection: how much data is too much?
- Other non-EDA tools: LSF, license manager, etc.
- Social: "big brother", collection of social metrics, etc.
- Bug detection: report the configurations that trigger the bugs, etc.
24. Conclusions
- Metrics collection should be automatic and transparent
  - The API-based transmitter is the best approach
- Ongoing work with the EDA and designer communities to identify tool metrics of interest
  - users: metrics needed for design process insight and optimization
  - vendors: implementation of the requested metrics, with standardized naming / semantics