Transcript and Presenter's Notes

Title: A System for Automatic Recording and Prediction of Design Quality Metrics


1
A System for Automatic Recording and Prediction
of Design Quality Metrics
  • Andrew B. Kahng and Stefanus Mantik
  • UCSD CSE and ECE Depts., La Jolla, CA
  • UCLA CS Dept., Los Angeles, CA

2
Introduction
  • The time-to-market window is shrinking rapidly
  • Product quality and the design process must
    continually improve
  • Currently, there are no standards or
    infrastructure for measuring and recording the
    semiconductor design process
  • METRICS provides:
  • a standard infrastructure for the collection and
    storage of design process information
  • a standard list of design metrics and process
    metrics
  • analyses and reports that are useful for design
    process optimization
  • METRICS enables: Collect, Data-Mine, Measure,
    Diagnose, then Improve

3
Related Work
  • Enterprise- and project-level metrics (Numetrics
    Complexity Unit): Numetrics Management Systems'
    DPMS
  • Other in-house data collection systems
  • e.g., TI (DAC 96 BOF), OxSigen LLC (Siemens
    Semiconductor)
  • Design process management: [Jacome93],
    [Brockman92], [Johnson96]
  • Web-based design support
  • IPSymphony, WELD, VELA, etc.
  • Continuous process improvement
  • Data mining and visualization

4
Outline
  • METRICS system architecture and standards
  • METRICS for design flow
  • Flow METRICS experiments
  • METRICS integration with datamining
  • Datamining integration experiments
  • Issues and conclusions

5
METRICS System Architecture
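The architecture figure is not reproduced in this transcript. Slide 30 notes that the prototype collects metrics through tool wrappers and records them with XML, a Java Servlet, and Oracle8i; a minimal Python sketch of such a reporting path follows (the endpoint URL, XML layout, and metric names are hypothetical illustrations, not the actual METRICS wire format).

# Sketch of a tool-side metrics transmitter, assuming the deck's
# XML + Java Servlet + Oracle8i setup (slide 30). Endpoint URL and
# metric names are hypothetical.
import urllib.request
from xml.etree import ElementTree as ET

def send_metrics(tool: str, run_id: str, metrics: dict) -> int:
    """Package one tool run's metrics as XML and POST it to the collection servlet."""
    root = ET.Element("metrics", attrib={"tool": tool, "run": run_id})
    for name, value in metrics.items():
        ET.SubElement(root, "metric", attrib={"name": name, "value": str(value)})
    payload = ET.tostring(root, encoding="utf-8")
    req = urllib.request.Request(
        "http://metrics.example.edu/servlet/collect",   # hypothetical collection servlet
        data=payload,
        headers={"Content-Type": "text/xml"},
    )
    with urllib.request.urlopen(req) as resp:           # servlet inserts rows into the database
        return resp.status

# Example: a placement wrapper reporting two generic tool metrics.
# send_metrics("QPlace", "run-042", {"cpu_time_sec": 143.1, "num_cells": 71413})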
6
Generic and Specific Tool Metrics
Partial list of metrics now being collected in
Oracle8i
7
Outline
  • METRICS system architecture and standards
  • METRICS for design flow
  • Flow METRICS experiments
  • METRICS integration with datamining
  • Datamining integration experiments
  • Issues and conclusions

8
Flow Metrics
  • Tool metrics alone are not enough
  • Design process consists of more than one tool
  • A given tool can be run multiple times
  • Design quality depends on the design flow and
    methodology (the order of the tools and the
    iteration within the flow)
  • Flow definition (sketched in code below)
  • Directed graph G = (V, E)
  • V = T ∪ {S, F}
  • T = {T1, T2, T3, ..., Tn} (the set of tasks)
  • S = starting node, F = ending node
  • E = {Es1, E11, E12, ..., Exy} (the set of edges)
  • For an edge Exy:
  • x < y ⇒ forward path
  • x = y ⇒ self-loop
  • x > y ⇒ backward path
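A minimal Python sketch of this edge classification (illustrative, not code from the original deck): tasks are indexed T1..Tn and an edge Exy is typed by comparing the indices x and y.

def edge_kind(x: int, y: int) -> str:
    """Classify edge E_xy between tasks Tx and Ty by task index."""
    if x < y:
        return "forward path"
    if x == y:
        return "self-loop"
    return "backward path"

# edge_kind(1, 2) -> "forward path"; edge_kind(3, 3) -> "self-loop";
# edge_kind(4, 2) -> "backward path"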

9
Flow Example
(Flow graph with nodes S, T1, T2, T3, T4, and F; T4 is an optional task)
Task sequence: T1, T2, T1, T2, T3, T3, T3, T4,
T2, T1, T2, T4
10
Flow Tracking
Task sequence: T1, T2, T1, T2, T3, T3, T3, T4,
T2, T1, T2, T4 (tracking sketched below)
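A small Python sketch of flow tracking under the graph definition of slide 8 (an assumed interpretation, not code from the deck): walk the recorded task sequence and tally the forward, self-loop, and backward transitions it takes.

from collections import Counter

def track(sequence):
    """Count forward / self-loop / backward transitions in a task sequence
    such as ["T1", "T2", "T1", ...], where task names are "T<index>"."""
    counts = Counter()
    for a, b in zip(sequence, sequence[1:]):
        x, y = int(a[1:]), int(b[1:])
        if x < y:
            counts["forward"] += 1
        elif x == y:
            counts["self-loop"] += 1
        else:
            counts["backward"] += 1
    return counts

seq = ["T1", "T2", "T1", "T2", "T3", "T3", "T3", "T4", "T2", "T1", "T2", "T4"]
print(track(seq))   # Counter({'forward': 6, 'backward': 3, 'self-loop': 2})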
11
NELSIS Flow Manager Integration
  • Flow managed by NELSIS

12
Optimization of Incremental Multilevel FM
Partitioning
  • Motivation: incremental netlist partitioning
  • netlist ECOs are made; we want top-down placement
    to remain similar to the previous result
  • good approach [CaldwellKM00]: V-cycling based
    multilevel Fiduccia-Mattheyses
  • what is the best tuning of the approach for a
    given instance?
  • break up the ECO perturbation into multiple
    smaller perturbations?
  • how many starts of the partitioner?
  • within a specified CPU budget?

13
Optimization of Incremental Multilevel FM
Partitioning (contd.)
  • Given: initial partitioning solution, CPU budget,
    and instance perturbations (ΔI)
  • Find: number of parts of incremental partitioning
    and number of starts
  • Ti = incremental multilevel FM partitioning
  • Self-loop ⇒ multistart
  • n = number of breakups (ΔI = Δ1 + Δ2 + Δ3 + ...
    + Δn)

14
Multilevel FM Experiment Flow Setup
  foreach testcase
    foreach ΔI
      foreach CPU budget
        foreach breakup
          Icurrent = Iinitial
          Scurrent = Sinitial
          for i = 1 to n
            Inext = Icurrent + Δi
            run incremental multilevel FM partitioner
              on Inext to produce Snext
            if CPUcurrent > CPUbudget then break
            Icurrent = Inext
            Scurrent = Snext
  end
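A runnable Python skeleton of this loop (a sketch only: the testcase fields and the split / apply_delta / partition hooks are hypothetical stand-ins for the ECO splitter and the incremental multilevel FM partitioner).

import time

def run_experiment(testcases, deltas, cpu_budgets, breakups,
                   split, apply_delta, partition):
    """Sweep the flow parameters of slide 14.

    split(delta_I, n) yields n smaller perturbations, apply_delta applies one
    to an instance, and partition(instance, solution) wraps the incremental
    multilevel FM partitioner -- all hypothetical hooks for the real tools."""
    for testcase in testcases:
        for delta_I in deltas:                      # total instance perturbation
            for cpu_budget in cpu_budgets:
                for n in breakups:                  # number of smaller perturbations
                    instance = testcase["initial_instance"]
                    solution = testcase["initial_solution"]
                    start = time.process_time()
                    for delta_i in split(delta_I, n):          # ΔI = Δ1 + ... + Δn
                        next_instance = apply_delta(instance, delta_i)
                        solution = partition(next_instance, solution)
                        if time.process_time() - start > cpu_budget:
                            break
                        instance = next_instance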

15
Flow Optimization Results
  • If (27401 < num edges ≤ 34826) and (143.09 < CPU
    time ≤ 165.28) and (perturbation delta ≤ 0.1)
    then num_inc_parts = 4 and num_starts = 3
  • If (27401 < num edges ≤ 34826) and (85.27 < CPU
    time ≤ 143.09) and (perturbation delta ≤ 0.1)
    then num_inc_parts = 2 and num_starts = 1
  • ...
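Written out as code, the mined rules act as a simple parameter chooser; a Python sketch (thresholds copied from the slide, the elided rules left out):

def choose_flow_params(num_edges: int, cpu_time: float, perturbation_delta: float):
    """Return (num_inc_parts, num_starts) suggested by the mined rules, or None."""
    if 27401 < num_edges <= 34826 and perturbation_delta <= 0.1:
        if 143.09 < cpu_time <= 165.28:
            return 4, 3
        if 85.27 < cpu_time <= 143.09:
            return 2, 1
    return None  # the remaining rules are not shown on the slide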

16
Identifying the Effect of Wire Load Model
  • Wire load model (WLM) is used for pre-layout
    estimation of wire delays
  • Three different WLMs
  • statistical WLM
  • structural WLM
  • custom WLM
  • Motivation
  • identify whether WLMs are useful for estimation
  • identify whether WLMs are necessary for
    optimization
  • identify the best role for WLMs

17
Wire Load Model Flow
  • WLM flows for finding the appropriate role of the
    WLM
  • T1: synthesis and technology mapping
  • T2: load wire load model (WLM)
  • T3: pre-placement optimization
  • T4: placement
  • T5: post-placement optimization
  • T6: global routing
  • T7: final routing
  • T8: custom WLM generation

18
WLM Experiment Setup
  foreach testcase
    foreach WLM (statistical, structural, custom,
        and no WLM)
      foreach flow variant
        run PKS flow
        if WLM = structural then
          generate custom WLM
  end

6 different flow variants
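A Python skeleton of this sweep (run_pks_flow and generate_custom_wlm are hypothetical hooks standing in for the actual PKS flow and the custom-WLM generation step T8, not real tool APIs):

WLM_CHOICES = ["statistical", "structural", "custom", None]   # None = no WLM

def run_wlm_experiment(testcases, flow_variants, run_pks_flow, generate_custom_wlm):
    for testcase in testcases:
        for wlm in WLM_CHOICES:
            for variant in flow_variants:           # 6 flow variants in the experiment
                result = run_pks_flow(testcase, wlm=wlm, variant=variant)
                if wlm == "structural":
                    # the structural run's data feeds the "custom" WLM runs
                    generate_custom_wlm(testcase, result)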
19
WLM Flow Results
Slack comparison for 6 flow variants
  • Post-placement and pre-placement optimizations
    are important steps
  • Choice of WLM depends on the design
  • WLMs are still useful

20
Outline
  • METRICS system architecture and standards
  • METRICS for design flow
  • Flow METRICS experiments
  • METRICS integration with datamining
  • Datamining integration experiments
  • Issues and conclusions

21
Datamining Integration
(Architecture diagram: datamining requests arrive over the inter-/intranet; a
datamining interface exchanges SQL queries and result tables with the METRICS
database and passes tables to the datamining tool(s))
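One plausible reading of this interface as code (a sketch only: the table and column names are invented, sqlite3 stands in for the prototype's Oracle8i connection, and the flat-file hand-off to the datamining tool is an assumption):

import csv
import sqlite3

def export_training_table(db_path: str, out_csv: str) -> None:
    """Pull a metrics table out of the database and write it for the DM tool."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT num_cells, num_nets, num_pads, cpu_time FROM tool_runs"
    ).fetchall()
    con.close()
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["num_cells", "num_nets", "num_pads", "cpu_time"])
        writer.writerows(rows)
    # the datamining tool is then run on out_csv, and its rules / predictions
    # are written back to the database as result tables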
22
Categories of Data for DataMining
  • Design instances and design parameters
  • attributes and metrics of the design instances
  • e.g., number of gates, target clock frequency,
    number of metal layers, etc.
  • CAD tools and invocation options
  • list of tools and user options that are available
  • e.g., tool version, optimism level, timing driven
    option, etc.
  • Design solutions and result qualities
  • qualities of the solutions obtained from given
    tools and design instances
  • e.g., number of timing violations, total tool
    runtime, layout area, etc.

23
Possible Usage of DataMining
  • Design instances and design parameters
  • CAD tools and invocation options
  • Design solutions and result qualities
  • Given the design instance and the tool options,
    estimate the expected solution quality
  • e.g., runtime predictions, wirelength
    estimations, etc.
  • Given the design instance and the target solution
    quality, find the appropriate tool settings
  • e.g., best value for a specific option, etc.
  • Given the tool options and the target quality,
    identify the subspace of design instances that is
    doable for the tool
  • e.g., category of designs that are suitable for
    the given tools, etc.

24
Example Applications with DM
  • Parameter sensitivity analysis
  • input parameters that have the most impact on
    results
  • Field of use analysis
  • limits at which the tool will break
  • tool sweet spots at which the tool will give best
    results
  • Process monitoring
  • identify possible failure in the process (e.g.,
    timing constraints are too tight, row utilization
    is too high, etc.)
  • Resource monitoring
  • analysis of resource demands (e.g., disk space,
    memory, etc.)

25
DM Results QPlace CPU Time
  • If (num nets ≤ 7332) then CPU time = 21.9
    + 0.0019 × num cells + 0.0005 × num nets
    + 0.07 × num pads - 0.0002 × num fixed cells
  • If (num overlap layers = 0) and (num cells ≤
    71413) and (TD routing option = false) then CPU
    time = -15.6 + 0.0888 × num nets - 0.0559 × num
    cells - 0.0015 × num fixed cells - num routing layers
  • ...
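These rules transcribe directly into a predictor; a Python sketch (the operators elided on the slide are assumed to be additions, and the trailing num-routing-layers term is kept with unit weight as printed):

def predict_qplace_cpu_time(num_cells, num_nets, num_pads, num_fixed_cells,
                            num_overlap_layers, num_routing_layers, td_routing):
    """Apply the two mined rules shown on the slide; other rules are elided."""
    if num_nets <= 7332:
        return (21.9 + 0.0019 * num_cells + 0.0005 * num_nets
                + 0.07 * num_pads - 0.0002 * num_fixed_cells)
    if num_overlap_layers == 0 and num_cells <= 71413 and not td_routing:
        return (-15.6 + 0.0888 * num_nets - 0.0559 * num_cells
                - 0.0015 * num_fixed_cells - num_routing_layers)
    return None  # the remaining rules are not shown on the slide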

26
Selection for Training and Test Sets
  • Random Case
  • randomly select the runs assigned to the training
    set
  • leave all remaining (unselected) runs for the
    test set
  • Distinct Case
  • split the test cases into two distinct sets, the
    training set and the test set
  • assign the runs accordingly
  • Representative Case
  • split the test cases into two distinct sets and
    assign the runs accordingly
  • for each test case in the test set, move exactly
    one run to the training set
  • i.e., for each test case, there is at least one
    representative run in the training set
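A Python sketch of the three selection schemes (assumptions: runs are (testcase, run_id) pairs and the splits are roughly 50/50; neither detail is specified on the slide):

import random

def random_split(runs, train_fraction=0.5):
    """Random Case: assign runs to the training set at random."""
    runs = list(runs)
    random.shuffle(runs)
    k = int(len(runs) * train_fraction)
    return runs[:k], runs[k:]                       # (training set, test set)

def distinct_split(runs):
    """Distinct Case: split the test cases themselves, then assign runs."""
    cases = sorted({tc for tc, _ in runs})
    random.shuffle(cases)
    train_cases = set(cases[:len(cases) // 2])
    train = [r for r in runs if r[0] in train_cases]
    test = [r for r in runs if r[0] not in train_cases]
    return train, test

def representative_split(runs):
    """Representative Case: as Distinct, then move one run per test-set case."""
    train, test = distinct_split(runs)
    picked = {}
    for r in test:
        picked.setdefault(r[0], r)                  # first run seen for each test case
    representatives = set(picked.values())
    train += list(representatives)
    test = [r for r in test if r not in representatives]
    return train, test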

27
Prediction Result Variances
(Charts: variance of predicted QPlace wirelength for the Random, Distinct, and
Representative cases)
28
CTGen Results
(Charts: max insertion delay (ns), min insertion delay (ns), and max skew)
29
Outline
  • METRICS system architecture and standards
  • METRICS for design flow
  • Flow METRICS experiments
  • METRICS integration with datamining
  • Datamining integration experiments
  • Issues and conclusions

30
Conclusions
  • Extensions to the current METRICS system are
    presented
  • A complete prototype of the METRICS system is
    working at UCLA with Oracle8i, Java Servlet and
    XML (other working prototypes are installed at
    Intel and Cadence)
  • METRICS wrappers exist for Cadence, Synopsys and
    UCLA tools and flows
  • The METRICS system is integrated with the Cubist
    datamining tool and the NELSIS flow manager
  • A complete METRICS system can be installed on a
    laptop and configured to work behind firewalls

31
Issues and Ongoing Work
  • Issues for METRICS constituencies to solve
  • security: proprietary and confidential
    information
  • standardization: flow, terminology, data
    management, etc.
  • social: "big brother", collection of social
    metrics, etc.
  • Ongoing work with EDA and designer communities to
    identify tool metrics of interest
  • users: metrics needed for design process
    insight and optimization
  • vendors: implementation of the requested metrics,
    with standardized naming / semantics

32
Thank You
http://vlsicad.cs.ucla.edu/GSRC/METRICS