Process Measurement - PowerPoint PPT Presentation

Slides: 37
Provided by: james945

Transcript and Presenter's Notes
1
Chapter 4
  • Process Measurement

2
Process Metrics
  • Measurement - the act of quantifying the
    performance dimensions of products, services,
    processes, and other business activities
  • Measures and indicators - numerical information
    that results from measurement
  • Defects/unit
  • Errors/opportunity
  • Defects per million opportunities (dpmo)
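The three metrics above are simple ratios. A minimal sketch of how they relate, using assumed example counts (24 defects, 1,200 units, 10 opportunities per unit):

```python
# Sketch: per-unit metrics and dpmo (all counts are assumed examples).
defects = 24        # defects found in the sample
units = 1200        # units inspected
opportunities = 10  # defect opportunities per unit

defects_per_unit = defects / units
errors_per_opportunity = defects / (units * opportunities)
dpmo = errors_per_opportunity * 1_000_000  # defects per million opportunities

print(defects_per_unit)        # 0.02
print(dpmo)                    # 2000.0
```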

3
Types of Metrics
  • Discrete metric - something that is counted,
    such as the number of defects or errors
  • Continuous metric - something measured on a
    continuous scale, concerned with the degree of
    conformance to specifications

4
Effective Metrics
  • SMART
  • simple,
  • measurable,
  • actionable (they provide a basis for
    decision-making),
  • related (to customer requirements and to each
    other), and
  • timely.

5
Identifying and Selecting Process Metrics
  • Identify all customers and their requirements and
    expectations
  • Define work processes
  • Define value-adding activities and process
    outputs
  • Develop measures for each key process
  • Evaluate measures for their usefulness

6
Data Collection
  • Key Questions
  • What questions are we trying to answer?
  • What type of data will we need to answer the
    question?
  • Where can we find the data?
  • Who can provide the data?
  • How can we collect the data with minimum effort
    and with minimum chance of error?

7
Operational Definition
  • A clear and unambiguous definition of how a
    metric is measured, e.g.
  • On-time delivery
  • Error
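An operational definition can be stated precisely enough to execute. A minimal sketch, assuming one possible rule for "on time" (a delivery counts as on time if it arrives on or before the promised date):

```python
from datetime import date

# Hypothetical operational definition of "on-time delivery":
# delivered on or before the promised date. The cutoff rule is an
# assumption for illustration, not a standard.
def on_time(promised: date, delivered: date) -> bool:
    return delivered <= promised

print(on_time(date(2024, 3, 1), date(2024, 3, 1)))  # True
print(on_time(date(2024, 3, 1), date(2024, 3, 2)))  # False
```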

8
Process Capability
  • The range over which the natural variation of a
    process occurs as determined by the system of
    common causes
  • Measured by the proportion of output that can be
    produced within design specifications

9
Process Capability Study
  1. Choose a representative machine or process
  2. Define the process conditions
  3. Select a representative operator
  4. Provide the right materials
  5. Specify the gauging or measurement method
  6. Record the measurements
  7. Construct a histogram and compute descriptive
    statistics mean and standard deviation
  8. Compare results with specified tolerances
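Steps 7 and 8 of the study can be sketched as follows; the measurements and tolerance limits are assumed example values:

```python
import statistics

# Sketch of steps 7-8: descriptive statistics on the recorded
# measurements, compared with specified tolerances (data assumed).
measurements = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]
LTL, UTL = 9.90, 10.10  # assumed lower/upper tolerance limits

mean = statistics.mean(measurements)
stdev = statistics.stdev(measurements)  # sample standard deviation

# Proportion of output produced within design specifications
within_spec = sum(LTL <= x <= UTL for x in measurements) / len(measurements)
print(f"mean={mean:.4f}, stdev={stdev:.4f}, within spec={within_spec:.0%}")
```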

10
Process Capability
11
Process Capability Index
The process capability index, Cp (sometimes
called the process potential index), is defined
as the ratio of the specification width to the
natural tolerance of the process. Cp relates the
natural variation of the process with the design
specifications in a single, quantitative measure.
12
Calculating Process Capability Indexes
  Cp = (UTL - LTL) / 6σ
  Cpu = (UTL - μ) / 3σ
  Cpl = (μ - LTL) / 3σ
  Cpk = min(Cpl, Cpu)
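These index calculations translate directly into code. A sketch with assumed specification limits and process parameters:

```python
# Capability indexes with assumed values: specification limits
# UTL/LTL and process mean/standard deviation are illustrative only.
UTL, LTL = 10.10, 9.90   # upper/lower tolerance limits (assumed)
mu, sigma = 10.02, 0.02  # process mean and std. deviation (assumed)

Cp = (UTL - LTL) / (6 * sigma)   # process potential index
Cpu = (UTL - mu) / (3 * sigma)   # one-sided index, upper limit
Cpl = (mu - LTL) / (3 * sigma)   # one-sided index, lower limit
Cpk = min(Cpu, Cpl)              # accounts for process centering

print(Cp, Cpu, Cpl, Cpk)
```

Because the assumed process is off-center (μ above the midpoint), Cpk is smaller than Cp.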
14
Types of Capability Studies
  • Peak performance study - how a process performs
    under ideal conditions
  • Process characterization study - how a process
    performs under actual operating conditions
  • Component variability study - relative
    contribution of different sources of variation
    (e.g., process factors, measurement system)

15
Spreadsheet Template
16
Dashboards and Scorecards
  • Dashboard - a collection of key operational
    measures
  • Graphs, charts, visual aids
  • Daily information for management and control
  • Balanced scorecard - a summary of broad
    performance measures across the organization
  • Strategic guidance

17
Check Sheets
Check sheets are special types of data collection
forms in which the results may be interpreted on
the form directly without additional processing.
18
Check Sheet
  • Creates easy-to-understand data
  • Builds, with each observation, a clearer picture
    of the facts
  • Forces agreement on the definition of each
    condition or event of interest
  • Makes patterns in the data become obvious quickly

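At its core a check sheet is a tally of observed conditions, which is easy to sketch in code (the defect categories and observations below are assumed for illustration):

```python
from collections import Counter

# Sketch: a check sheet as a running tally of observed conditions.
# Categories and observations are assumed example data.
observations = ["scratch", "dent", "scratch", "misalignment",
                "scratch", "dent", "scratch"]
check_sheet = Counter(observations)

# Print a tally so patterns in the data become obvious quickly.
for condition, count in check_sheet.most_common():
    print(f"{condition:15s} {'x' * count}  ({count})")
```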
19
Sampling
  • What is the objective of the study?
  • What type of sample should be used?
  • What possible error might result from sampling?
  • What will the study cost?

20
Sampling Methods
  • Simple random sampling
  • Stratified sampling
  • Systematic sampling
  • Cluster sampling
  • Judgment sampling
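Two of the methods listed can be sketched briefly; the population of 100 numbered items is an assumed example:

```python
import random

# Sketch of two sampling methods (population is an assumed example).
population = list(range(1, 101))  # e.g., 100 numbered invoices

random.seed(42)  # fixed seed so the sketch is reproducible

# Simple random sampling: every item has an equal selection chance.
simple_random = random.sample(population, k=10)

# Systematic sampling: every k-th item after a random start.
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k]

print(simple_random)
print(systematic)
```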

21
Selecting a Sampling Plan
A good sampling plan should select a sample at
the lowest cost that will provide the best
possible representation of the population,
consistent with the objectives of precision and
reliability that have been determined for the
study.
22
Sampling Error
  • Sampling error (statistical error)
  • Nonsampling error (systematic error)
  • Factors to consider
  • Sample size
  • Appropriate sample design

23
Data Classification
  • Type of data
  • Cross-sectional - data collected over a single
    period of time
  • Time series - data collected over time
  • Number of variables
  • Univariate - data consisting of a single
    variable
  • Multivariate - data consisting of two or more
    (often related) variables

24
Sample Statistics
25
Excel Tools for Descriptive Statistics
  • Tools > Data Analysis > Descriptive Statistics
  • Tools > Data Analysis > Histogram
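The same summary that Excel's Descriptive Statistics tool produces can be computed directly. A minimal sketch with assumed data values:

```python
import statistics

# Sketch: the core descriptive statistics reported by Excel's
# Data Analysis tool, computed directly (data values are assumed).
data = [4.2, 4.5, 4.1, 4.8, 4.3, 4.6, 4.4, 4.2]

summary = {
    "mean": statistics.mean(data),
    "median": statistics.median(data),
    "stdev": statistics.stdev(data),  # sample standard deviation
    "min": min(data),
    "max": max(data),
    "count": len(data),
}
for name, value in summary.items():
    print(f"{name}: {value}")
```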

26
Measurement System Evaluation
  • Whenever variation is observed in measurements,
    some portion is due to measurement system error.
    Some errors are systematic (called bias); others
    are random. The size of the errors relative to
    the measurement value can significantly affect
    the quality of the data and the resulting
    decisions.

27
Metrology - Science of Measurement
  • Accuracy - closeness of agreement between an
    observed value and a standard
  • Precision - closeness of agreement between
    randomly selected individual measurements

28
Repeatability and Reproducibility
  • Repeatability (equipment variation) - variation
    in multiple measurements by an individual using
    the same instrument
  • Reproducibility (operator variation) - variation
    in measurements made with the same instrument by
    different individuals

29
Repeatability and Reproducibility (R&R) Studies
  • Quantify and evaluate the capability of a
    measurement system
  • Select m operators and n parts
  • Calibrate the measuring instrument
  • Randomly measure each part by each operator for r
    trials
  • Compute key statistics to quantify repeatability
    and reproducibility
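The final step can be illustrated with a deliberately simplified variance sketch. A real R&R study uses range-based or ANOVA methods; here repeatability is taken as the average within-operator spread and reproducibility as the spread of operator averages, on assumed data (m = 3 operators, r = 3 trials on the same part):

```python
import statistics

# Simplified R&R sketch (a real study uses range-based or ANOVA
# methods). Each operator measures the same part r = 3 times;
# all measurement values are assumed for illustration.
trials = {
    "operator_A": [10.01, 10.02, 10.00],
    "operator_B": [10.05, 10.06, 10.04],
    "operator_C": [10.02, 10.03, 10.01],
}

# Repeatability (equipment variation): spread within each operator.
within = [statistics.stdev(m) for m in trials.values()]
repeatability = statistics.mean(within)

# Reproducibility (operator variation): spread of operator averages.
operator_means = [statistics.mean(m) for m in trials.values()]
reproducibility = statistics.stdev(operator_means)

print(f"repeatability  ~ {repeatability:.4f}")
print(f"reproducibility ~ {reproducibility:.4f}")
```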

30
Spreadsheet Template
31
R&R Evaluation
  • Under 10% error - OK
  • 10-30% error - may be OK
  • Over 30% error - unacceptable

32
Calibration
One of the most important functions of metrology
is calibration - the comparison of a measurement
device or system having a known relationship to
national standards against another device or
system whose relationship to national standards
is unknown.
33
Benchmarking
  • Benchmarking - the search for industry best
    practices that lead to superior performance
  • Best practices - approaches that produce
    exceptional results, are usually innovative in
    terms of the use of technology or human
    resources, and are recognized by customers or
    industry experts

34
Types of Benchmarking
  • Competitive benchmarking - studying products,
    processes, or business performance of competitors
    in the same industry to compare pricing,
    technical quality, features, and other quality or
    performance characteristics of products and
    services.
  • Process benchmarking - focus on key work
    processes
  • Strategic benchmarking - focus on how companies
    compete and the strategies that lead to
    competitive advantage

35
Project Review Measure (1 of 2)
  • Team members have received any necessary
    just-in-time training
  • Key metrics for all CTQ characteristics have been
    defined
  • The team has determined what aspects of the
    problem need to be measured, including both
    process and results measures
  • Operational definitions of all measurements have
    been developed
  • All appropriate sources of data have been
    investigated, and a data collection plan
    established before data is collected

36
Project Review Measure (2 of 2)
  • Data collection forms have been tested and
    validated.
  • Sample sizes required for statistical precision
    have been identified.
  • Data have been collected in an appropriate
    fashion, according to plan
  • The data are accurate and reliable
  • Measurement systems have been evaluated using RR
    studies or other appropriate tools
  • Process capability has been addressed as
    appropriate
  • Benchmarks and best practice information has been
    collected