Better Quality Through Better Measurement - PowerPoint PPT Presentation

1
Better Quality Through Better Measurement
  • Prepared for the
  • 2008 International Forum on Quality and Safety
  • Prepared & Presented by
  • Robert Lloyd, Ph.D.
  • Susanna Shouls
  • Dylan Williams
  • Paris, April 23, 2008

2
Faculty
Robert Lloyd, Ph.D. (rlloyd_at_ihi.org)
Executive Director of Performance Improvement, Institute for Healthcare Improvement, USA
Dylan Williams, BSc (Dylan.Williams_at_cd-tr.wales.nhs.uk)
Head of Information Management & Technology Department, Conwy & Denbighshire Trust, Rhyl, Wales
Susanna Shouls, MSc (susanna.shouls_at_gstt.nhs.uk)
End of Life Care Programme Manager, Lambeth & Southwark Health Economies, NHS
3
Purpose
To provide an overview of key measurement
strategies and tactics needed to navigate
successfully in the turbulent sea of data.
4
Discussion Topics
  • Distinguish between data for research, judgment
    and improvement.
  • Identify and build useful measures and start to
    build organizational dashboards
  • Understand variation conceptually and
    statistically
  • Apply run and control charts to selected
    measures
  • Integrate measurement skills into an overall
    quality improvement strategy

5
Understanding the Messiness of Life
Is life this simple? X → Y
(If only it was this simple!)
6
No, it looks more like this
In this model there are numerous direct effects
between the independent variables (the Xs)
and the dependent variable (Y).
[Path diagram: independent variables X1-X5, measured at Times 1-3, each with a direct effect on the dependent (outcome) variable Y]
7
Actually, it looks like this
In this case, there are numerous direct and
indirect effects between the independent
variables and the dependent variable. For
example, X1 and X4 both have direct effects on Y,
plus there is an indirect effect due to the
interaction of X1 and X4 conjointly on Y.
Key reference on causal modeling: H. M. Blalock, Jr.
(ed.), Causal Models in the Social Sciences,
Aldine Publishing Co., 1999.
[Path diagram: X1-X5 at Times 1-3 with direct and indirect effects on Y; each variable carries a residual term R1-R5, and Y carries RY]
R = residuals or error terms representing the effects of variables omitted from the model.
8
"Any idiot can face a crisis. It's this day-to-day
living that wears you out!" Anton Chekhov
9
The Quality Pioneers
Walter Shewhart (1891-1967)
Joseph Juran (1904- )
W. Edwards Deming (1900-1993)
10
Dr. Walter Shewhart
"Both pure and applied science have gradually
pushed further and further the requirements for
accuracy and precision. However, applied science
is even more exacting than pure science in
certain matters of accuracy and precision."
11
The messiness of life requires applied science.
[Path diagram repeated: X1-X5 with residuals R1-R5 and RY, direct and indirect effects on Y across Times 1-3]
12
And, you need to enjoy the messiness of life!
I REALLY do enjoy the messiness of life! Don't
you?
13
Why are you measuring?
Research?
Judgment?
Improvement?
The answer to this question will guide your
entire quality measurement journey!
14
The Three Faces of Performance Measurement:
Improvement, Accountability and Research,
by Leif Solberg, Gordon Mosser and
Sharon McDonald, Journal on Quality Improvement,
vol. 23, no. 3 (March 1997), 135-147.
  • We are increasingly realizing not only how
    critical measurement is to the quality
    improvement we seek but also how
    counterproductive it can be to mix measurement
    for accountability or research with measurement
    for improvement.

15
The Three Faces of Performance Measurement
16
Control Chart (p-chart 11552): Vaginal Birth
After Cesarean Section (VBAC) Rate
These data points are all common cause variation.
[Chart annotations: Data for Improvement; Data for Judgment]
17
Do you have a plan to guide your quality
measurement journey?
18
The Quality Measurement Journey
AIM (Why are you measuring?)
Concept
Measure
Operational Definitions
Data Collection Plan
Data Collection
Analysis
ACTION
19
The Quality Measurement Journey
AIM: freedom from harm
Concept: reduce patient falls
Measure: IP falls rate (falls per 1,000 patient days)
Operational Definitions: falls / inpatient days
Data Collection Plan: monthly; no sampling; all IP units
Data Collection: unit submits data to RM; RM assembles and sends to QM for analysis
Analysis: control chart
Tests of Change
20
The Quality Measurement Journey
AIM (Why are you measuring?)
Concept
Measure
Operational Definitions
Data Collection Plan
Data Collection
Analysis
ACTION
21
Three Types of Measures
  • Outcome Measures Voice of the customer or
    patient. How is the system performing? What is
    the result?
  • Process Measures Voice of the workings of the
    system. Are the parts/steps in the system
    performing as planned?
  • Balancing Measures Looking at a system from
    different directions/dimensions. What happened to
    the system as we improved the outcome and process
    measures? (e.g. unanticipated consequences, other
    factors influencing outcome)

22
Potential Set of Measures for Improvement in the
A&E
23
Balancing Measures Looking at the System from
Different Dimensions
  • Outcome (quality, time)
  • Transaction (volume, no. of patients)
  • Productivity (cycle time, efficiency,
    utilization, flow, capacity, demand)
  • Financial (charges, staff hours, materials)
  • Appropriateness (validity, usefulness)
  • Patient satisfaction (surveys, customer
    complaints)
  • Staff satisfaction

24
Every concept can have many measures
  • Concept → Potential Measures
  • Patient Falls Prevention: percent falls; fall rate; number of falls
  • C-Sections: percent C-sections; number of C-sections; C-section rate
  • Employee Evaluations: percent of evaluations completed on time; number of evaluations completed; variance from due date

25
Exercise Identifying Measures
For each of the following concepts identify 2 - 3
measures and then indicate if the measure is an
Outcome, Process or Balancing Measure.
26
Operational Definitions
"Would you tell me, please, which way I ought to
go from here?" asked Alice. "That depends a good
deal on where you want to get to," said the
Cat. "I don't much care where..." said
Alice. "Then it doesn't matter which way you
go," said the Cat. From Alice in Wonderland,
Brimax Books, London, 1990.
[Cartoon signposts: This Way; Try This Way; No, This Way]
27
An Operational Definition...
  • is a description, in quantifiable terms, of
    what to measure and the steps to follow to
    measure it consistently
  • gives communicable meaning to a concept
  • is clear and unambiguous
  • specifies measurement methods and equipment
  • identifies criteria

28
Failure to develop a clear Operational Definition
often leads to confusion and misunderstanding
How do you define these concepts?
A fair tax A tax loophole The rich The poor The
middle class A wonderful restaurant
A good vacation A great movie A successful
marriage Rural, Urban or Suburban Global
warming Open trade agreements
29
The 9 planet operational definition
Xena
The 12 planet operational definition
30
The 8 planet operational definition
31
How do you define the following?
  • Reliable data abstraction
  • Staff productivity
  • Significant improvement
  • Timely technical assistance
  • Patient & family satisfaction
  • Breakthrough
  • A culture of safety
  • A patient complaint
  • Surgery start time
  • A medication error
  • A complete H&P
  • A patient fall
  • Good patient education
  • A readmission
  • A missed diagnosis
  • A short ED visit

32
Now that you have selected and defined your
measures, it is time to head out, cast your net
and actually gather some data!
[Cartoon: the USS Data Collector casting its net for data]
33
Key Data Collection Strategies
  • Stratification
  • Separation & classification of data according to
    predetermined categories
  • Designed to discover patterns in the data
  • For example, are there differences by shift, time
    of day, day of week, severity of patients, age,
    gender or type of procedure?
  • Consider stratification BEFORE you collect the
    data
  • Sampling Strategies
  • Systematic sampling
  • Simple random sampling
  • Stratified sampling
  • Proportional stratified random sampling
  • Quota sampling
  • Judgment sampling
  • Convenience sampling
  • Knowledge of sampling techniques will make data
    collection much easier and less time consuming

34
Examples of Stratification Problems
35
The Relationships Between a Sample and the
Population
Population
What would a good sample look like?
Negative Outcome
Positive Outcome
36
The Relationships Between a Sample and the
Population
Population
A representative sample
A
Negative Outcome
Positive Outcome
Ideally a good sample will have the same shape
and location as the total population but have
fewer observations (curve A).
37
Sampling Bias
Population
C
B
A negatively biased sample
A positively biased sample
A
Negative Outcome
Positive Outcome
But a sample improperly pulled could result in a
positive sampling bias (curve B) or a negative
sampling bias (curve C).
How do you draw your samples?
38
Sampling Methods
  • Probability Sampling Methods
  • Simple random sampling
  • Stratified random sampling
  • Stratified proportional random sampling
  • Systematic sampling
  • Cluster sampling
  • Non-probability Sampling Methods
  • Convenience sampling
  • Quota sampling
  • Judgment sampling

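The probability methods above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the presentation; the function names and the use of Python's random module are my assumptions.

```python
import random

def simple_random_sample(population, n, seed=None):
    """Simple random sampling: every subset of size n is equally likely."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def systematic_sample(population, n):
    """Systematic sampling: take every k-th record, where k = len/n."""
    k = max(1, len(population) // n)
    return population[::k][:n]

def proportional_stratified_sample(strata, n, seed=None):
    """Proportional stratified random sampling: strata is a dict of
    stratum name -> records; each stratum contributes in proportion
    to its share of the population."""
    rng = random.Random(seed)
    total = sum(len(records) for records in strata.values())
    sample = {}
    for name, records in strata.items():
        k = round(n * len(records) / total)
        sample[name] = rng.sample(records, min(k, len(records)))
    return sample
```

For example, with strata of 20 Peds and 80 Medical records and n = 10, the proportional draw takes 2 from Peds and 8 from Medical.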
39
Sampling Options
Simple Random Sampling: draw the sample directly from the whole population (an enumerative approach).
Proportional Stratified Random Sampling: split the population into strata (e.g., Surgical, Medical, Peds, OB) and draw from each in proportion to its size (also an enumerative approach).
Judgment Sampling: use knowledge of the process to decide what to sample each period (e.g., Jan through June); an analytic approach.
40
Exercise Operational Definitions
  • Select a process
  • Identify only one measure for this process
  • Develop an operational definition for this one
    measure that is clear and unambiguous
  • Offer recommendations on how you would collect
    data for this measure
  • Use the Operational Definition Worksheet to
    record your work
  • Spend about 15 minutes on this exercise then
    report out as groups

41
Voila!
You have performance data. Now what the heck do
you do with it?
42
The process of turning data into information for
decision-making
Deductive Phase (general to specific)
Theory and Prediction
Inductive Phase (specific to general)
Source: R. Lloyd, Quality Health Care, 2004, p. 153.
43
"If I had to reduce my message for management to
just a few words, I'd say it all had to do with
reducing variation." W. Edwards Deming
44
The Problem
Aggregated data presented in tabular formats or
with summary statistics will not help you
measure the impact of process improvement/redesign
efforts. Aggregated data can only lead to
judgment, not to improvement.
45
Courtesy of Richard Lendon, Clinical Lead for
High Impact Changes, NHS, UK
46
Hospital Deaths
Courtesy of Richard Lendon, Clinical Lead for
High Impact Changes, NHS, UK
47
Performance Improvement Data Chest Pain in the ED
I know those of you in the back of the room
can't read these numbers and I apologize, but let
me summarize what they are supposed to tell you.
48
Cycle time results for units 1, 2 and 3
Unit 2

49
18 weeks referral to treatment time
Looking good? Is it time to celebrate?
Data source: No Delays Achiever, www.institute.nhs.uk/nodelaysachiever
50
Are these distributions the same or different?
N = 84 patients
N = 84 patients
Source: Commissioning for Patient Pathways Guide, www.institute.nhs.uk/nodelaysachiever/commissioning
51
Look at the distribution and referral to
treatment time data as well. Don't hide behind
a summary percentage.
Data source: No Delays Achiever, www.institute.nhs.uk/nodelaysachiever
52
Finding the knowledge that lives in data!
  • Healthcare professionals have a very strong
    precedent for using aggregated data, summary
    statistics and tabular formats to analyze and
    portray the variation that lives in their data.
  • The visual display of data, however, provides a
    more revealing way not only to test theories
    about the data and gain new knowledge but also to
    understand the true variation that lives in the
    data.
  • Consider the four data sets shown on the next
    page and answer the following questions
  • Are the data sets the same or different?
  • What variation exists in these data sets?
  • How would you summarize the relationships
    between the X/Y pairs?

Sources for this exercise: Anscombe, F. J.,
"Graphs in Statistical Analysis," American
Statistician, vol. 27, February 1973, pp. 17-21; and The
Improvement Handbook: Models, Methods and Tools
for Improvement, API, Austin, TX, January 2005.
53
Anscombe's Four Data Sets: Are these data sets the
same or different?
54
Statistical Summary of Four Data Sets
  • Each data set has 11 data points for variables X
    and Y
  • Each data set has the same averages for the Xs
    (9.0) and the Ys (7.5)
  • Each data set has the same correlation
    coefficient for X and Y (r = .82)
  • Each data set has the same least squares
    regression equation
  • (Y = 3.0 + 0.5X, with r² = .667 and the
    standard error = 1.24)

So, do you conclude that the four data sets are
the same or different? They all produce the
same results. Look at the scatterplots produced
by these four data sets on the next page. What
conclusions do you make now?
55
Scatterplots of Anscombe's Four Data Sets
56
Average CABG Mortality Before and After the
Implementation of a New Protocol
WOW! A significant drop from 5% to 4%!
[Bar chart of Percent Mortality: Time 1 about 5.0, Time 2 about 4.0]
Conclusion: The protocol was a success!
A 20% drop in the average mortality!
57
Average CABG Mortality Before and After the
Implementation of a New Protocol: A Second Look at
the Data
[Control chart of Percent Mortality over 24 months: UCL = 6.0, CL = 4.0, LCL = 2.0; the protocol was implemented mid-series]
Now what do you conclude about the impact of the
protocol?
58
The average of a set of numbers can be created by
many different distributions.
[Charts: several different distributions of a Measure over Time, all sharing the same centerline (CL)]
59
Sometimes gathering data can bring new and
surprising knowledge!
60
And sometimes you discover that what you see does
not match reality!
Count the Black Dots! How many do you see?
61
If you don't understand the variation that lives
in your data, you will be tempted to ...
  • Deny the data ("It doesn't fit my view of
    reality!")
  • See trends where there are no trends
  • Try to explain natural variation as special
    events
  • Blame and give credit to people for things over
    which they have no control
  • Distort the process that produced the data
  • Kill the messenger!

62
The goal is to make sense out of the data!
How many legs does this elephant have?
63
The key
to understanding quality performance, therefore,
lies in understanding variation over time, not in
preparing aggregated data and calculating summary
statistics!
64
How can I depict variation?
STATIC VIEW: Descriptive Statistics: Mean, Median,
Mode; Minimum/Maximum/Range; Standard
Deviation; Bar graphs/Pie charts
DYNAMIC VIEW: Run Chart and Control Chart (plot data
over time)
65
What is the variation in one system over time?
Walter A. Shewhart, early 1920s, Bell
Laboratories
[Charts contrasting static views (distributions) with the dynamic view (data plotted over time with control limits)]
  • Every process displays variation
  • Controlled variation
  • stable, consistent pattern of variation
  • chance, constant causes
  • Special cause variation
  • assignable
  • pattern changes over time
66
Types of Variation
  • Common Cause Variation
  • Is inherent in the design of the process
  • Is due to regular, natural or ordinary causes
  • Affects all the outcomes of a process
  • Results in a stable process that is predictable
  • Also known as random or unassignable variation
  • Special Cause Variation
  • Is due to irregular or unnatural causes that are
    not inherent in the design of the process
  • Affects some, but not necessarily all, aspects of
    the process
  • Results in an unstable process that is not
    predictable
  • Also known as non-random or assignable variation

67
Is this common cause or special cause?
Courtesy of Richard Lendon, Clinical Lead for
High Impact Changes, NHS, UK
68
Is this common cause or special cause?
Courtesy of Richard Lendon, Clinical Lead for
High Impact Changes, NHS, UK
69
A demonstration of Common & Special Causes of
Variation
70
A classic example of common and special causes of
variation!
71
Point
Common Cause does not mean "Good Variation." It
only means that the process is stable and
predictable. For example, if a patient's
systolic blood pressure averaged around 165 and
was usually between 160 and 170 mmHg, this might
be stable and predictable but completely
unacceptable. Similarly, Special Cause variation
should not be viewed as "Bad Variation." You
could have a special cause that represents a very
good result (e.g., a low turnaround time), which
you would want to emulate. Special Cause merely
means that the process is unstable and
unpredictable.
You have to decide if the output of the process
is acceptable!
72
Normal Sinus Rhythm (a.k.a. Common Cause
Variation)
Atrial Flutter Rhythm (a.k.a. Special Cause
Variation)
73
SPC in a
74
How do we analyze variation for quality
improvement?
  • Run and Control Charts are the best tools to
    determine if our improvement strategies have had
    the desired effect.

75
The SPC Pioneers
Walter Shewhart (1891 1967)
Joseph Juran (1904 - )
W. Edwards Deming (1900 - 1993)
76
A Simple Improvement Plan
  • Which process do you want to improve or redesign?
  • Does the process contain non-random patterns or
    special causes?
  • How do you plan on actually making improvements?
    What strategies do you plan to follow to make
    things better?
  • What effect (if any) did your plan have on the
    process performance?

Run & Control Charts will help you answer
Questions 2 & 4. YOU need to figure out the
answers to Questions 1 & 3.
77
How many data points?
Typically you should have between 15 and 20 data
points before constructing a chart:
15-20 patients? 15-20 days? 15-20 weeks?
15-20 months? 15-20 quarters?
78
Elements of a Run Chart
The centerline (CL) on a Run Chart is the Median.
[Run chart: a Measure plotted over Time around the median centerline (CL)]
Four simple run rules are used to determine if
special cause variation is present.
79
The Centerline on a Run Chart
Measures of Central Tendency: Mean? Median? Mode?
Mean = the arithmetic average of the data
Median = the middle value of the ordered data
Mode = the most frequent value
80
Why the Median rather than the Mean?
(n + 1)/2 = Median Position, which leads you to
the Median Value
  • 8, 10, 11, 14, 16, 18, 20: Mean = 13.8;
    Median = 14
  • 8, 10, 11, 14, 16, 18, 95: Mean = 24.5;
    Median = 14
  • 1, 10, 11, 14, 16, 18, 20: Mean = 12.8;
    Median = 14
81
How do you compute the Median when you have an
even number of data points?
(n + 1)/2 = Median Position, which leads you to
the Median Value (average the two middle values)
  • 8, 10, 11, 14, 16, 18, 20, 35: Mean = 16.5;
    Median = 15
  • 8, 10, 11, 14, 16, 18, 30, 95: Mean = 25.3;
    Median = 15
  • 1, 10, 11, 14, 14, 18, 19, 20: Mean = 13.4;
    Median = 14
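The median rule above, including the even-n case, fits in a few lines of Python. A minimal sketch; the function name is illustrative.

```python
def median(values):
    """Run-chart centerline: the middle of the ordered data.
    For an even number of points, average the two middle values."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2:                      # odd n: single middle value
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2   # even n
```

Running it on the slide's examples reproduces the medians of 14, 15 and 14.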
82
How do we count the number of runs?
What is a Run?
  • One or more consecutive data points on the same
    side of the Median
  • Do not include data points that fall on the Median
  • Draw a circle around each run and count the
    number of circles you have drawn
  • Count the number of times the sequence of data
    points crosses the Median and add 1

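The counting rule above (skip points on the median, then count crossings of the median and add 1) can be sketched directly. An illustrative sketch, not the presenters' software:

```python
def count_runs(values, centerline):
    """Count runs about the median: points exactly on the centerline are
    skipped; the number of runs = number of crossings + 1."""
    sides = [v > centerline for v in values if v != centerline]
    if not sides:
        return 0
    crossings = sum(1 for a, b in zip(sides, sides[1:]) if a != b)
    return crossings + 1
```

For example, [3, 5, 5, 2, 4, 6, 1] with a median of 4 has runs below/above/below/above/below, i.e. 5 runs.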
83
Run Chart Medical Waste How many runs are on
this chart?
Points on the Median (don't count these when
counting the number of runs)
84
Run Chart Medical Waste How many runs are on
this chart?
14 runs
Points on the Median (don't count these when
counting the number of runs)
85
Rules to Identify non-random patterns in the data
displayed on a Run Chart
  • Rule 1: A shift in the process, or too many data
    points in a run (6 or more consecutive points
    above or below the median)
  • Rule 2: A trend (5 or more consecutive points
    all increasing or decreasing)
  • Rule 3: Too many or too few runs (use a table to
    determine this one)
  • Rule 4: An astronomical data point

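Rules 1 and 2 above can be checked mechanically. A hedged Python sketch, assuming the slide's thresholds (6 points for a shift, 5 for a trend) and the later note that points on the median are skipped and ties neither add to nor cancel a trend:

```python
def has_shift(values, centerline, length=6):
    """Run-chart Rule 1: 'length' or more consecutive points on the
    same side of the median (points on the median are skipped)."""
    run, last = 0, None
    for v in values:
        if v == centerline:
            continue  # on the median: neither breaks nor extends a run
        side = v > centerline
        run = run + 1 if side == last else 1
        last = side
        if run >= length:
            return True
    return False

def has_trend(values, length=5):
    """Run-chart Rule 2: 'length' or more consecutive points all
    increasing or all decreasing; ties do not cancel or add."""
    up = down = 1
    for a, b in zip(values, values[1:]):
        if b > a:
            up, down = up + 1, 1
        elif b < a:
            up, down = 1, down + 1
        # b == a: tie, leave both counters unchanged
        if up >= length or down >= length:
            return True
    return False
```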
86
Non-Random Rules for Run Charts
A Shift: 6 or more
A Trend: 5 or more
Too many or too few runs
An astronomical data point
Source: The Data Guide, by L. Provost and S.
Murray, Austin, Texas, February 2007, pp. 3-10.
87
Rule 3: Too few or too many runs
Use this table by first calculating the number of
"useful observations" in your data set: subtract
the number of data points on the median from the
total number of data points. Then find this number
in the first column. The lower number of runs is in
the second column; the upper number of runs is in
the third column. If the number of runs in your
data falls below the lower limit or above the
upper limit, this is a signal of a special cause.

# of Useful Observations | Lower Number of Runs | Upper Number of Runs
15 | 5 | 12
16 | 5 | 13
17 | 5 | 13
18 | 6 | 14
19 | 6 | 15
20 | 6 | 16
21 | 7 | 16
22 | 7 | 17
23 | 7 | 17
24 | 8 | 18
25 | 8 | 18
26 | 9 | 19
27 | 10 | 19
28 | 10 | 20
29 | 10 | 20
30 | 11 | 21

Total useful observations = total data points - points on the median
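The table lends itself to a simple lookup. A sketch with the table values transcribed from this slide; `RUN_LIMITS` and `runs_signal` are illustrative names.

```python
# Lower/upper run counts for 15-30 useful observations,
# transcribed from the table on this slide.
RUN_LIMITS = {
    15: (5, 12), 16: (5, 13), 17: (5, 13), 18: (6, 14), 19: (6, 15),
    20: (6, 16), 21: (7, 16), 22: (7, 17), 23: (7, 17), 24: (8, 18),
    25: (8, 18), 26: (9, 19), 27: (10, 19), 28: (10, 20), 29: (10, 20),
    30: (11, 21),
}

def runs_signal(useful_observations, observed_runs):
    """True if the run count signals a special cause
    (too few or too many runs for the number of useful observations)."""
    lower, upper = RUN_LIMITS[useful_observations]
    return observed_runs < lower or observed_runs > upper
```

This matches the worked examples later in the deck: 27 useful observations with 14 runs is no signal (10-19 expected), while 22 useful observations with only 6 runs is a special cause (7-17 expected).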
88
Rule 4 An Astronomical Data Point
What do you think about this data point? Is it
astronomical?
89
The Shewhart chart confirms the extreme nature of
this data point. It is statistically different
from all the rest of the points.
90
Run Chart (Medical Waste): Are there any non-random
patterns present?
Total data points = 29. Data points on the Median
= 2. Number of useful observations = 27 (should
have between 10 and 19 runs). The number of runs
= 14 (number of times the data line crosses the
Median + 1 = 13 + 1 = 14).
Points on the Median (don't count these as
useful observations)
91
Here's a question for you
Do run charts allow us to distinguish between
common and special causes of variation?
Not really! They enable us to determine if
there is statistical evidence of non-random
patterns, but they do not allow us to actually
distinguish between common and special causes of
variation. For this you need Shewhart charts
with UCLs and LCLs.
92
Using a Run Chart to Track Improvement in Pre-Admission
Testing
  • The Problem
  • Too few same day surgery admissions (SDSAs) going
    through Pre-Admission Testing (PAT).
  • Operational Definitions
  • Percent of SDSAs seen in PAT is calculated by
    dividing the number of patients seen in the
    peri-operative care unit on the day of surgery
    who were prepared by the PAT staff, by the total
    number of eligible SDSAs.
  • Eligible SDSAs are defined as adults who elect
    to have orthopedic, general surgical or OB/GYN
    procedures.
  • Improvement Strategies
  • Create a centralized PAT area
  • Revise forms used by the physician offices
  • Collect and track data on a daily basis
  • Follow up with patients
93
Run Chart for Pre-Admission Testing
Percent SDSAs seen in PAT
94
Run Chart for Pre-Admission Testing
Percent SDSAs seen in PAT
95
Using a Run Chart to Track Renal Failure in
Cardiac Patients
  • Cardiac surgery patients were placed on a fast
    track recovery process
  • Were there negative impacts of this change?
  • Surgeons were asked, "Do you see a difference
    with the outcomes?" They all responded "No."
  • Yet the data show a very different picture:
    too few runs plus an upward trend starting Oct
    1992.

96
How many runs are on this chart? Are there any
non-random patterns?
Patients with cardiac surgery from January 1-2,
1988 to February 1, 1996. The dates shown on the
x-axis are the dates the group ended. Only every
other group is labeled. Each point on the chart
represents 299 patients (7,475 total patients in
25 points on the chart). Source: Page and
Washburn, "Tracking Data to Find Complications
that Physicians Miss," Joint
Commission Journal on Quality Improvement,
October 1997, p. 153.
97
An Upward Trend
Change to Fast Track
10 data points in this run
11 data points in this run
Too few runs plus an upward trend starting Oct
1992.
Patients with cardiac surgery from January 1-2,
1988 to February 1, 1996. The dates shown on the
x-axis are the dates the group ended. Only every
other group is labeled. Each point on the chart
represents 299 patients (7,475 total patients in
25 points on the chart). Note that only 4 runs
exist whereas 9-17 were expected. Source: Page
and Washburn, "Tracking Data to Find
Complications that Physicians Miss,"
Joint Commission Journal on Quality Improvement,
October 1997, p. 153.
98
Length of Stay for COPD
  • Is this a Run Chart? If not, what is it?

99
Let's make it a Run Chart!
  • Find the Median
  • Determine the useful observations
  • Apply the 4 run test rules

Finding the Median: (N + 1) / 2 = Median Position.
33 data points with 2 on the median; therefore we
have 31 useful observations.
100
Now, let's analyze the Run Chart!
  • Find the Median
  • Determine the useful observations
  • Apply the 4 run test rules

How many runs on this chart? Are any special
causes present?
101
Identifying the number of runs
  • Find the Median
  • Determine the useful observations
  • Apply the 4 run test rules

12 runs (should be between 11 and 21 runs). Are
there 8 or more data points in a run above or below
the median? Are there 7 data points constantly
increasing? Do we see 14 data points constantly
switching back and forth (up, down, up, down,
etc.)?
102
Run Chart on Oral Surgery Referral Process
104
Why are Shewhart Charts preferred over Run Charts?
  • Because Control Charts
  • Are more sensitive than run charts
  • A run chart cannot detect special causes that are
    due to point-to-point variation (median versus
    the mean)
  • Tests for detecting special causes can be used
    with control charts
  • Have the added feature of control limits, which
    allow us to determine if the process is stable
    (common cause variation) or not stable (special
    cause variation).
  • Can be used to define process capability.
  • Allow us to more accurately predict process
    behavior and future performance.

105
Elements of a Shewhart Chart
[Control chart: a Measure plotted over Time with an Upper Control Limit (UCL), a centerline (CL), and a Lower Control Limit (LCL); a point beyond a limit is an indication of a special cause]
106
Sigma Limits ≠ standard deviations

The UCL and LCL are known as Sigma Limits
(SLs). They ARE NOT standard deviations! The
standard deviation (sd) is a statistic of a fixed
distribution. Sigma Limits (SLs) are parameters
of a process that is changing over time.
If you calculate a standard deviation (sd),
multiply it by 3 and then add and subtract this
value from the average, you will get the wrong
control limits! You usually need SPC software
to make control charts correctly!
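For an individuals (XmR) chart, correct limits come from the average moving range rather than the standard deviation, which is exactly why the plain 3-sd calculation goes wrong. A sketch of the conventional XmR formula (the constant 2.66 = 3/d2 for subgroups of size 2); this illustrates the standard method, not the presenters' software:

```python
from statistics import mean

def xmr_limits(values):
    """Individuals (XmR) chart limits: CL +/- 2.66 * mean moving range.
    2.66 = 3 / d2, where d2 = 1.128 for moving ranges of size 2."""
    cl = mean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = mean(moving_ranges)
    return cl - 2.66 * mr_bar, cl, cl + 2.66 * mr_bar
```

For [10, 12, 11, 13, 12, 14] the mean moving range is 1.6, giving limits of roughly 12 ± 4.26, noticeably different from mean ± 3 sd.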
107
The choice of a Control Chart depends on the Type
of Data you have collected
Variables Data
Attributes Data
  • Defects (occurrences only) = nonconformities
  • Defectives (occurrences plus non-occurrences) = nonconforming units
108
There Are 7 Basic Control Charts
  • X-bar & R chart
  • (average & range chart)
  • X-bar & S chart
  • (average & SD chart)
  • XmR chart
  • (individuals & moving range chart)

Variables Charts
Attributes Charts
  • p-chart
  • (proportion or percent of defectives)
  • np-chart
  • (number of defectives)
  • c-chart
  • (number of defects)
  • u-chart
  • (defect rate)

109
The Control Chart Decision Tree
Decide on the type of data.
Variables Data:
  • More than one observation per subgroup? No: XmR chart.
  • Yes: fewer than 10 observations per subgroup? Yes: X-bar & R chart. No: X-bar & S chart.
Attributes Data:
  • Counting occurrences AND non-occurrences (defectives)? Yes: are the subgroups of equal size? Yes: np-chart. No: p-chart.
  • Counting occurrences only (defects)? Is there an equal area of opportunity? Yes: c-chart. No: u-chart.
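The decision tree can be written as a small function whose arguments mirror its questions. The argument names and return strings are illustrative assumptions, not part of the presentation.

```python
def choose_chart(data_type, obs_per_subgroup=1, counts_nonoccurrences=False,
                 equal_subgroups=False, equal_opportunity=False):
    """Walk the control chart decision tree (a sketch)."""
    if data_type == "variables":
        if obs_per_subgroup == 1:
            return "XmR"                       # individuals data
        if obs_per_subgroup < 10:
            return "X-bar & R"                 # small subgroups
        return "X-bar & S"                     # 10+ per subgroup
    # attributes data
    if counts_nonoccurrences:                  # defectives
        return "np-chart" if equal_subgroups else "p-chart"
    # defects (occurrences only)
    return "c-chart" if equal_opportunity else "u-chart"
```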
110
Key Terms for Control Chart Selection
111
Is it an XmR (I) or X bar S?
112
The choice of a control chart depends on the
question you are trying to answer!
113
Developing Zones on a Shewhart chart
[Chart: the centerline (CL) with bands at +/-1 sigma limit (Zone C), +/-2 SL (Zone B), and +/-3 SL (Zone A, bounded by the UCL and LCL), for a Measure of Quality plotted over Time]
114
Rules to Identify Special Causes on Shewhart
Charts
There are many rules to detect special cause.
The following five rules are recommended for
general use and will meet most applications of
Shewhart charts in healthcare.
Rule 1: 1 point outside the +/- 3 sigma
limits. Rule 2: 8 successive consecutive points
above (or below) the centerline. Rule 3: 6 or
more consecutive points steadily increasing or
decreasing. Rule 4: 2 out of 3 successive
points in Zone A or beyond. Rule 5: 15
consecutive points in Zone C on either side of
the centerline.
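Rules 1 and 4 follow directly from the zone definitions above. An illustrative Python sketch, assuming constant sigma limits for all points (real SPC software handles varying limits):

```python
def rule1(values, ucl, lcl):
    """Shewhart Rule 1: any point outside the 3-sigma limits.
    A point exactly on a limit is not outside it."""
    return any(v > ucl or v < lcl for v in values)

def rule4(values, cl, sigma):
    """Shewhart Rule 4: 2 out of 3 successive points in Zone A or
    beyond (more than 2 sigma from the centerline, same side)."""
    for window in zip(values, values[1:], values[2:]):
        above = sum(1 for v in window if v > cl + 2 * sigma)
        below = sum(1 for v in window if v < cl - 2 * sigma)
        if above >= 2 or below >= 2:
            return True
    return False
```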
115
Shewhart Rules for Special Causes
Note: Ties between two consecutive points do not
cancel or add to a trend. When Shewhart Charts
have varying limits due to varying numbers of
measurements within subgroups, Rule 3
should not be applied.
Note: A point exactly on a control limit is not
considered outside the limit. When there is not
a lower or upper control limit, Rule 1 does not
apply to the side missing the limit.
Note: A point exactly on the centerline does not
cancel or count towards a shift.
Hugging the centerline
When there is not a lower or upper control limit,
Rule 4 does not apply to the side missing a
limit.
116
Is there a Special Cause on this chart?
117
What special cause is on this chart?
118
The Chart: Percent of Primary C-sections, May 2004
to July 2006 (27 months)
Goal < 20%
Is this a shift in the process? Maybe it's an
upward trend?
[Chart annotations: periods June 05-Nov 05 and Dec 05-May 06; x-axis labels May 04, Jan 05, Jan 06, July 06]
119
u-chart
120
Wait Time to See the Doctor
Intervention
Where will the process go?
Baseline Period
Freeze the Control Limits and Centerline, extend
them and compare the new process performance to
these reference lines to determine if a special
cause has been introduced as a result of the
intervention.
121
Wait Time to See the Doctor
Intervention
Freeze the Control Limits and compare the new
process performance to the baseline using the
UCL, LCL and CL from the baseline period as
reference lines
A Special Cause is detected: a run of 8 or more
data points on one side of the centerline,
reflecting a shift in the process
Baseline Period

122
Wait Time to See the Doctor
Intervention
Make new control limits for the process to show
the improvement
Baseline Period
123
Case Study: How are we doing in A&E?
  • In the absence of Run or Control Charts:
  • How could you explain progress?
  • Could you highlight specifics?
  • Did the Emergency Care Collaborative work?
  • Did we hit 95% of patients waiting < 4 hours?
  • Isn't this good enough?
  • What do you expect? Perfection?

124
Run Chart on percent of patients waiting < 4 hours
in the A&E
Goal = 95%
Median
125
Determine the Number of Runs
Expect 8 < Runs < 17. We only have 6, therefore:
Special Cause.
[Chart annotation: 8 points above the median; y-axis: Percent]
126
Analyze the Run Chart
6 points above the median; 6 points below the median
Goal = 95%
24 total data points with 2 on the median, which
gives 22 useful observations. Rule 3: should be
7-17 runs, but we have only 6 runs.
127
Did we achieve the goal?
  • How were we doing in Nov 05?
  • Was this sustainable?
  • What can you tell about Dec 05 onwards
    performance based on history?

128
Now let's make a Shewhart Chart (p-chart)
Percent
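A p-chart's control limits vary with each subgroup's size n: p-bar +/- 3*sqrt(p-bar*(1-p-bar)/n). A sketch of that standard formula (the function name is illustrative); it shows why a slide like this needs per-point limits rather than one fixed band:

```python
from math import sqrt

def p_chart_limits(defectives, subgroup_sizes):
    """p-chart: centerline p-bar with 3-sigma limits that vary
    with each subgroup's size n; limits are clipped to [0, 1]."""
    p_bar = sum(defectives) / sum(subgroup_sizes)
    limits = []
    for n in subgroup_sizes:
        spread = 3 * sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - spread), min(1.0, p_bar + spread)))
    return p_bar, limits
```

For a single subgroup of 100 visits with 50 breaches, p-bar = 0.5 and the limits are 0.35 to 0.65.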
129
Don't panic!
  • Without understanding the process variation you
    might
  • Tell people to work harder!
  • Shout! Point fingers! Blame someone!
  • Appoint someone to manage the unmanageable!
  • Waste precious time and resources chasing single
    data points!

130
In the NHS this happens all the time
  • Waiting time breaches: admin staff chasing
    targets
  • A&E wait time targets: managing to the target
    instead of improving
  • Waiting 3 hours 59 minutes: very good!
  • Waiting 4 hours 1 minute: very bad!
  • What would it take to have 100% of all visits be
    < 4 hours? Is it possible?

131
Percent Baby Immunizations (<2 yr HEDIS
combinations)
[Bar chart by Individual Provider, 1-21: Target = 90%, Mean = 63%; y-axis 0-100%]
Only 4 providers (out of 21) are at or above the
target.
132
Linking Measurement to Improvement
133
Okay, I found the right chart in here, now all I
have to do is to find the right improvement
strategy!
PDSA
Sphere of Knowledge
134
The Key Measurement Questions
  • What measures are you actually going to put on
    run or control charts?
  • What is the operational definition for each of
    the measures?
  • How will you collect valid and reliable data?
  • What strategy do you have for analyzing and
    interpreting the variation that lives in the
    data?

135
Control Charts Don't Tell You
  • The reason(s) for a Special Cause
  • Whether or not a Common Cause process should be
    improved (Is the performance of the process
    acceptable?)
  • How the process should actually be improved or
    redesigned

136
Appropriate Management Response to Common &
Special Causes of Variation
Is the process stable?
  • YES (common cause variation only)
  • Right choice: change the process
  • Wrong choice: treat normal variation as a
    special cause (tampering)
  • NO (special cause variation present)
  • Right choice: investigate the origin of the
    special cause
  • Wrong choice: change the process
Consequences of making the wrong choice:
Increased variation!
Wasted resources!
137
Overreacting to a Special Cause The Wrong Choice!
138
ExerciseMaking the Wrong Choice
  • Think of a time when someone either
  • Treated normal variation as if it were a special
    cause (i.e., tampered with a common cause
    process)
  • Or, overreacted to a special cause
  • What was the measure?
  • How did the individual(s) overreact?
  • What were the consequences of the overreaction?
  • Spend about 15 minutes on this exercise then
    report out as groups

139
What we gain from academic studies is knowledge.
What we gain from experience is wisdom.
Mohandas Gandhi
140
Now you have to deal with the messiness of life!
[Path diagram: independent variables X1 to X5,
each with a residual term R1 to R5, influencing
the outcome Y (with residual RY), measured across
Time 1, Time 2 and Time 3.]
141
Now the Hard Work Begins!
142
What are we trying to Accomplish? (Our focus
today)
How will we know that a change is an improvement?
What change can we make that will result in
improvement?
The three questions provide the strategy; the
PDSA cycle provides the tactical approach to the
work.
Source: Langley, et al., The Improvement Guide,
1996.
143
Repeated Use of the PDSA Cycle
[PDSA ramp diagram: sequential building of
knowledge under a wide range of conditions, from
Hunches, Theories and Ideas through a Very Small
Scale Test, Follow-up Tests, Wide-Scale Tests of
Change and Implementation of Change, to Spread
and Changes That Result in Improvement, supported
by DATA.]
144
Guidance for Testing a Change Concept
  • A test of change should answer a specific
    question!
  • A test of change requires a theory and a
    prediction!
  • Test on a small scale and collect data over time.
  • Build knowledge sequentially with multiple PDSA
    cycles for each change idea.
  • Include a wide range of conditions in the
    sequence of tests.
  • Don't confuse a task with a test!

145
The Value of Failed Tests
  • I did not fail one thousand times; I found one
    thousand ways not to make a light bulb.
  • Thomas Edison

146
Learning from Failed Tests
147
Start Small: 1-3-5-All
  • 1 patient
  • 1 day
  • 1 admit
  • 1 physician

148
Why Test? Why Not Just Implement and then Spread?
  • Testing will
  • Increase the degree of belief
  • Document expectations
  • Build a common understanding
  • Evaluate costs and side-effects
  • Explore theories and predictions
  • Test ideas under different conditions
  • Enable learning and adaptation

149
Why a little more testing might have been a good
idea!
150
The Sequence for Improvement
  • Developing a change (theory and prediction)
  • Testing a change (test under a variety of
    conditions)
  • Implementing a change (make part of routine
    operations)
  • Spreading a change to other locations
151
Change is possible if we have the desire and
commitment to make it happen.
Mohandas Gandhi
152
The Primary Drivers of Improvement
  • Will: having the desire to change the current
    state to one that is better
  • Ideas: developing ideas that will contribute to
    making processes and outcomes better
  • Execution: having the capacity to apply CQI
    theories, tools and techniques that enable the
    execution of the ideas
153
Key Components Self-Assessment
How prepared is your organization?
  • Will (to change): Low / Medium / High
  • Ideas: Low / Medium / High
  • Execution: Low / Medium / High

All three components MUST be viewed together.
Focusing on one or even two of the components
will guarantee suboptimized performance. Systems
thinking lies at the heart of CQI!
154
Summary of Key Points
  • Understand why you are measuring performance
  • Improvement
  • Accountability
  • Research
  • Build skills in the area of data collection
  • Stratification
  • Sampling (probability versus non-probability)
  • Build knowledge on the nature of variation
  • Static view (enumerative statistics: measures
    of central tendency and measures of dispersion)
  • Dynamic view (analytic statistics)
  • Common and Special causes of variation
  • SPC with run and control charts

155
Summary of Key Points (continued)
  • Statistical Process Control (SPC)
  • Run Chart
  • One way to make a run chart
  • Data plotted over time
  • The median is the centerline
  • Four run chart rules are used to detect
    non-random patterns
  • Shewhart (Control) Charts
  • Numerous types of charts (based on types of
    data)
  • Data plotted over time
  • The mean is the centerline
  • Upper and Lower Control Limits (sigma limits)
  • Rules to identify special and common causes of
    variation
  • Process stability and capability

156
General References on Quality
  • The Improvement Guide A Practical Approach to
    Enhancing Organizational Performance. G. Langley,
    K. Nolan, T. Nolan, C. Norman, L. Provost.
    Jossey-Bass Publishers, San Francisco, 1996.
  • Quality Improvement Through Planned
    Experimentation. 2nd edition. R. Moen, T. Nolan,
    L. Provost, McGraw-Hill, NY, 1998.
  • The Improvement Handbook. Associates in Process
    Improvement. Austin, TX, January, 2005.
  • A Primer on Leading the Improvement of Systems.
    Don M. Berwick. BMJ, 312: 619-622, 1996.
  • Accelerating the Pace of Improvement - An
    Interview with Thomas Nolan, Journal of Quality
    Improvement, Volume 23, No. 4, The Joint
    Commission, April, 1997.

157
References on Measurement
  • Brook, R. et al. Health System Reform and
    Quality. Journal of the American Medical
    Association 276, no. 6 (1996): 476-480.
  • Carey, R. and Lloyd, R. Measuring Quality
    Improvement in Healthcare: A Guide to Statistical
    Process Control Applications. ASQ Press,
    Milwaukee, WI, 2001.
  • Langley, G. et al. The Improvement Guide.
    Jossey-Bass Publishers, San Francisco, 1996.
  • Lloyd, R. Quality Health Care: A Guide to
    Developing and Using Indicators. Jones and
    Bartlett Publishers, Sudbury, MA, 2004.
  • Nelson, E. et al. Report Cards or Instrument
    Panels: Who Needs What? Journal of Quality
    Improvement 21, no. 4, April 1995.
  • Solberg, L. et al. The Three Faces of
    Performance Measurement: Improvement,
    Accountability and Research. Journal of Quality
    Improvement 23, no. 3 (1997): 135-147.

158
References on Spread
  • Gladwell, M. The Tipping Point. Boston: Little,
    Brown and Company, 2000.
  • Kreitner, R. and Kinicki, A. Organizational
    Behavior (2nd ed.). Homewood, IL: Irwin, 1978.
  • Lomas, J., Enkin, M., Anderson, G. Opinion
    Leaders vs Audit and Feedback to Implement
    Practice Guidelines. JAMA 265(17), May 1, 1991:
    2202-2207.
  • Myers, D.G. Social Psychology (3rd ed.). New
    York: McGraw-Hill, 1990.
  • Prochaska, J., Norcross, J., DiClemente, C. In
    Search of How People Change. American
    Psychologist, September 1992.
  • Rogers, E. Diffusion of Innovations. New York:
    The Free Press, 1995.
  • Wenger, E. Communities of Practice. Cambridge,
    UK: Cambridge University Press, 1998.


159
Quality begins with intent, which is fixed by
management. W. E. Deming, Out of the Crisis,
p.5