Title: Northeast Ohio's Consumer Guide to Hospital Quality: A Case Study
1. Northeast Ohio's Consumer Guide to Hospital Quality: A Case Study
The Community Healthcare Coalition, Inc.
2. Agenda
- Background Overview
- Review of Methodology
- Sample Reports
- Strategy for Public Release
- Vision for Next Iteration of Consumer Guide (Version 2.0)
- Questions
3. Background Overview
4. Background and Overview: The Community Healthcare Coalition, Inc.
5. Background and Overview: CHC / EHPCO
- Founded in 1983 by 9 Canton employers to contain healthcare costs.
- More than 130 member companies domiciled in Ohio, West Virginia, Illinois, California, Iowa, New Hampshire and Pennsylvania.
6. Background and Overview: CHC / EHPCO
- Monitor and lobby state and federal healthcare legislation
- Engaged in group purchasing for healthcare benefits
- Manage PBM contract for over 130 companies, school districts and municipalities, covering nearly 400,000 lives.
- Community Initiatives
7. Background and Overview: Timeline
- High-Level Process of Hospital Guide Development (the original slide shows a timeline graphic spanning 8/03, 10/03, 12/03, 2/04, 4/04 and today; milestones are listed here without their exact dates):
  - Project Kick-off
  - Identified HealthShare as data partner
  - Initial data set received (CY2001)
  - Identified target conditions and measures
  - Reviewed initial data set and analyzed results
  - Refresh data set received (CY2002)
  - Developed Employer Version
  - Developed Consumer Version
  - Roll-Out Hospital Quality Guide
- Developing the Consumer Guide to Hospital Quality
has been a collaborative effort between CHC,
Mercer Human Resource Consulting, and HealthShare
Technology
8. Background and Overview: Experienced Group
- Key Players
- Mercer Human Resource Consulting
  - Global consulting firm with employees in over 140 cities and 40 countries, combining local knowledge with a worldwide presence to develop and implement market-leading solutions.
  - Extensive experience in developing solutions that focus on improving health care programs for large purchasers, providers and government agencies.
  - A leader in the development of national quality initiatives such as the Leapfrog Group, High Performance Networks, emerging models of Consumer Driven Health Care (CDHC) and the Care Focused Purchasing Initiative.
- HealthShare Technology, Inc.
  - HealthShare's information and/or approach is the basis for 6 other regional and national health plan efforts regarding tiered networks, hospital value indexes or hospital pay-for-performance plans.
  - Tufts Health Plan developed hospital tiers based on cost and quality for Massachusetts state employees, with varying employee co-payments dependent on the tier.
  - Health plans representing over 80 million members currently offer HealthShare's hospital comparison tool online at their web sites for consumer decisions pertaining to over 160 procedures/diagnoses.
9. Review of Methodology
10. Review of Methodology: Why MEDPAR Data?
- Data used to rank hospitals was provided by HealthShare Technology
- Medicare cost report data, MEDPAR, from 2002 (the most recent year for which data is available) was the basis for the rankings
- At present, there is no publicly available data pertaining to the commercially insured population in Ohio
- Correlation between hospital performance using Medicare data versus using all-payer data (inclusive of Medicare) is consistently strong, both for quality and resource use metrics. (See discussion on next page)
- MEDPAR data is accepted in the marketplace as a basis for objective ratings among healthcare providers; U.S. News and World Report also relies on MEDPAR data for its annual America's Best Hospitals edition.

NOTE: 2002 denotes the federal fiscal year (i.e., fourth quarter 2001 and the first three quarters of 2002).
11. Review of Methodology: Shortage of Public Data in Ohio
- The National Association of Health Data Organizations (NAHDO) monitors the reporting requirements for healthcare data, both mandatory and voluntary, within the U.S. (Sample below)
- Overall, 38 states have mandatory collection through a state agency.
12. Review of Methodology: MEDPAR Data is a Valid Predictor
- In order to evaluate how representative Medicare data is as a predictor of a hospital's relative performance across all patients, HealthShare conducted a study in two states, New York and Massachusetts.
- HealthShare compared Medicare discharges with all-payer discharges using 2002 publicly available data across all hospitals in NY and MA.
- After adjusting for severity of illness using RDRGs, Spearman Rank Correlation statistics were calculated. (See table below)

Note: Results using the Pearson Product-Moment Correlation statistic were very similar.
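To make the comparison concrete, a minimal sketch of the Spearman Rank Correlation computation follows. The hospital metric values are hypothetical, and the classic sum-of-squared-rank-differences formula is used (exact only when there are no ties):

```python
def spearman_rho(x, y):
    """Spearman rank correlation via the classic formula (assumes no ties)."""
    def rank(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for position, idx in enumerate(order, start=1):
            r[idx] = position
        return r

    rx, ry = rank(x), rank(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_squared / (n * (n * n - 1))

# Hypothetical risk-adjusted mortality rates (%) for seven hospitals, computed
# once on Medicare-only discharges and once on all-payer discharges.
medicare  = [2.1, 3.4, 1.8, 4.0, 2.9, 3.1, 2.5]
all_payer = [2.0, 3.6, 1.9, 3.8, 3.0, 2.8, 2.6]

print(f"Spearman rho = {spearman_rho(medicare, all_payer):.2f}")  # → 0.96
```

A rho near 1 means the two data sources order the hospitals almost identically, which is the property the study was testing.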
13. Review of Methodology: What Hospitals are Measured?
- Fifty-eight Ohio hospitals that were deemed relevant to the membership of the Community Healthcare Coalition (based on county) were ranked for a group of clinical conditions/procedures for which each had experience in 2002.
- It is important to note that comparisons were drawn among hospitals using ALL of the Ohio hospitals for which HealthShare has data, not merely those hospitals in the CHC dataset.
- Data pertaining to those Ohio hospitals that were not included in the CHC dataset will not be made available.
14. Review of Methodology: What Hospitals are Measured?
Hospitals in the following counties were measured and ranked: Ashland, Ashtabula, Columbiana, Cuyahoga, Erie, Geauga, Holmes, Huron, Lake, Lorain, Mahoning, Medina, Portage, Richland, Stark, Summit, Trumbull, Tuscarawas, Wayne. (Star locations on the accompanying map may not accurately reflect true geography.)
15. Review of Methodology: What Conditions and Treatments are Measured?
Hospitals were measured and ranked for these ten commonly measured conditions and/or procedures:
- General Surgery
  - Colon Surgery
- Orthopedic Surgery
  - Hip Replacement
  - Knee Replacement
- Pulmonary Disease
  - Pneumonia
  - Chronic Obstructive Pulmonary Disease (COPD)
- Cardiac Care
  - Abdominal Aortic Aneurysm Repair (AAA)
  - Heart Attack/Angioplasty
  - Coronary Artery Bypass Graft (CABG)
  - Carotid Artery Surgery
  - Cardiac Catheterization
16. Review of Methodology: How is the Data Risk-Adjusted?
- Before meaningful comparisons can be drawn between hospitals, the data must be risk-adjusted to account for the fact that some hospitals tend to handle sicker patients than others.
- With the exception of procedure volume, ALL of the aforementioned quality and resource use metrics have been risk-adjusted.
- Risk adjustment was accomplished through Refined Diagnosis Related Groups (RDRGs), which relate common patient characteristics such as diagnosis, procedures, age and gender to an expected consumption of hospital resources and length of stay.
- 1,178 RDRGs (510 Medical, 616 Surgical, 26 Early Death, 26 Other)
- The principal diagnosis or procedure, combined with the secondary diagnoses, places each patient into an RDRG which reflects their severity (minor, moderate, major, catastrophic).
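The RDRG grouper itself is proprietary, and the guide does not publish its adjustment arithmetic; still, the general idea of severity-based risk adjustment can be sketched as an observed-vs.-expected comparison, where each patient's severity group carries an expected outcome rate. All groupings, rates, and hospital names below are hypothetical:

```python
# Hypothetical discharges: (rdrg_severity, died). In the real grouper, the
# principal diagnosis/procedure plus secondary diagnoses map each patient to
# one of 1,178 RDRGs with a severity of minor/moderate/major/catastrophic.
discharges = {
    "Hospital A": [("minor", 0), ("minor", 0), ("major", 1), ("catastrophic", 1)],
    "Hospital B": [("minor", 0), ("moderate", 0), ("moderate", 1), ("major", 0)],
}

# Illustrative population-wide expected mortality per severity group.
expected_rate = {"minor": 0.01, "moderate": 0.03, "major": 0.10, "catastrophic": 0.40}

for hospital, cases in discharges.items():
    observed = sum(died for _, died in cases)
    expected = sum(expected_rate[severity] for severity, _ in cases)
    # An O/E ratio above 1 means worse-than-expected outcomes for this case mix.
    print(f"{hospital}: O/E = {observed / expected:.2f}")
```

This is why a hospital that treats sicker patients is not automatically penalized: its expected total rises with its case mix.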
17. Review of Methodology: What are the Key Metrics?
- For each clinical condition and/or procedure, hospitals were evaluated on the basis of multiple metrics.
- Some of the metrics address the quality of care while others capture the efficiency of resource utilization (Resource Use).
- The metrics chosen for these rankings are a sampling of those prevalent in the marketplace which Mercer and HealthShare advocate. There are other metrics by which hospitals could be measured and compared that are not incorporated in these rankings.
18. Review of Methodology: What are the Key Metrics?
- QUALITY METRICS
- (1) Procedure Volume: the total number of patients treated in 2002 (federal fiscal year)
- (2) Mortality: percentage of patients with a discharge disposition of expired (in-hospital mortality)
- (3) Major Complications: percentage of patients that have experienced one or more of the following quality indicators:
  - Pulmonary compromise
  - Urinary tract infection
  - Acute myocardial infarction after surgery
  - Pneumonia after surgery
  - GI hemorrhage
  - Wound infection
  - Venous thrombosis/pulmonary embolism
  - Adverse effects
  - Mechanical complications (malfunction of device/graft/implant)
- (4) Failure to Rescue: percentage of patients that die following the development of a complication. (The underlying assumption is that good hospitals may not be able to prevent complications, but they identify these complications quickly and treat them aggressively to prevent adverse outcomes such as death.)
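The four quality metrics above can be sketched as simple rates over discharge records. The records here are hypothetical, and note the differing denominators: mortality is computed over all patients, while failure to rescue is computed only over patients who developed a complication:

```python
# Hypothetical per-discharge records for one service line at one hospital.
# died=True means discharge disposition "expired"; complication=True means
# one or more of the listed quality indicators was coded for the stay.
records = [
    {"died": False, "complication": False},
    {"died": False, "complication": True},
    {"died": True,  "complication": True},
    {"died": False, "complication": False},
    {"died": False, "complication": True},
]

volume = len(records)                                  # (1) procedure volume
mortality = sum(r["died"] for r in records) / volume   # (2) in-hospital mortality
with_comp = [r for r in records if r["complication"]]
complication_rate = len(with_comp) / volume            # (3) major complications
# (4) failure to rescue: share of complicated cases that ended in death
failure_to_rescue = sum(r["died"] for r in with_comp) / len(with_comp)

print(volume, f"{mortality:.0%}", f"{complication_rate:.0%}", f"{failure_to_rescue:.0%}")
```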
19. Review of Methodology: What are the Key Metrics?
- RESOURCE USE METRICS
- (Informational use only; not reported in Consumer Guide Version 1.0)
- (1) Length of Stay (LOS): number of days the patient stays in the healthcare facility, calculated as the discharge date minus the admit date
- (2) Short LOS outliers: atypically short stays in the hospital, where atypical is the bottom 5% of all peer group hospitalizations
- (3) Long LOS outliers: atypically long stays in the hospital, where atypical is the top 5% of all peer group hospitalizations
- (4) Total Charges: average total charges of patients discharged within a specified service line.
- (5) Total Cost: average total costs of patients discharged within a specified service line. (Detail on next page)
- Note that total charges/costs have no direct relation to the actual reimbursement received by the hospital from Medicare
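A minimal sketch of the LOS metric and the outlier flags, assuming "bottom/top 5%" means the shortest and longest 5% of stays in the peer group (the stay values are hypothetical):

```python
from datetime import date

def los_days(admit, discharge):
    """(1) Length of stay = discharge date minus admit date, in days."""
    return (discharge - admit).days

# Hypothetical peer-group lengths of stay (days) for one service line.
stays = sorted([1, 2, 2, 3, 3, 3, 4, 4, 4, 5, 5, 5, 6, 6, 7, 7, 8, 9, 12, 25])

k = max(1, len(stays) // 20)   # 5% of 20 stays -> 1 stay on each tail
short_outliers = stays[:k]     # (2) short LOS outliers (bottom 5%)
long_outliers = stays[-k:]     # (3) long LOS outliers (top 5%)

print(los_days(date(2002, 3, 1), date(2002, 3, 5)))  # → 4
print(short_outliers, long_outliers)                 # → [1] [25]
```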
20. Review of Methodology: What are the Key Metrics?
- RESOURCE USE METRICS (cont.): Charges and Costs
- The approach is the basic ratio of costs to charges (RCC)
- RCCs are calculated for each hospital department using 2 data points from the specific hospital's Medicare Cost Report: (1) Charges, (2) Total Cost after indirect allocation
- Department-specific RCCs are applied against department charges for each patient to derive total and direct cost for each patient by department.
- Total costs include capital and exclude Direct Medical Education (DME).

RCC = Department Costs / Department Charges
Patient Cost for Dept = RCC × Patient Charges for Dept
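The two formulas above can be applied in a few lines. The department figures and patient charges are hypothetical; the real inputs come from each hospital's Medicare Cost Report:

```python
# Hypothetical department lines from a hospital's Medicare Cost Report.
dept_costs   = {"radiology": 400_000,   "lab": 150_000, "icu": 2_000_000}
dept_charges = {"radiology": 1_000_000, "lab": 500_000, "icu": 4_000_000}

# RCC = Department Costs / Department Charges, per department.
rcc = {dept: dept_costs[dept] / dept_charges[dept] for dept in dept_costs}

# Patient Cost for Dept = RCC x Patient Charges for Dept, summed across the
# departments a patient used during the stay.
patient_charges = {"radiology": 2_500, "icu": 18_000}
patient_cost = sum(rcc[dept] * charge for dept, charge in patient_charges.items())

print(f"Estimated patient cost: ${patient_cost:,.0f}")  # → $10,000
```

Because the RCC rescales each department's billed charges down to reported costs, the result approximates what the stay cost the hospital, not what anyone was reimbursed.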
21. Review of Methodology: How are the Hospitals Ranked?
- Every hospital was awarded a single quality ranking and a single resource use ranking for each of the ten conditions and/or procedures.
- In order to arrive at these single ratings, each metric (4 for quality and 5 for resource use) was scored as a 1, 2, 3 or 4, where 1 signifies the best performance and 4 signifies the worst. The score for each metric translated into a specified number of points depending on the desired weight of that metric.
- All of the metrics that support the resource use ranking were weighted equally (20% each).
- For the quality ranking, however, the four metrics carry different weights:
  - Procedure Volume 40%, Mortality 20%
  - Major Complications 30%, Failure to Rescue 10%
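A sketch of the weighted scoring, using the quality weights stated above. The score-to-points translation here (full weight for a score of 1, a quarter of it for a score of 4) is an illustrative assumption; the guide's exact translation table is not reproduced in this deck:

```python
# Quality-ranking weights from the slide above (points out of 100).
WEIGHTS = {"volume": 40, "mortality": 20, "complications": 30, "failure_to_rescue": 10}

def quality_points(scores):
    """scores: metric -> 1..4 (1 best, 4 worst). Returns points out of 100.

    Assumed linear translation: a score of 1 earns the metric's full weight,
    and a score of 4 earns a quarter of it.
    """
    return sum(WEIGHTS[metric] * (5 - score) / 4 for metric, score in scores.items())

# Example: strong procedure volume, weak complication control.
example = {"volume": 1, "mortality": 2, "complications": 4, "failure_to_rescue": 3}
print(quality_points(example))  # → 67.5
```

Whatever the exact translation, the structure is the same: per-metric scores are converted to weighted points and summed, and the point total determines the hospital's tier for that condition.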
22. Review of Methodology: How are the Hospitals Ranked?
- Procedure Volume was weighted more heavily than the other metrics because it has long been associated with better quality outcomes. While there are many reputable journal articles on this subject, the following had the greatest impact on the decision to weight this metric most heavily:
  - Birkmeyer, John D., M.D. "Hospital Volume and Surgical Mortality in the United States." The New England Journal of Medicine (April 11, 2002): 1128-1137.
  - Kizer, Kenneth W., M.D., M.P.H. "The Volume-Outcome Conundrum." The New England Journal of Medicine (November 27, 2003): 2159-2161.
- Failure to Rescue carried less weight than the other quality metrics, not because its validity as an indicator of quality is questionable, but rather because it is a novel way of describing quality.
23. Review of Methodology: How are the Hospitals Ranked?
- The points were summed across all of the metrics that impact resource use and SEPARATELY across all of the metrics that impact quality of care.
- The resource use and quality rankings are mutually exclusive.
- Finally, the hospitals were ranked (within each condition) based on these point totals, with possible scores ranging from 1 to 10, where 1 signifies the best overall performance and 10 signifies the worst.
- An example follows...
24. Review of Methodology: How are the Hospitals Ranked?
- The following table shows how we arrived at the QUALITY rank for Cardiac Catheterization performed at Hospital X.
- Hospital X earned 43 out of 100 possible points, placing it in tier 8 of 10 compared to all other hospitals that performed the procedure.
25. Review of Methodology: How are the Hospitals Ranked?
- The following table shows how we arrived at the RESOURCE USE rank for Cardiac Catheterization performed at Hospital X.
- Hospital X earned 64 out of 100 possible points, placing it in tier 5 of 10 compared to all other hospitals that performed the procedure.
26. How to Understand the Employer and Consumer Guide
27. Sample Reports: Why Two Versions of the Same Report?
- The ultimate goal was to provide a useful consumer tool comparing cost AND quality
- Due to managed care network discounts and the artificial nature of charge data, CHC decided to leave resource use out of the Consumer Version to avoid confusion and frustration
- However, CHC felt it important to provide resource use data to employers and hospital executives in the spirit of transparency and disclosure
- The Corporate Version is much more detailed, showing each hospital's score for every indicator. The Consumer Version rolls each score up to an overall rating that is easier for consumers to digest.
- Receipt of the Corporate Version is predicated upon participation in a one-hour training session to ensure understanding and responsible use
28. Strategy for Public Release of the Consumer Guide
29. Strategy for Public Release: Undisclosed to Ranked Hospitals
- Experience with failed voluntary reporting systems in other Ohio markets indicated that CHC's report should remain undisclosed to ranked hospitals until completed
- Also, CHC had a desire to avoid endless discussion and consensus building without eventual public release of the data
- Finally, CHC wanted to demonstrate to the provider community that the employer community is serious about and committed to transparency and disclosure in the healthcare industry
- CHC relied on the expertise of Mercer and HealthShare, as well as academic literature, to validate the data and methodology
30. Strategy for Public Release: Managing Public Relations
- Hired Strategy One, a local public relations firm, to lend strategic and practical advice and guidance
- The PR firm developed the release strategy:
  - Sent two nondescript press releases in advance of the public release announcing an upcoming press conference
  - Provided confidential copies of the report to trusted sources for advance story preparation
  - Allowed press access to both Corporate and Consumer Versions of the Report in the spirit of transparency
  - Focused on print and radio outlets rather than visual media outlets due to the academic nature of the story
31. Strategy for Public Release: Managing Public Relations
- Invitations to the Friday morning meeting were the first notice that hospital executives received of the report. Letters were sent certified, directly to the CEO, to ensure delivery and receipt.
32. Strategy for Public Release: An Assessment of the Strategy
- Despite the objections of the hospitals, CHC's strategy was successful
- Positive coverage of the guide in 5 major newspapers:
  - Akron Beacon Journal
  - Canton Repository
  - Youngstown Business Journal
  - Lorain Daily News
  - Warren Tribune
- Story on the Cleveland NPR affiliate
- Articles presented in an informative light, with limited negative quotes from hospital executives
33. Strategy for Public Release: Post-Release Activity
- Conducting one-on-one meetings with hospitals
- Conducting additional employer training meetings as requested
- Hosting a training session for practicing physicians and/or office staff in late July
- Founding a Quality Council for Northeast Ohio
  - In partnership with the Akron Regional Hospital Association
  - With the support of the Ohio Hospital Association
- The Quality Council will serve as the advisory and editorial board for all future versions of the Consumer Guide to Hospital Quality
34. Strategy for Public Release: Anticipated Use by Employers
- To date, CHC has distributed 19,463 copies of the Consumer Guide to employers for distribution to associates. Many employers also plan to release a PDF version via intranet or email
- Two common strategies by employers:
  - Immediate release of the Guides with an accompanying memo from the benefits manager
  - Delay distribution of the Guides until Fall to supplement open enrollment materials
- CHC provided members with a template FAQ document to address potential questions from associates such as:
  - Why do the results in this report differ from others that I have seen?
  - What if my in-network hospitals do not rank well?
35. Strategy for Public Release: Common Hospital Objections
- Frustration with the simultaneous release of the report to the hospitals and the media. Hospitals expressed a desire for an opportunity to peer review, or at least to have time to prepare a statement for the media
- Strong objections to volume being weighted at 40%
  - Concerns from small rural and community hospitals
  - Suggestion to set a volume threshold rather than reward mass production of procedures and treatments
- Concerns about potential misuse by patients
  - Emergency situations
  - Generalizing about care overall based on the total number of 5-star rankings
- Worry that consumers are becoming overwhelmed by multiple sources of hospital quality comparison, and about the potential for confusion over conflicting results
36. Vision for Next Iteration of Reports (Version 2.0)
37. Vision: Version 2.0
The vision for the next iteration of the Consumer Guide to Hospital Quality includes the following enhancements:
- Incorporate commercial payer data into the rankings; as a result, include more clinical conditions and/or procedures, especially OB/GYN and pediatric-related services
- Distribution in an interactive, online format as opposed to pamphlets
- Measure and rank all hospitals statewide
- Incorporate efficiency measures into the consumer version
- Measure physician quality/efficiency
38. Questions