Title: The Ohio Mental Health Consumer Outcomes Initiative
1. The Ohio Mental Health Consumer Outcomes Initiative
- An Overview
- Dee Roth, M.A.
- ODMH Outcomes Team
2. The ODMH Quality Agenda
- A key piece of the ODMH Quality Agenda
- [Diagram: Best Practices, Quality Improvement, Quality, Consumer Outcomes]
3. Potential Benefits of Outcomes
- Standardized evidence of performance
- Ongoing self-evaluation and improvement
- Cost-benefit analyses
- Addresses external requirements/demands
- Better long-term relationships with external stakeholders
- May differentiate the organization from its competitors
- Organizational survival
4. Outcomes Task Force (OTF)
- Origin
- Convened in September 1996 by Michael F. Hogan, Ph.D., Director of ODMH
- Charge
- Developing a statewide approach to measuring consumer outcomes in Ohio's publicly-supported mental health system
- Recommendations
- Vital Signs Report
5. Outcomes Task Force (OTF)
- Membership
- A culturally diverse group of 42 consumers, families, providers, boards, researchers and evaluators, and ODMH and ODADAS staff
- Tenure
- Met two days per month for 16 months
6. OTF Values
- Recovery philosophy drives service provision
- Providers and consumers share responsibility for an environment of hope and for service planning
- Services driven by consumer-identified needs and preferences
7. OTF Values
- Accurate information needed for continuous improvement of outcomes and for accountability
- Methodologically sound and cost-effective outcomes measurement
- Balance between improved information and reasonable implementation
8. OTF Assumptions
- Commonality
- A common set of desired outcomes is required for statewide measurement
- The ability to benchmark at both local and state levels is a critical component of the use of outcomes data for all stakeholders
- Without a standard set of measurements to capture outcomes, comparability across settings would be impossible to achieve
9. OTF Assumptions
- Integration with Other Data
- Outcomes data should be used with other data for continuous quality improvement
- Outcomes findings are indicators requiring further exploration and planning
- Availability
- All stakeholders should be able to use the
outcomes findings
10. OTF Assumptions
- Consumer Perspective
- Outcomes should be measured primarily from the consumer perspective
- Measures should complement the clinical judgment of practitioners
- Values-Based
- An incremental and innovative addition to Ohio's mental health system improvement
- Should be evaluated to ensure that it fulfills the OTF values
11. What's an Outcome?
- Indicators of health or well being for an
individual or family, measured by statements or
observed characteristics of the consumer/family,
not characteristics of the service system
12. Ohio Mental Health Outcomes System
- Clinical Status
- Level of symptom distress
- Ability to understand, recognize and manage/seek
help for symptoms, both physical and psychiatric
13. OTF Outcomes
- Quality of Life
- Satisfaction with areas of life
- Feeling a sense of overall fulfillment, purpose, hope and personal or parental empowerment
- Attainment of personal/family goals
- Family's sense of balance between providing care and participation in other life activities
14. Ohio Mental Health Outcomes System
- Functioning
- Using community resources to fulfill needs
- Developing and managing interpersonal relationships
- Activities of daily living
- Maintaining oneself independently
- Managing money
15. Ohio Mental Health Outcomes System
- Functioning
- Remaining in a home or family-like environment
- Engaging in meaningful activity
- Avoiding justice system involvement
- Role functioning
- Addictive/compulsive behaviors
16. Ohio Mental Health Outcomes System
- Safety and Health
- Self-harm or suicide attempts
- Harm or neglect in the person's environment
- Harm to others
- Physical health
17. Ohio Mental Health Outcomes System
- Safety and Health
- Medication concerns addressed
- Safety and health not threatened by disabilities,
discrimination or being treated with lack of
dignity
18. Instrument Review Criteria
- The OTF used the following criteria to screen and select outcome instruments:
- Direct and Indirect Cost
- Psychometric Properties
- Cultural Sensitivity
- Consistency with OTF Outcomes
- Consistency with Principles of CASSP (Child and Adolescent Service System Program, NIMH)
- Consistency with Principles of Consumer Recovery
19. Ohio Outcomes Implementation Pilot Coordinating Group (OIPCG)
- Membership
- A collaboration of 40 individuals representing consumers, families, providers, local community mental health/addiction boards, ODMH and others
- Tenure
- Met for 15 months in both plenary sessions and workgroups
20. OIPCG Mission
- Implement and test the OTF recommendations
- Design and test data flow processes and uses of data
- Based on findings of the pilot, make recommendations to ODMH
- Offer guidance to local systems on technical and process elements of implementation, including:
- Data flow
- Hardware and software
- Staff training
- Local and state data use
21. Guiding Principles
- Direct Care Staff Orientation
- The key to Outcomes Initiative success lies in
its ability to provide agency direct care staff
with timely and relevant information that can be
helpful in their work with consumers and families
22. Guiding Principles
- Clarity and Consistency
- Good data are facilitated by good data collection procedures and sources
- All materials produced for the Outcomes
Initiative should be clear, consistent and
packaged for ease of use
23. Guiding Principles
- Technological Achievability
- The Outcomes System should not require computer
technology beyond that already available in most
provider organizations for existing uses (e.g.,
MACSIS)
24. Evaluation Methodology
- Consumer/Family Surveys
- Brief surveys were administered at the first and second administrations to:
- Assess understanding of the instruments
- Assess reaction to the instruments and items
- Learn how the information might be useful
- Determine how providers had communicated about
the outcomes information
25. Evaluation Methodology
- Focus Groups with Provider Staff
- Focus groups were conducted regarding
- Project usefulness
- Perceived barriers to implementation
- Lessons learned
26. Evaluation Methodology
- Cost Determination
- Data were collected from pilot programs to determine:
- Amount of time spent with consumers
- Time and effort required for data flow
- Cost of instruments
- Technical costs (hardware and software)
27. Evaluation Methodology
- Psychometrics
- Psychometric analyses of data from the adult instruments were conducted to determine:
- Reliability/internal consistency
- Construct validity
28. Evaluation Results Highlights
- Consumer/Family Evaluations
- Useful
- Consumers and families were very clear and emphatic about a number of ways in which outcomes data can and should be used
- Very Understandable
- 70% of all respondents (n = 2,353) said the questions were "always" or "usually" easy to understand
- 8% said questions were "sometimes" or "never" easy to understand
29. Evaluation Results Highlights
- Consumer/Family Evaluations
- Good Comfort Level
- 60% of all respondents (n = 2,353) said they felt "very comfortable" or "somewhat comfortable" answering the questions
- 9% said they were "somewhat uncomfortable" or "very uncomfortable"
- Very Low Offensiveness
- No question was described as offensive by more
than two people
30. Evaluation Results Highlights
- Consumer/Family Evaluations
- Little Consumer/Staff Interaction
- Over half the respondents (n = 866) said someone talked to them about outcomes only "a little" or "not at all"
- Adult consumers reported having the least amount of outcomes conversation with staff
- Individuals who experienced outcomes not being used by staff were more negative
31. Evaluation Results Highlights
- Consumer/Family Evaluations
- Additional Feedback
- 302 people (13% of the total) wrote additional comments on the evaluation
- 35 negative comments
- 21 positive comments
- Parents completing the BERS were most negative
32. Evaluation Results Highlights
- Clinician/Administrator Focus Groups
- Value outcomes measurement
- Timely feedback is important
- Need specific data use training
- Low utility vs. high burden for some instruments
- Lack of integration between Outcomes and other
requirements
33. Evaluation Results Highlights
- Costs
- Instruments
- Adult instruments are free; copying costs only
- Two of the kids' instruments are proprietary (CAFAS and BERS); average cost per child/adolescent is $2.47 per year plus CAFAS training
34. Evaluation Results Highlights
- Costs
- Administration Time
- Administration time varies by instrument, from 5 minutes (Provider A) to 32 minutes (Consumer A)
- About half of adult SMD consumers need some assistance with filling out the survey
- Data Entry
- Data entry costs vary by method used
35. Evaluation Results Highlights
- Adult Instrument Psychometrics
- Reliability
- Reliabilities (Cronbach's α) for three sections of the Adult Consumer Instruments (formula below):
- Symptom Distress: α = .93 (n = 1,479)
- Quality of Life: α = .86 (n = 1,442)
- Making Decisions Empowerment Scale: α = .77 (n = 1,376)
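For readers who want the formula behind these reliability figures, Cronbach's α is the standard internal-consistency coefficient for a k-item scale; this is general background, not a calculation specific to the Ohio instruments:

```latex
% Cronbach's alpha for a k-item scale, where
%   \sigma_i^2 is the variance of item i and
%   \sigma_X^2 is the variance of the total scale score
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_X^2}\right)
```

By common rules of thumb, values of roughly .70 and above are considered acceptable, so the .77 to .93 range reported above indicates good to excellent internal consistency.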
36. Final Instruments
- Adults with Severe Disabilities (Group A)
- Ohio Mental Health Outcomes Survey: Adult Consumer Form A
- MHSIP Symptom Distress Scale
- Quality of Life Items from Lehman and Greenley
- Boston University Making Decisions Empowerment Scale
- Safety and Health Items
- Ohio Mental Health Outcomes Survey: Provider Adult Form A
- Functioning (Modified Multnomah Community Ability Scale)
- Safety and Health Items
37. Final Instruments
- Other Adults (Group B)
- Ohio Mental Health Outcomes Survey: Adult Consumer Form B
- MHSIP Symptom Distress Scale
- Quality of Life Items from Lehman and Greenley
38. Final Instruments
- Children and Adolescents
- The Ohio Youth Problem, Functioning and Satisfaction Scales
- Problem Severity
- Functioning
- Hopefulness
- Three Perspectives
- Parent
- Agency Worker
- Youth (Ages 12-18)
39. Administration Intervals
- Adults with Severe Disabilities (Group A)
- Adult Consumer Form A / Provider Adult Form A
- Intake
- 6 months
- 12 months
- Annually thereafter, or at termination,
whichever comes first
40. Administration Intervals
- Other Adults (Group B)
- Adult Consumer Form B
- Intake
- At or as close to termination as possible
41. Administration Intervals
- Children and Adolescents
- Ohio Scales
- Intake
- 6 months
- 12 Months
- Annually thereafter, or at termination,
whichever comes first
42. Using Outcomes Data
- Consumer
- Recovery
- Advocacy
- Provider
- Care Management and Treatment Planning
- Agency Quality Improvement
- Clinical Supervision
43. Using Outcomes Data
- Board
- Board-Area Monitoring
- Board-Area Quality Improvement
- State
- Statewide Benchmarking
- Statewide Quality Improvement
44. Rule 5122-28-04: Consumer Outcomes
- Replaces Service Evaluation Rule
- Effective September 4, 2003
- Applicable to most agencies providing services
with public dollars
45. Consumer Outcomes Timelines
- March 4, 2004: Collecting data
- September 4, 2004: Flowing production data to ODMH
- September 4, 2005: Evidence of use of Outcomes data in both treatment planning and agency performance improvement
46. Today's Outcomes Training
- Data use for direct care staff and consumers
- Tools available to help you
- Data flow process
47. The Ohio Mental Health Consumer Outcomes Initiative
- An Overview
- Dee Roth, M.A.
- ODMH Outcomes Team
48. The Ohio Mental Health Consumer Outcomes Initiative
- Tools for Implementation
- Leslie Brower, Ph.D., R.N.
- Stacy Keenan, M.S.
- ODMH Outcomes Team
49. Products and Resources
- The following products are available
- Consumer Outcomes Procedural Manual
- Outcomes Initiative Web Site
- Outcomes Instruments
- Implementation Planning Checklist
- Outcomes Toolkit
- Outcomes Implementation Update Newsletter
50. Products and Resources
- The following resources are available
- Data Flow Guide
- Data Entry and Reports Template
- User's Guide
- Data Reports
- Missing Data Report
- Comparative Reports
- Outcomes Workgroups
- Outcomes Support Team E-Mail
51. Procedural Manual
- Outcomes System Background Chapters
- Preface
- The Ohio Mental Health Consumer Outcomes System
- Outcomes Instruments and Administration Guidelines
- Users and Uses of Consumer Outcomes Data
52. Procedural Manual
- Outcomes Instrument Chapters
- Adult Consumer Form A
- Provider Adult Form A
- Adult Consumer Form B
- Ohio Youth Problem, Functioning, and
Satisfaction Scales Short Form (Ohio Scales)
53. Procedural Manual
- System Mechanics Chapters
- Processing Outcomes Data
- System Fidelity Checklist
- Additional Resources
- References
54. Procedural Manual
- Preface
- This chapter provides a general orientation to
the context of the Procedural Manual
55. Procedural Manual
- The Ohio Mental Health Consumer Outcomes System
- This chapter describes the structure and history
of the Ohio Mental Health Consumer Outcomes
System
56. Procedural Manual
- Outcomes Instruments and Administration Guidelines
- This chapter reviews the instruments selected for
inclusion in the Outcomes System and provides
guidelines for selecting and administering the
appropriate instrument(s)
57. Procedural Manual
- Users and Uses of Consumer Outcomes Data
- This chapter describes ways various constituent
groups can make use of the information provided
by the Outcomes System
58. Procedural Manual
- Instrument Chapters Include
- Focus and Intent
- Scales and Items
- Cautions and Qualifications
- Respondent Eligibility and Characteristics
- Administration Intervals Protocol
- Scoring
- Analysis and Interpretation
- How Can Data from the Instrument be Used?
- Psychometric Properties
- System Fidelity Checklist
- Copy of the Instrument
59. Procedural Manual
- Adult Consumer Form A
- This chapter describes the Outcomes instrument
that is used for adults with severe and
persistent mental illness
60. Procedural Manual
- Provider Adult Form A
- This chapter describes the Outcomes instrument
that is used by provider agency workers for
adults with severe and persistent mental illness
61. Procedural Manual
- Adult Consumer Form B
- This chapter describes the Outcomes instrument
that is used for adults with less severe
illnesses who seek mental health services for
resolution of short-term difficulties
62. Procedural Manual
- Ohio Youth Problem, Functioning and Satisfaction Scales Short Form (Ohio Scales)
- This chapter describes the Outcomes instruments
that are used for child and adolescent consumers,
their family members and their provider agency
workers
63. Procedural Manual
- Processing Outcomes Data
- This chapter briefly provides a general overview
and quick reference for the processing of
Outcomes System data
64. Procedural Manual
- System Fidelity Checklist - Appendix A
- This appendix provides a global checklist that
includes all system fidelity items identified in
instrument chapters of the Procedural Manual
65. Procedural Manual
- Additional Resources - Appendix B
- This appendix describes additional resources that
are available to individuals who are either
participating or simply interested in the Ohio
Mental Health Consumer Outcomes System
66. Procedural Manual
- References - Appendix C
- This appendix provides citations for articles,
publications and studies referenced elsewhere in
the Procedural Manual
67. Outcomes Web Site: http://www.mh.state.oh.us/initiatives/outcomes/outcomes.html
- Comprehensive repository for virtually all Outcomes resources (downloadable as PDF, Word and other common files)
- Join the e-mail list to get all the latest information and releases
- Check the status of data flow
- Obtain statewide reports
- Share ideas with other systems
68. Outcomes Web Site
69. Outcomes Instruments
- Electronic versions of all instruments
- Download and print instruments for local use
- Adult forms are free of charge
- Ohio Scales for youth are free to Ohio users; others may obtain them for a nominal charge
- Instruments available in Spanish, Russian,
Chinese, Japanese and Korean
70. Implementation Planning Checklist
- Developed by experienced boards and providers to guide planning
- Voluntary but highly recommended
- Flexible format for specific local needs
71. Implementation Planning Checklist (cont.)
- Encourages local collaboration among boards, providers, consumers and families
- Includes:
- Awareness
- Team building, readiness assessment
- Decision-making
- Testing, evaluation, revision
- Implementation
- Continuous improvement
72. Outcomes Toolkit
- Developed by experienced local boards, providers, consumers and families
- Distributed to all boards and agencies receiving Incentive Grants (2000)
- Includes educational materials to assist provider agency implementation
- All products available on the Web site
73. Outcomes Toolkit (cont.)
- Waiting room video
- "It's About You" youth video
- "Getting Results" brochure
- Can be labeled for local system or agency
- Direct Care Staff video
- Clinical Supervisor video
- "Climbing Into the Driver's Seat" curriculum
- For adult consumers (contact Ohio Advocates for Mental Health for training)
- Agency re-engineering manual
74. Products and Resources
- The following resources are available
- Data Flow Guide
- Data Entry and Reports Template
- User's Guide
- Data Reports
- Missing Data Report
- Comparative Reports
- Outcomes Workgroups
- Outcomes Support Team E-Mail
75. Data Flow Guide
- The Guide contains information about
- Preparing for Data Flow
- Selecting and implementing technology
- Integrating Outcomes into existing processes
- Staff responsibilities
- Creating Records and Files
- Data specifications
- Required, key, and warning fields (see the sketch after this list)
- Scoring
- Naming files
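To make the "Creating Records and Files" items concrete, here is a minimal sketch of record-level checks in Python. The field names and the required/warning groupings are hypothetical placeholders; the authoritative definitions are the ODMH Data Specifications described in the Guide.

```python
# Hypothetical record-level checks; the real field names and rules come from
# the ODMH Data Specifications in the Data Flow Guide.
REQUIRED_FIELDS = ["consumer_id", "provider_id", "instrument", "admin_date"]  # assumed
WARNING_FIELDS = ["item_01", "item_02"]  # assumed: missing values flagged, not rejected

def check_record(record: dict) -> tuple[list[str], list[str]]:
    """Return (critical_errors, warnings) for one Outcomes record."""
    critical = [f"missing required field: {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    warnings = [f"missing value: {f}" for f in WARNING_FIELDS if not record.get(f)]
    return critical, warnings

example = {"consumer_id": "000123", "provider_id": "P001",
           "instrument": "Adult Consumer Form A", "admin_date": "2004-03-04"}
print(check_record(example))  # -> ([], ['missing value: item_01', 'missing value: item_02'])
```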
76. Data Flow Guide
- The Guide contains information about
- Data Flow Testing Process
- Board-level processing, submitting files to ODMH
- Critical errors in files and records
- Receiving test results from ODMH
- Data Flow Production Process
- Submitting files to a board, ODMH
- Critical errors in files and records
- Receiving production results from ODMH
- Appendices
77. Data Entry and Reports Template
- Basic tool to support 3 functions
- Data entry and editing
- Data storage and exporting
- Reporting
- Microsoft Access 97/2000 application
- Download it from the Web site FREE!
- Template User's Guide available on the Web
78. Data Reports
- Missing Data Report
- Produced each quarter
- Percentages of consumers in the system who have at least one Outcomes rating for a specified period of time (see the sketch below)
- Allows agencies and boards to see how they are doing with regard to Outcomes implementation, in comparison with others both in and outside of their local area
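As an illustration of how the Missing Data Report percentage can be derived, here is a minimal sketch; the record layout, consumer list, and reporting period below are assumptions for illustration, not the report's actual specification.

```python
from datetime import date

# Assumed inputs: Outcomes administrations (consumer ID, date) and the set of
# consumers served during the reporting period.
ratings = [("A1", date(2004, 1, 15)), ("A1", date(2004, 2, 9)), ("A2", date(2004, 5, 2))]
consumers_served = {"A1", "A2", "A3", "A4"}

start, end = date(2004, 1, 1), date(2004, 3, 31)  # assumed reporting quarter
with_rating = {cid for cid, d in ratings if start <= d <= end}

pct = 100 * len(with_rating & consumers_served) / len(consumers_served)
print(f"{pct:.1f}% of consumers have at least one Outcomes rating this period")  # 25.0%
```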
80. Data Reports
- Comparative reports
- Produced each quarter
- Alternate between a "state of the state" report and a special topic report
- Provide constituents in the mental health system with statewide data that they can use to compare an individual's scores or average agency or board area scores
81. Data Reports
82. Outcomes Workgroups
- Continued work is needed to ensure that Outcomes data are used effectively
- Outcomes Data Reports Workgroup
- Outcomes Data Mart Committee
- Outcomes Initiative Evaluation Workgroup
83. Outcomes Workgroups
- Outcomes Data Reports Workgroup
- The multi-constituency Statewide Outcomes Data
Reports Workgroup has been convened to advise
ODMH on the format and protocols for reports
based on the statewide Consumer Outcomes database
84. Outcomes Workgroups
- Outcomes Data Mart Committee
- A statewide committee has been convened to advise ODMH on the format and content of a Data Mart based on the statewide Consumer Outcomes database
- Currently in the general design phase
85. Outcomes Workgroups
- Outcomes Initiative Evaluation Workgroup
- An evaluation workgroup will be convened to
develop the framework for an evaluation that will
guide any necessary revisions of the Outcomes
System
86. Outcomes Support Team
- We're here to help you!
- Stacy Keenan and Geoff Grove
- outcome@mh.state.oh.us
- 614-644-7840
87. The Ohio Mental Health Consumer Outcomes Initiative
- Tools for Implementation
- Leslie Brower, Ph.D., R.N.
- Stacy Keenan, M.S.
- ODMH Outcomes Team
88. The Ohio Mental Health Consumer Outcomes Initiative
- Mastering the Outcomes Data Flow Process
- Stacy Keenan, M.S.
- Geoff Grove, M.A.
- ODMH Outcomes Support Team
89. Goals of Session
- Describe the Outcomes data flow process
- Preparing for data flow
- Collecting and storing data at the local level
- Data flow testing process
- Production data flow
- Provide current status report on statewide
Outcomes data flow
90. Preparing for Data Flow
- Gather and review existing resources
- Review, select and implement technology
- Local decision
- Integrate Outcomes data flow into existing processes
- Determine staff responsibilities
- Who manages the data flow process?
- Who collects and enters data?
- Who transmits data and how often?
91. Data Collection and Storage
- Instrument is completed. Now what?
- Goal is to collect a consumer's responses and to store them electronically
- Create a database!
- Become familiar with the basics
- Build it according to ODMH Data Specifications
- Name files according to ODMH guidelines (see the sketch below)
- Test files begin with "t"
- Production files begin with "h"
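A minimal sketch of the "create a database" and file-naming steps, using SQLite for illustration. The table layout and everything in the file name other than the "t"/"h" prefix are assumptions, not the ODMH Data Specifications or naming guidelines.

```python
import sqlite3
from datetime import date

# Assumed local table; the real columns come from the ODMH Data Specifications.
conn = sqlite3.connect("outcomes_local.db")
conn.execute("""CREATE TABLE IF NOT EXISTS outcomes (
    consumer_id TEXT, instrument TEXT, admin_date TEXT, responses TEXT)""")
conn.commit()

def export_filename(provider_id: str, production: bool) -> str:
    """Build a file name: test files begin with 't', production files with 'h'
    (prefix per the slide above; the rest of the pattern is a placeholder)."""
    prefix = "h" if production else "t"
    return f"{prefix}{provider_id}_{date.today():%Y%m%d}.txt"

print(export_filename("P001", production=False))  # e.g. a name beginning with 't'
```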
92. Data Flow Testing
- Required for all participating providers
- Helps to ensure data quality
- Must be approved for production before submitting data to the statewide database
- Must submit a test file for each instrument being used
- Minimum of 10 records per test file required
- Test files should contain realistic Outcomes
data
93. Data Flow Testing (cont'd)
- Provider creates test file, sends it to board
- Method and frequency of transfer is a local decision
- Board conducts minimal tests on the file (see the sketch below)
- Checks that the file is named correctly and adheres to ODMH data specifications
- Confirms that the file is not a duplicate
- Uses an ASCII editor to be sure the file is readable and has the correct end-of-line marker
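A minimal sketch of the board-level checks listed above. The file-name pattern and the expected end-of-line marker are assumptions for illustration; the authoritative rules are the ODMH data specifications.

```python
import os
import re

def board_checks(path: str, already_received: set[str]) -> list[str]:
    """Run the minimal board-level checks on a provider file; return any problems."""
    problems = []
    name = os.path.basename(path)
    # Assumed pattern: test files start with 't', production files with 'h'.
    if not re.match(r"^[th]\w+\.txt$", name):
        problems.append("file name does not follow the expected pattern")
    if name in already_received:
        problems.append("file appears to be a duplicate")
    with open(path, "rb") as f:
        data = f.read()
    try:
        data.decode("ascii")
    except UnicodeDecodeError:
        problems.append("file is not readable as plain ASCII")
    if data and not data.endswith(b"\r\n"):  # assumed end-of-line marker
        problems.append("file does not end with the expected end-of-line marker")
    return problems

# Example: board_checks("tP001_20040304.txt", already_received=set())
```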
94. Data Flow Testing (cont'd)
- Board submits test file to ODMH
- Must submit via FTP to the designated test directory on the ODMH server (see the sketch below)
- Can submit the file at any time, any day
- Board submits the Data Flow Test Request Form to the ODMH Outcomes Support Team
- Must submit via fax or e-mail
- File won't be tested without this form
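A minimal sketch of the FTP submission step using Python's standard ftplib. The host name, credentials, and directory name are placeholders; the real connection details are provided by ODMH and are not part of this presentation.

```python
from ftplib import FTP

HOST, USER, PASSWORD = "ftp.example.oh.gov", "board_user", "secret"  # placeholders
TEST_DIR = "test"  # placeholder for the designated test directory

def submit_test_file(path: str) -> None:
    """Upload one test file to the designated test directory on the ODMH server."""
    with FTP(HOST) as ftp:
        ftp.login(USER, PASSWORD)
        ftp.cwd(TEST_DIR)
        with open(path, "rb") as f:
            ftp.storbinary("STOR " + path.rsplit("/", 1)[-1], f)

# submit_test_file("tP001_20040304.txt")
# Remember: the Data Flow Test Request Form still goes separately by fax or e-mail.
```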
95. Data Flow Testing (cont'd)
- Outcomes Support Team tests file
- Critical errors in test file
- Filename errors, unknown provider
- Critical errors in test record
- Invalid UCI or UPID, invalid dates
- Information/verify errors in test record
- Provided for QI purposes
96. Data Flow Testing (cont'd)
- Outcomes Support Team notifies board of test results
- Board is notified via e-mail within 7 days of the test request
- General errors in test file
- Approved/failed status of test file
- Test results report is placed in the board's designated reports directory on the ODMH server
- Frequency tables, descriptive statistics
- Critical and information/verify errors
97. Data Flow Testing (cont'd)
- Board is responsible for sharing test results with the provider
- Board and provider should work together to resolve data flow issues at the local level
- Contact the Outcomes Support Team for help!
98. Production Data Flow
- Records are added to the statewide Outcomes database
- Provider creates file, sends it to board
- Must be approved for production before submitting data to the statewide database
- Method and frequency of transfer is a local decision
- Board conducts minimal tests on the file
99. Production Data Flow (cont'd)
- Board submits production file to ODMH
- Must submit via FTP to the designated input directory on the ODMH server
- Can submit the file at any time, any day
- Boards do not need to submit a Data Flow Test
Request Form for production files
100. Production Data Flow (cont'd)
- Outcomes production staff processes files
- Processing occurs every Monday (or the next business day in the case of a state holiday)
- Critical errors in production file
- Filename errors, unknown provider
- Critical errors in production record
- Invalid UCI or UPID, invalid dates
- Information/verify errors in production record
- Provided for QI purposes
101. Production Data Flow (cont'd)
- Outcomes production staff notifies board when production reports are available
- Board is notified via e-mail
- Production report is placed in the board's designated reports directory on the ODMH server
- Frequency tables, descriptive statistics
- Critical and information/verify errors
102. Current Status: Outcomes Data Flow Testing
- 1,393 test records processed
- Test data received from
- 43 boards
- 208 agencies
- 94% of the agencies expected to test (based on Incentive Grants) have done so
103. Current Status: Outcomes Production Data
- 231,600 records in statewide production database
- 3,773 records added on 12/8
- 86,070 unique consumers in database
- Production data received from
- 27 boards
- 146 agencies
104. Current Data Flow Reports
- Test Status Report
- Board Production Status Report
- Board Production Duplicates Report
- Provider Production Status Report
- Missing Data Report
105. We Have Data. Now What?
106. Using Statewide Aggregate Data
- Comparative reports
- Statewide Outcomes Data Reports Workgroup
- Produced by ODMH each quarter
- Alternate between a "state of the state" report and a special topic report
- Provide constituents in the mental health system with statewide data that they can use to compare an individual's scores or average agency or board area scores
107. Using Statewide Aggregate Data
- Outcomes Data Mart
- A statewide committee has been convened to advise ODMH on the format and content of a Data Mart based on the statewide Consumer Outcomes database
- Currently in the general design phase
108. Need More Information about Outcomes Data Flow?
- Outcomes Initiative Web Site
- http://www.mh.state.oh.us/initiatives/outcomes/outcomes.html
- Outcomes E-mail List
- Outcomes Support Team
- outcome@mh.state.oh.us
- (614) 644-7840
- Other providers and boards
109. The Ohio Mental Health Consumer Outcomes Initiative
- Mastering the Outcomes Data Flow Process
- Stacy Keenan, M.S.
- Geoff Grove, M.A.
- ODMH Outcomes Support Team