Developing and Implementing Multilevel Program Evaluation Plans for SAT-ED Grants
1
Developing and Implementing Multilevel Program
Evaluation Plans for SAT-ED Grants
March 11, 2013
Michael L. Dennis, Chestnut Health Systems, Normal, IL
Available from www.gaincc.org/presentations
  • Created for the Substance Abuse and Mental Health
    Services Administration's (SAMHSA) Center for
    Substance Abuse Treatment (CSAT) under contract
    number HHSS283200700003I, Task Order
    HHSS28300002T

2
Goals for the Presentation
  1. Summarize the key problems in our field that
    SAT-ED is attempting to address
  2. Review objectives, key questions, and sources of
    data to be addressed in the evaluation
  3. Identify key steps in designing, implementing,
    and using evaluation to help manage and improve
    programs
  4. Discuss strategies for reliable, valid, and
    efficient collection and analysis of state
    (including commonwealth), site, and client-level
    data
  5. Provide links to further resources and training

3
Objectives of SAT-ED
  • To improve treatment for adolescents through:
  • Development of a Learning Laboratory with
    collaborating local community-based treatment
    provider sites
  • Improvements in State-level infrastructure
    through workforce development, financial
    planning, licensure, and certification
  • Improvement of site-level infrastructure through
    implementation of evidence-based practice (EBP)
    related to assessment and treatment
  • Assessment, treatment, and monitoring of change
    at the client level

4
Typical Components of a Multilevel Evaluation
Plan
  1. Needs assessment
  2. Description of program activities, Theory of
    Change, and/or Logic Model
  3. Approach to stakeholders
  4. Evaluation questions, data sources, and
    methodology
  5. Performance monitoring and reporting

TIP: Labels and the order of components can vary
to fit your situation; the point here is really
to make sure that you have them covered or that
your team makes an informed decision not to
address them.
5
1. Needs Assessment
  • Description of infrastructure and site level
    needs and what information is still needed
  • Local system and/or cultural considerations
  • Articulate the rationale for the selection of
    the targeted:
  • Infrastructure activities
  • Site selection
  • Evidence-based assessment selection
  • Evidence-based treatment selection

6
Structural Challenges to Delivery of Quality Care
in Behavioral Health Systems
  1. High-turnover workforce with variable education
    background related to diagnosis, placement,
    treatment planning, and referral to other
    services
  2. Heterogeneous needs and severity characterized by
    multiple problems, chronic relapse, and multiple
    episodes of care over several years
  3. Lack of access to or use of data at the program
    level to guide immediate clinical decisions,
    billing, and program planning
  4. Missing, bad, or misrepresented data that need to
    be minimized and incorporated into
    interpretations
  5. Lack of infrastructure that is needed to support
    implementation and fidelity of evidence-based
    practices that have been shown to work better on
    average

7
Substance Use Disorder and Treatment by Age
Higher rates of unmet need for adolescents and
young adults
Higher rates of need for young adults
Source: SAMHSA 2009 National Survey on Drug Use
and Health
8
Substance Use Disorder and Treatment by Age
Completion rates are lower for adolescents and
young adults
Lengths of stay are shorter for young adults and
adults
Source: SAMHSA 2009 Treatment Episode Data Set
Discharges (TEDS-D)
9
No Self-Help Group Participation in the First 3
Months of Treatment
Age
Higher adolescents and young adults
p < .05
Source: SAMHSA 2011 GAIN SA Data Set subset to those
with 3-month follow-up (n = 21,228)
10
Unmet Need for Mental Health Treatment by 3
Months
Age
Higher for adolescents and young adults
p < .05
Source: SAMHSA 2011 GAIN SA Data Set subset to those
with 3-month follow-up (n = 14,358)
11
Unmet Need for Medical Treatment by 3 Months
Age
Higher for Young Adults
p < .05
Source: SAMHSA 2011 GAIN SA Data Set subset to those
with 3-month follow-up (n = 8,517)
12
2. Description of Program Activities, Theory of
Change, and/or Logic Model
  • Describe infrastructure and site-level activities
    to be conducted and any specific programs or
    evidence-based practices you plan to use
  • Theory or logic model for each need, how it
    will be addressed by the activity, and the
    expected outcome
  • Discuss relationship between various needs,
    activities, or components, including how State-
    and site-level activities support each other

13
Expected State-Level Infrastructure Activity
  1. Interagency workgroup to improve the statewide
    infrastructure for adolescent substance abuse
    treatment and recovery
  2. Memoranda of understanding between SAT-ED awardee
    agency and other child-serving agencies
  3. Multiyear workforce training plan for specialty
    adolescent behavioral health (substance use
    disorder/co-occurring substance use and mental
    disorder) treatment/recovery sector and other
    child-serving agencies
  4. Comprehensive and integrated continuum of care
    for adolescents with substance use and mental
    health disorders in terms of both funding and
    services

14
Expected State-Level Infrastructure Activity
(continued)
  5. Financial mapping to understand current funding
    and coverage
  6. Coordination of funding to make the system more
    efficient, expand coverage, and shift towards
    more effective practices
  7. Facilitation of a learning laboratory to use the
    above to identify target areas of need, attempt
    change, evaluate the change, and if necessary
    adjust strategies to improve the quality of care

TIP: These can relate to and/or build on activities
already under way. You just want to be sure you
will be prepared to address each area in your
annual and final progress reports.
15
Other Allowable State-Level Infrastructure
Activity
  1. Workforce mapping to understand qualifications of
    staff across the continuum of care and the
adequacy of the initial training/continuing
    education infrastructure already in place
  2. College, university, and continuing education
    staff and programs/faculty infrastructure
    improvements/expansions and number of
    new/existing staff trained
  3. Other statewide events to provide continuing or
    community education/training
  4. Reviewing/revising PROGRAM standards for
    licensure, certification, and/or accreditation of
    programs that provide substance use and
    co-occurring mental disorders services for
    adolescents and their families
  5. Reviewing/revising CLINICIAN standards for
    licensure, certification, and/or credentialing of
    clinicians who provide substance use and
    co-occurring mental disorders services for
    adolescents and their families

16
Other Allowable State-Level Activity (continued)
  6. Family/youth support organization creation,
    expansion, continuation, or enhancement
  7. People newly credentialed/certified to provide
    substance use and co-occurring substance use and
    mental health disorder services
  8. Policy changes made as a result of the
    cooperative agreement
  9. Financing policy changes completed as a result of
    the cooperative agreement

TIP: Choose what makes sense for your needs and
proposed activities. Invest more in measuring
those areas where you are focusing your resources
and attention. There is less interest in the
average than in identifying and understanding one
or more areas where grantees have done something
they found useful.
17
Expected Site-Level Infrastructure Activity
  1. Collaborating sites you have contracted with to
    provide evidence-based practices (EBPs)
  2. EBPs related to (a) assessment and (b) treatment
    for which you have contracted to obtain
    training and technical support to implement
  3. EBP training type, date, and number of staff
    attending each
  4. EBP-proficient staff capacity with regard to the
    number of employed staff who are certified by
    level and type of EBP
  5. EBP local trainer or supervisory capacity with
    regard to the number of employed staff who are
    certified to train and supervise new staff, by
    level and type of EBP

18
Other Optional Site-Level Infrastructure Activity
  1. Implementation of EBP related to assessment in
    terms of the number completed, linkage to medical
    records, use of clinical decision support, use
    for program planning (aka meaningful use)
  2. Expansion of coverage based on number and
    percentage of assessed youth receiving any
    services billed to insurance (Medicaid, CHIP,
    other Federal/State, other private) instead of
    the block grant
  3. Implementation of EBP related to treatment in
    terms of the number of clients receiving it and
    receiving target dosage

TIP: Be sure to think about how to describe,
measure, and demonstrate a relationship between
State- and site-level activities related to the
chosen EBP. Collaborate with other States using
the same EBP.
19
Comparison of Site EBP for Assessment
Evidence-Based Practice IA IL IN KY LA MA ME MT NY OK PR SC WA
Comprehensive Adolescent Severity Index (CASI) X X ?
Global Appraisal of Individual Needs (GAIN) X X X X X X X X ? X X
Government Performance and Result Act (GPRA) X X X X X X X X X X X X X
TIP: Most States/sites have other electronic or
hardcopy records and have mentioned additional
measures in their proposal or preliminary
evaluation plans. Most sites are still in the
process of deciding whether to conduct followup
with EBP or other measures beyond GPRA.
20
Comparison of Site EBP for Treatment
Evidence-Based Practice IA IL IN KY LA MA ME MT NY OK PR SC WA
Adolescent Community Reinforcement Approach (A-CRA) X X X X X X ? X X
Intensive Community Treatment (ICT) X
Multidimensional Family Therapy (MDFT) X X
Multisystemic Therapy (MST) ?
Seven Challenges (7C) X
TIP: Several States have talked about comparing
to other EBP within their State, comparing to the
same EBP in other sites, and/or expanding EBP to
other sites.
21
3. Approach to Stakeholders
  • Identification of State-, site-, community-, and
    individual- (youth, family) level stakeholders
  • Coordination with or creation of strategic
    planning groups or interagency councils
  • Coordination with electronic medical and billing
    records
  • Involvement of program directors, information
    technology staff, clinical directors,
    supervisors, line staff
  • Coordination with or creation of community,
    family, and/or youth advisory groups or
    partnerships

22
Questions for Stakeholders
  • Key needs or problems with the current system
    that might be addressed
  • Critical time lines, measures, and products that
    would make it more useful to them
  • Recognizing how they define and measure things
    and where multiple definitions or measures may be
    needed across stakeholders
  • What will it take for them to support
    sustainability beyond the grant?

23
Identifying and Addressing Key Subgroups That May
Have Concerns or Barriers to Accessing Services
  • Demographic groups (e.g., by gender, race,
    ethnicity, age, sexual orientation)
  • Abilities (e.g., hearing, sight, mobility, IQ)
  • Clinical subgroups such as
  • Primary substance
  • Co-occurring mental health/trauma/suicide
  • Crime/violence or justice involvement
  • Degree of family support and use
  • Insurance, transportation, or economic barriers

TIP: Health disparities need to be treated
similarly to safety issues, where best practice is
to diligently look for them and work toward
reducing them wherever possible to improve
effectiveness and reduce liability.
24
4. Evaluation Questions, Data Sources, and
Methodology
  • Operationalizing the program objectives/
    questions into activities, measures of
    implementation/outputs, and outcomes, including
    the frequency of collection and data sources
  • Working backwards to make sure the above
    crosswalk maps onto actual contracts, memos of
    understanding, and/or expectations of all
    stakeholders (many of which are developed at
    different points in the proposal and startup
    process)

25
State-/Site-Level Infrastructure
  • Often a matter of documenting what has been done,
    including dates, types of events, number of
    staff, and degree of completion/certification
  • Dual Diagnosis Capability in Addiction Treatment
    (DDCAT) and Dual Diagnosis Capability Youth
    Treatment Tool (DDCYT) measures of availability
    and quality of co-occurring services
  • Identifying how things differ from what was
    expected, including
  • Unexpected problems and how they were addressed
  • Unexpected opportunities and how they were seized
  • Things that still need to be or might be done

26
Common Client-Level Questions
  • What are the characteristics and needs of those
    who were served?
  • What services did they receive?
  • To what extent are services targeted at the most
    appropriate or severe clients?
  • To what extent are services effective?
  • Are the services cost-effective?

TIP: Not every evaluation will address each of
these questions or each question equally well.
The point here is to think about how and how well
you will be able to answer each.
27
Characteristics and Needs of Those Served
Measure GPRA GAIN CASI
a. Demographics, veterans, housing, justice, and vocational status X X
b. Sexual orientation X
c. Current substance use, mental health, health, and HIV risk behavior X X X
d. Withdrawal, substance use disorder history and diagnosis X X
e. Internalizing and externalizing psychiatric history and diagnosis X X
f. Physical health history, disabilities, infectious disease X X
g. History of HIV risk behaviors and victimization X X
h. Strengths, family, and environment X X
i. Current arrest, school, employment X X X
j. Incarceration, arrest, and illegal activity history X X
k. Cost to society of health care utilization and crime X
l. Treatment planning and level of care placement X X
Note: No veteran status; only one question on trauma; no strengths.
28
What Services Did They Receive?
Measure GPRA GAIN CASI Records
a. Initiating treatment within 2 weeks of diagnosis X X
b. Engagement in treatment for at least 6 weeks X X X
c. Continuing care more than 90 days past intake X X X
d. Level of care and type of evidenced-based practice X X X
e. Range of services received X X X
f. Early working alliance or satisfaction X
g. Satisfaction with services received X
h. Urine test results X
i. Health disparities on need and targeted services X X X X
Notes: Only if followup version is used; only if accessible.
TIP: Without GAIN/CASI followup, you will be very
dependent on the quality of and access to
records. With them, you need to cover the first
3 months to describe most of treatment.
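The service-receipt items above reduce to three date-based retention flags: initiation within 2 weeks of diagnosis, engagement for at least 6 weeks, and continuing care more than 90 days past intake. A minimal sketch of computing them from client records; the field names and dates here are hypothetical, not from any grantee's system:

```python
from datetime import date

# Hypothetical client records; field names and dates are illustrative only.
clients = [
    {"diagnosed": date(2013, 1, 2), "intake": date(2013, 1, 10), "discharge": date(2013, 5, 1)},
    {"diagnosed": date(2013, 1, 5), "intake": date(2013, 2, 1), "discharge": date(2013, 2, 20)},
]

def retention_flags(c):
    """Return the three administrative retention indicators for one client."""
    days_to_intake = (c["intake"] - c["diagnosed"]).days
    days_in_care = (c["discharge"] - c["intake"]).days
    return {
        "initiated_2wk": days_to_intake <= 14,  # initiated within 2 weeks of diagnosis
        "engaged_6wk": days_in_care >= 42,      # engaged for at least 6 weeks
        "continuing_90d": days_in_care > 90,    # continuing care past 90 days
    }

flags = [retention_flags(c) for c in clients]
# Program-level rate for each indicator
rates = {k: sum(f[k] for f in flags) / len(flags) for k in flags[0]}
```

In practice the same flags would be derived from whichever source (GPRA/GAIN dates, billing, or clinical records) proves most complete and accessible.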
29
To what extent are services targeted at the most
appropriate or severe clients?
  1. Implementation of reliable, valid, and efficient
    measures of need and severity
  2. Consensus standards on definition of need, link
    to services, and/or evidence-based practices
    associated with better outcomes on average
  3. Implementation of clinical decision support and
    meaningful use to drive actual treatment planning
    and services
  4. Evaluation of treatment need profiles, gaps, and
    health disparities at the program level and
    monitoring of change over time

30
To what extent are services effective?
  1. Improvements in administrative outcomes (e.g.,
    initiation, engagement, continuing care,
    evidence-based practices) associated with better
    outcomes on average
  2. Participation in self-help and recovery support
    services
  3. Among those in need, receipt of services related
    to co-occurring mental health and physical health
    problems
  4. Pre-post change in percent of past month
    abstinence, no substance-related problems, no
    justice involvement, being housed, vocational
    engagement, and social connectedness
  5. Comparison of the same program over time, across
    sites, to other programs, national norms, or
    standards (ideally matched programs or clients)

TIP: These have to come from records or
supplemental data such as followup data.
31
Are the services cost-effective?
  1. Estimate costs of average services and
    evidence-based practices using accounting data
  2. Compare costs to statewide, Federal, or published
    normative costs overall or adjusting for improved
    retention
  3. Put costs in context relative to baseline
    costs to society of health care utilization or
    crime and the extent to which the program is
    targeting a high-cost subgroup
  4. Pre-post change in the cost to society of health
    care utilization or crime

TIP: These have to come from records, followup,
or other supplemental data.
32
5. Performance Monitoring and Reporting
  • Early indicators of implementation, fidelity, and
    steps of the theory of change or logic model
  • Important for infrastructure measures to include
    necessary steps (e.g., selection, contracting,
    events, people, evaluations)
  • Client-level measures related to
  • Recruitment and data collection rate/target,
    being on time
  • Case mix of those served
  • Treatment initiation, engagement, continuing
    care, satisfaction
  • Fidelity of EBP
  • Services targeted at needs

33
Implementation Is Essential (Reduction in
Recidivism from .50 Control Group Rate)
Thus, one should optimally pick the strongest
intervention that one can implement well
Source: Adapted from Lipsey, 1997, 2005 meta-analysis
of 509 juvenile justice programs
34
What gets measured, gets done.
What gets fed back, gets done better.
What gets incentivized, gets done more often.
Average practice based on TEDS
Based on a count of initiation within 14 days,
evidence-based practice, engagement for at least
6 weeks, and any continuing care.
Source: CSAT 2011 AT SA Data Set subset to those
with at least 1 followup (n = 17,202)
35
Selected NOMS Outcomes Over Time
Most effects are in the first 90 days; it is
important to measure outcomes and services
received by then
Variation in outcomes
Interpolated past month
Source: CSAT 2011 AT SA Data Set subset to those
with at least 1 followup
36
NOMS Outcome Status at Last Wave
Measure favors people who come in the door
without problems
This variable measures the last 30 days. All
others measure the past 90 days. The blue bar
represents an increase of 50% or no problem.
Source: CSAT 2011 AT SA Data Set subset to those
with at least 1 followup
37
NOMS Outcomes Count of Positive Outcomes
(Status at Last Followup minus Status at Intake)
78% have one or more improved areas
Source: CSAT 2011 AT SA Data Set subset to those
with at least 1 followup (n = 17,722)
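The count works per client: an outcome area counts as improved when its status goes from problem at intake to positive at the last followup. A toy sketch; the area names and status values below are illustrative, not the actual NOMS coding:

```python
# 1 = positive status, 0 = problem, for five NOMS-style outcome areas
# (hypothetical coding for one client)
areas = ["abstinent", "no_problems", "no_justice", "housed", "vocational"]
intake = {"abstinent": 0, "no_problems": 0, "no_justice": 1, "housed": 1, "vocational": 0}
followup = {"abstinent": 1, "no_problems": 0, "no_justice": 1, "housed": 1, "vocational": 1}

# Count areas that moved from 0 at intake to 1 at the last followup
improved = sum(followup[a] - intake[a] == 1 for a in areas)
```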
38
Health Care Utilization Cost
11% of youth consume 76% of health care costs
Source: CSAT 2011 AT Summary Analytic Data Set
(n = 19,148)
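A concentration figure like this can be checked against any per-client cost file by ranking clients by cost and summing the top share. A sketch with made-up costs (not the deck's data):

```python
# Hypothetical per-client annual health care costs, in dollars
costs = [90000, 45000, 8000, 3000, 2500, 1200, 900, 500, 400, 300]

def share_of_cost(costs, top_fraction):
    """Fraction of total cost incurred by the top `top_fraction` of clients."""
    ranked = sorted(costs, reverse=True)
    k = max(1, round(top_fraction * len(ranked)))
    return sum(ranked[:k]) / sum(ranked)

# Cost share of the top 10% of clients (1 of the 10 in this sample)
top10_share = share_of_cost(costs, 0.10)
```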
39
Cost of Crime
21% of youth account for 97% of the costs of crime
Source: CSAT 2011 AT Summary Analytic Data Set
(n = 17,878)
40
Reduction in Health Care Utilization Offsets the
Cost of SUD Treatment Within 12 Months
Adolescent Level of Care      Year Before Intake  Year After Intake (a)  One-Year Savings (b)
Outpatient                    $10,993             $10,433                $560
Intensive Outpatient          $20,745             $15,064                $5,682
Outpatient Continuing Care    $34,323             $17,000                $17,323
Long-Term Residential         $27,489             $26,656                $833
Short-Term Residential        $25,255             $21,900                $3,355
Total                         $15,633             $13,642                $1,992
a. Includes the cost of treatment. b. Year before
intake minus year after intake (including treatment)
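The savings column is simple subtraction: the cost in the year before intake minus the cost in the year after (the after-year figure already includes the cost of treatment). A quick check on four of the rows above:

```python
# (year before intake, year after intake) costs from the table, in dollars
before_after = {
    "Outpatient": (10_993, 10_433),
    "Outpatient Continuing Care": (34_323, 17_000),
    "Long-Term Residential": (27_489, 26_656),
    "Short-Term Residential": (25_255, 21_900),
}

# One-year savings = year-before cost minus year-after cost
savings = {loc: before - after for loc, (before, after) in before_after.items()}
```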
41
EBPs like A-CRA Cost More but Produce Greater
Savings Too
a. Includes the cost of treatment. b. Year before
intake minus year after intake (including treatment)
42
Impact of Reclaiming Futures Infrastructure
Enhancements to Juvenile Treatment Drug Court on
Cost of Crime to Society
a. RF-JTDC is significantly lower at follow-up
than JTDC. Source: Dennis et al., 2012
43
Other Evaluation Training Resources
  • ACYF's The Program Manager's Guide to Evaluation
    http://www.acf.hhs.gov/programs/opre/research/project/the-program-managers-guide-to-evaluation
  • American Evaluation Association
    http://www.eval.org/
  • BJA's Program Evaluation Manual
    https://www.bja.gov/evaluation/guide/bja-guide-program-evaluation.pdf
  • CDC's resource page on program evaluation and
    logic model development
    http://www.cdc.gov/eval/resources/index.htm
  • CSAP Pathways Course Evaluation 101
    http://pathwayscourses.samhsa.gov/eval102/eval102_1_pg2.htm
  • Evaluators' Institute
    http://tei.gwu.edu/
  • GAO's Designing Evaluations
    http://www.gao.gov/products/GAO-12-208G
  • GAIN Program Management and Evaluation Training (PMET)
    http://www.gaincc.org/products-services/training/gain-program-management-and-evaluation-training/
  • NIAAA's State-of-the-art methodologies in
    alcohol-related health services research
    http://onlinelibrary.wiley.com/doi/10.1111/add.2000.95.issue-11s3/issuetoc
  • NIDA's Blue Ribbon Task Force on Health Services
    Research
    www.drugabuse.gov/sites/default/files/files/HSRReport.pdf
  • NSF's User Friendly Handbook
    http://www.nsf.gov/pubs/2002/nsf02057/start.htm
  • SAMHSA Center for Behavioral Health Statistics
    and Quality (CBHSQ) national data sets with
    information on need
    http://www.samhsa.gov/data/
  • SAMHSA NREPP's Non-Researcher's Guide to
    Evidence-Based Program Evaluation
    http://nrepp.samhsa.gov/Courses/ProgramEvaluation/NREPP_0401_0010.html
  • SAMHSA TIP 14: State Outcomes-Monitoring Systems
    for Alcohol and Other Drug Abuse Treatment
    http://store.samhsa.gov/product/TIP-14-State-Outcomes-Monitoring-Systems-for-Alcohol-and-Other-Drug-Abuse-Treatment/BKD162