Taking Your First Steps into Program Evaluation
Presented by Barri B. Burrus, Ph.D.
Presented to: Office of Adolescent Pregnancy Programs, 2006 Annual Care Grantee Conference
3040 Cornwallis Road, P.O. Box 12194
Research Triangle Park, NC 27709
Phone: (941) 486-0245
E-mail: barri_at_rti.org
Fax: (941) 480-0244
RTI International is a trade name of Research Triangle Institute.
Session Overview
- Perspectives assessment
- Introductions
- Building a strong and effective partnership between program director and evaluator
- Developing useful logic models
- Ensuring a rigorous evaluation design
Evaluators from Venus/Program Directors from Mars?
- What are common barriers that keep program directors and evaluators from working together effectively?
- At what point do the program staff and evaluators typically come together at the same table?
- How is sufficient detail communicated to the evaluators so that this information can be incorporated into the evaluation plan?
- What are the possible miscommunications or pitfalls between program and evaluation staff when planning an evaluation?
Harnessing Program Evaluation
The Inherent Conflict Between Program and Evaluation Staff [1]
- Goals and purposes of programs
- Provide direct services to clients
- Address educational needs of clients
- Achieve designated outcomes
- Help the agency and organization expand through increased funding
- Ensure sustainability
- Inform the field

[1] Lieberman, L., Healthy Concepts, Inc., 2003.
Goals and Purposes of Evaluations [1]
- To determine who is receiving what
- To measure the achievement of identified outcomes
- To improve a program's ability to meet the needs of its clients
- To demonstrate the effectiveness/value of programs to funders
- To inform the field about effective programs
Places Where Those Goals Intersect [1]
- Providing/documenting services
- Demonstrating achievement of specific outcomes
- Demonstrating value to funders
- Demonstrating lessons learned to the field
Places Where Those Goals May Conflict [1]
- Where data collection interferes (or is perceived to interfere) with program activities
- When data do not demonstrate positive outcomes
- When some outcomes have not been measured
- When program staff and evaluators do not understand, value, or utilize each other's roles, expertise, or experiences
Result: Barriers
- Different language
- Different objectives
- Different priorities
- Different agendas
Solutions
- What are some solutions or strategies for building effective relationships between program and evaluation staff?
Possible Solutions for Developing a Strong Program/Evaluation Partnership
- Maintain clear communication and detailed planning about the program in the development stages
- Program theory
- Intended program outcomes
- Evaluation questions
- Logic model
- Intended uses for the information
- Establish clear (and high) expectations for the evaluation
- Use of a comparison group with random assignment
- Participation rates
- Quality and timelines for products and deliverables
Additional Solutions for Developing a Strong Program/Evaluation Partnership (cont.)
- Work to maintain ongoing collaboration
- Regular meetings and updates
- Communication and plan updates (with corrections as needed)
- Stay flexible
- Evaluator may need to make modifications in response to program changes
- Program may find evaluation inconvenient (e.g., needing to wait for baseline data to be collected)
Additional Solutions for Developing a Strong Program/Evaluation Partnership (cont.)
- Keep stakeholders informed about
- Barriers encountered
- Successes achieved
- Maintain focus on the contributions of evaluation
- To the program
- To the field
Celebrate the Differences as a Means to a Common Goal!
- There is a natural conflict of interest between program staff and evaluators
- Program staff: passionate about serving participants
- Evaluators: passionate about generating objective data to show whether the program works, if not, why not, and what could be changed for the future
- Both are ultimately working toward the same goal: to provide the best services possible for pregnant and parenting adolescents
Developing Useful Logic Models
Logic Models
- A systematic process for thinking through detail and sharing information about the planned program, the intended results, and the relationship between the two
- A program and evaluation roadmap
- The logic model must match your research questions
- Should guide instrument development and all constructs to be tested
Logic Models: Common Barriers
- Seem like busy work without value
- A severe case of "logic model phobia"
- The challenge of sitting down to plan detail when staff really want to be doing the program
Logic Models: Advantages
- Gives you a blueprint or road map
- A check on program course: are you on track?
- Encourages reassessment and tracking changes (NOT a static model)
- A communication tool between the program and others (including evaluators) that establishes what to evaluate
Typical Logic Model Components
- Resources/Inputs
- Activities: what the program does with its resources; the actions that are expected to create the outcomes
- Outputs: products of the program activities; may include the size and scope of services and products delivered. Outputs help determine whether the program was delivered to the intended audience at the intended dose (informs the process evaluation)
- Outcomes: changes resulting from activities, usually at the individual level
- Impacts: organizational, national, or systems-level changes
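The components above form an ordered chain, and it can help to record a logic model in a structured form that a program and evaluator can both edit. The Python sketch below is one illustrative way to do this for a single program strand; the strand (family planning counseling) mirrors the example slide later in this deck, but the specific field values are assumptions for illustration only.

```python
# A minimal sketch of one logic model strand as a plain data structure.
# Field values are illustrative assumptions, not figures from the program.
logic_model_strand = {
    "resources_inputs": ["counselors", "clinic partnership"],
    "activities": ["family planning counseling sessions"],
    "outputs": ["participation info", "content info"],       # inform process evaluation
    "outcomes": ["increased adolescent contraceptive use"],  # individual-level change
    "impacts": ["reduced repeat pregnancy rates in the community"],
}

# Dictionary order preserves the presumed causal chain from resources to impacts.
components = list(logic_model_strand)
# components == ['resources_inputs', 'activities', 'outputs', 'outcomes', 'impacts']
```

Keeping the model in a shared, editable form like this supports the "NOT a static model" point above: revisions can be tracked over time as the program changes.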
Detail Needed for Logic Models
- Need a "filing system" to take maximum advantage of logic models
- Within each component of the logic model, organize by boxes
- The organizational scheme is driven by presumed causal linkages
Additional Recommendations to Improve Logic Models for Planning and Evaluation Purposes
- Break out to show detail (may need one overall model and then blow-up versions with finer detail)
- Add program assumptions/theory
- Include key evaluation variables/questions
- Add possible mediating and moderating variables
Logic Model Example
[Diagram: a sample logic model flowing from Activities to Outputs, Mediating Effects, Outcomes, and Goals. The Activities and Outputs columns inform the process evaluation; the Mediating Effects and Outcomes columns inform the outcome evaluation.]
- Academic case management sessions with teachers (outputs: session integrity; teacher characteristics) → improved teacher interactions with the adolescent and positive messages about adolescents' capabilities → improved adolescent self-efficacy to succeed academically → longer adolescent stay in school
- Family planning counseling (outputs: participation info; content info) → improved adolescent behavioral capability to use contraception and negotiate with a partner → increased adolescent contraceptive use → reduced adolescent repeat pregnancy
- Grandparent support group (outputs: participation info; meeting info; leadership; grandparent characteristics) → increased grandparent knowledge about immunization benefits and increased skills for avoiding conflict with the adolescent → improved adolescent outcome expectations about immunizations → increased immunizations
- Moderating effects: demographic characteristics, family dysfunction, adolescent age at first pregnancy
Putting Logic Models to Work
- Don't be intimidated: there's no one right way
- Ensure program activities and theory are linked!
- Use outputs to guide the process evaluation and assess dose
- Make certain the key outcomes for evaluation are clear and linked to program activities
Ensuring a Rigorous Evaluation Design
Basic Expectations of Sound Evaluation
- End-of-year evaluation report template: a detailed guide
- Clear, detailed understanding of the program
- Theoretical basis for behavior change, captured in the logic model and instrumentation
- Program objectives
- Specific
- Measurable
- Achievable (based on literature or existing data)
- Realistic (based on literature or existing data)
- Time-framed (appropriate to the literature, existing data, and the data collection schedule)
- Clearly articulated evaluation research questions
- Process evaluation (to facilitate replication, understanding of outcome evaluation findings, and program improvement)
Evaluation Design
- Appropriate to answer the evaluation research questions
- Begin with the most rigorous design possible
- A randomized experimental design is the gold standard for answering research questions about program effectiveness
- Units for study (such as individuals, schools, clinics, or geographical areas) are randomly allocated to groups exposed to different treatment conditions
Benefits of Randomized Experimental Design
- Able to infer causality
- Assures the direction of causality between treatment and outcome
- Removes any systematic correlation between treatment status and both observed and unobserved participant characteristics
- Permits measurement of the effects of conditions that have not previously been observed
- Offers advantages in making results convincing and understandable to policymakers
- Policymakers can concentrate on the implications of the results for changing public policy
- The small number of qualifications to experimental findings can be explained in lay terms
- (Bauman, Viadro, & Tsui, 1994; Burtless, 1995)
Strategies for Implementing Randomized Experimental Design
- Read the methods sections from evaluations using randomized experimental design
- Ask for evaluation technical assistance to implement this design
- Recruit all interested adolescents
- Ask parents/adolescents for permission to randomly assign to one of two conditions
- Divide program components into two conditions
- Overlay one component on top of the others
- Focus outcome evaluation efforts on the randomly assigned adolescents
- Include all adolescents in the process evaluation
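The random-assignment step itself is mechanically simple. The Python sketch below is one minimal, illustrative way to do it (the seed and participant IDs are assumptions for the example; in practice only consented adolescents would enter the pool):

```python
import random

def assign_conditions(participant_ids, seed=0):
    """Shuffle consented participants and split them into two
    treatment conditions of (nearly) equal size."""
    rng = random.Random(seed)      # a fixed seed makes the assignment reproducible/auditable
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"condition_a": ids[:half], "condition_b": ids[half:]}

# Hypothetical pool of 100 consented adolescents, IDs 1-100
groups = assign_conditions(range(1, 101))
# 50 adolescents per condition; every recruit appears in exactly one group
```

Recording the seed and the resulting assignment list gives the evaluator a verifiable audit trail for the randomization.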
Obtaining and Maintaining a Comparison Group
- Emphasize the value of the research
- Explain exactly what the responsibilities of the comparison group will be
- Minimize the burden on the comparison group
- Ask for commitment in writing
- Provide incentives for data collection
- Provide non-related services/materials
Obtaining and Maintaining a Comparison Group (cont.)
- Meet frequently with people from participating community organizations and schools
- Provide school-level data to each participating school (after data are cleaned and de-identified)
- Work with organizations to help them obtain resources for other health problems they are concerned about
- Add questions that other organizations are interested in
- Explain the relationship of this project to the efforts of OAPP
- Adapted from Foshee, V.A., Linder, G.F., Bauman, K.E., Langwick, S.A., Arriaga, X.B., Heath, J.L., McMahon, P.M., & Bangdiwala, S. (1996). The Safe Dates Project: Theoretical basis, evaluation design, and selected baseline findings. American Journal of Preventive Medicine, 12, 39-47.
Barriers to Randomized Experimental Design
- Costs
- Consumes a great deal of real resources
- Costly in terms of time
- Involves significant political costs
- Ethical issues raised by experimentation with human beings
- Limited in duration
- High attrition in either the treatment or control groups
- The population enrolled in the treatment and control groups may not be representative of the population that would be affected by the treatment
- Possible program contamination across treatment groups
- Lack of experience using this design
- (Bauman, Viadro, & Tsui, 1994; Burtless, 1995)
Sampling Plan
- Get the expected effect size, or change in percentages, from the literature
- For all assumptions, err on the conservative side if you have not produced effects or retained participants in a longitudinal data collection effort before
- Conduct a power analysis to determine how many adolescents you will need final completed instruments for at follow-up to detect this effect
- Project expected loss to follow-up between baseline and follow-up
- Project expected lack of participation at baseline
- Project the parent consent rate
- From this, identify the initial pool of subjects
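The steps above can be worked through numerically. The sketch below uses the standard two-sided, two-proportion sample-size formula and then inflates the result for attrition, baseline non-participation, and consent; all of the rates used (25% vs. 15% repeat pregnancy, 80% follow-up, 90% baseline participation, 85% consent) are illustrative assumptions, not figures from the presentation:

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Completed follow-up instruments needed per group to detect a change
    from proportion p1 to p2 (two-sided two-proportion z-test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

def initial_pool(n_complete, follow_up_rate, baseline_rate, consent_rate):
    """Work backward from completed instruments to recruits needed,
    inflating for loss to follow-up, baseline no-shows, and consent."""
    return math.ceil(n_complete / (follow_up_rate * baseline_rate * consent_rate))

n = n_per_group(0.25, 0.15)   # e.g., repeat pregnancy of 25% vs. 15%
pool = initial_pool(n, follow_up_rate=0.80, baseline_rate=0.90, consent_rate=0.85)
# n == 250 completed instruments per group; recruit about 409 per group
```

Note how quickly the conservative projections compound: here the initial pool is more than 60% larger than the number of completed instruments the power analysis requires.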
Retention
- At the beginning of the study, ask for the name and contact information of someone who will always know where the adolescent is
- Keep in touch with the sample (e.g., postcards to update addresses)
- Increase incentives or tracking efforts if response rates begin to decline
- Use the Web and post office to obtain current tracking information
Other Basic Expectations of Sound Evaluation
- Clear description of the comparison group
- Instrument development
- Measures adapted from reliable, validated instruments
- Includes measures corresponding to theoretical risk/protective factors and outcomes
- Tested for appropriate reading level
- Pilot tested
Other Basic Expectations (continued)
- High-quality data collection process
- Data collected by objective personnel blinded to treatment condition
- Measures to assure confidentiality and promote disclosure
- Quality assurance procedures to improve data reliability and validity
- Confidentiality procedures prior to, during, and after data entry
Other Basic Expectations (continued)
- Appropriate data analysis procedures
- Assessment/handling of missing data
- Assessment of baseline differences between treatment groups
- Attrition analysis
- Multivariate analysis controlling for variables associated with baseline differences and attrition
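As one small illustration of what an attrition analysis starts with, the hypothetical Python sketch below compares a single baseline measure between participants retained at follow-up and those lost (the data, IDs, and variable are invented for illustration; a real analysis would test many baseline variables, including categorical ones, with appropriate statistical tests):

```python
def mean(values):
    return sum(values) / len(values)

def attrition_check(baseline_scores, retained_ids):
    """Compare the mean baseline score of participants retained at
    follow-up against those lost to attrition."""
    retained = [s for pid, s in baseline_scores.items() if pid in retained_ids]
    lost = [s for pid, s in baseline_scores.items() if pid not in retained_ids]
    return mean(retained), mean(lost)

# Hypothetical baseline risk scores keyed by participant ID
baseline = {101: 2.0, 102: 4.0, 103: 3.0, 104: 5.0, 105: 1.0, 106: 3.0}
retained_mean, lost_mean = attrition_check(baseline, retained_ids={101, 102, 103, 106})
# A large gap between the two means would signal differential attrition,
# which the multivariate analysis above would then need to control for.
```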
Other Basic Expectations (continued)
- Reporting results
- Thorough but concise reporting of findings
- Linked to program objectives and research questions
- Discussion of design limitations
- Appropriate interpretation of findings
- Dissemination of findings through publication
- Ongoing communication and problem-solving between evaluation staff and program staff
Action Steps
- What are some strategies for ensuring that a rigorous evaluation design can be put into place and implemented for your evaluation?
- What kinds of technical assistance do you need to make it work?
- Evaluation TA from RTI is available and can be accessed by contacting your Project Officer.