1. Evaluation and the Public Health Practitioner - Strategies for Success (Even on a Shoestring Budget)
Laura A. Linnan, ScD, CHES
UNC Chapel Hill School of Public Health, Center for Health Promotion and Disease Prevention
Healthy Carolinians Annual Conference, October 6, 2006
2. Objectives
- Discuss the importance of doing program evaluation when attempting to address and eliminate disparities in health
- Define evaluation and describe three types of evaluation
- Apply this knowledge to an example given as part of the presentation
- Identify typical challenges related to program evaluation
- Discuss how to evaluate on a shoestring budget and share evaluation needs
3. Why Evaluate?
- Contribute to improvements in programs
  - More effective strategies re the process of intervening
  - Revise or change program content to create better outcomes for the intended audience
  - Understand resource/cost implications
  - Mobilize political will
- Contribute to improvements in policy
  - Powerful implications for social change
4. Why Evaluate?
- To provide accountability to funders, community groups, or other stakeholders
- To increase community support for a program or initiative
- To remove support from ineffective programs
- To contribute to the scientific base for public health interventions
- To clarify where disparities in health exist and advocate for addressing/eliminating them
5. Planning for Evaluation
- All health educators are trained to use planning models... but how many use them in practice?
  - MATCH
  - PATCH
  - PRECEDE-PROCEED
6. [Diagram: the MATCH planning model. Phase 1: health goals selection. Phase 2: intervention planning (select intervention approaches, identify targets of intervention, select intervention objectives). Phases 3-4: program development and implementation. Phase 5: evaluation (process, impact, and outcome evaluation). Three intervention streams run in parallel: (1) influence governments (political process, social action, social change, community development) through government and community leaders (legislators, regulators, enforcers, agency administrators, community organization leaders) to achieve healthful governmental policies/legislation, enforcement, regulation, resource allocation, programs, and facilities; (2) influence organizations (organizational change, consulting, training, networking) through organizational decision makers (administrators/managers, internal change agents, workers/employees, union members and leaders) to achieve healthful organizational policies and practices, programs, facilities, and resources; (3) influence individuals (screening, medical care, health education, counseling, persuasive communications) among individuals at risk (patients, clients, students, employees, residents) to reduce behavioral, physiological, and physical risk factors. All streams converge on the health status of the target population (morbidity, mortality, wellness).]
7. PRECEDE-PROCEED Planning Model
[Diagram: Phase 1, social diagnosis; Phase 2, epidemiological diagnosis; Phase 3, behavioral and environmental diagnosis; Phase 4, educational and organizational diagnosis (predisposing, reinforcing, and enabling factors); Phase 5, administrative and policy diagnosis; Phase 6, implementation; Phase 7, process evaluation; Phase 8, impact evaluation; Phase 9, outcome evaluation. Health promotion (health education plus policy, regulation, and organization) acts on the predisposing, reinforcing, and enabling factors, which shape behavior/lifestyle and the environment, which in turn determine health and quality of life.]
8. Planning for Evaluation
- Developmental component
- Circular loop: data from the outcome, impact, and process evaluations feed back into the development of new programs and new evaluation efforts
[Diagram: formative evaluation, process evaluation, and impact/outcome evaluation arranged in a loop]
9. Defining Evaluation
- "The systematic assessment of the operation and/or the outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program or policy" (C. Weiss, 2000)
10. Evaluation Design Considerations
- Choose the most powerful design (e.g., minimize threats to validity) given:
  - Prioritized evaluation questions
  - Stakeholder priorities/interests
  - Ethical considerations
  - Available resources
  - Practical considerations re size of project, number of participants, time, budget
11. Evaluation Design Options
- Non-experimental designs
  - Record keeping/historical controls
  - Inventories maintained over time
  - Comparisons with similar/related programs
- Quasi-experimental/controlled comparisons
- Experimental designs with random sampling and/or allocation to treatment (RESEARCH); see the sketch below
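To make the last option concrete, here is a minimal Python sketch of random allocation to treatment and control conditions. The site names, fixed seed, and even split are hypothetical choices for illustration, not part of the original presentation.

```python
import random

def randomize(units, seed=2006):
    """Randomly allocate units to treatment or control (illustrative only)."""
    rng = random.Random(seed)   # fixed seed so the allocation can be reproduced
    shuffled = list(units)      # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

# Ten hypothetical program sites (e.g., salons, clinics, schools)
sites = [f"site_{i:02d}" for i in range(1, 11)]
groups = randomize(sites)
print("Treatment:", groups["treatment"])
print("Control:  ", groups["control"])
```

Fixing the seed makes the allocation reproducible, which helps when the assignment must be documented for stakeholders.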
12. Standards of Comparison
- Compare results with existing theory, literature, documents, or another (a priori) specified set of expectations, which should be stated in the program objectives
- Compare results against effect size estimates from previous studies (illustrated below)
- Compare results with national or state health objectives
- Consult the Guide to Community Preventive Services
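As a hedged illustration of the effect-size comparison in the second bullet, the Python sketch below computes Cohen's d for two hypothetical groups and compares it against an illustrative benchmark from prior studies. All scores and the benchmark value are made up for the example.

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d for two independent groups, using a pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Hypothetical post-program knowledge scores (0-10 scale)
intervention = [7, 8, 6, 9, 8, 7, 9, 8]
comparison = [6, 5, 7, 6, 5, 6, 7, 6]

observed = cohens_d(intervention, comparison)
benchmark = 0.5  # illustrative effect size taken from prior studies
print(f"Observed d = {observed:.2f}; benchmark from the literature = {benchmark}")
```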
13. Operations and/or Outcomes?
- Studying operations usually involves process or formative evaluation
  - What do potential participants want/need?
  - What was the process of conducting the program?
  - Was the intervention delivered as planned?
- Studying outcomes usually involves impact or outcome evaluation
  - What effect did the program/intervention have on the intended audience?
14. Types of Evaluation
- Formative
- Outcome
  - Realize there is a continuum of outcomes (e.g., short/long-term, primary/secondary, proximal/distal) related to any intervention
  - Impact (immediate) vs. outcome (final)
  - Is the expected outcome realistic for this intervention?
- Process
15. Formative Evaluation
- Designed to assess the strengths and limitations of ideas, materials, or programs before full-scale implementation
- Make sure tests occur with the intended population
- May be either qualitative or quantitative
- Timing is important
16. Typical Data Collection Methods
- Surveys (written, online, phone)
- Interviews (in-person, phone, intercept)
- Records/archival data
- Focus groups
- Community forums
- Photovoice
- Observations
- Exit polls
- Other?
17. Let's Try an Example
18. Intervention Idea: Address Cancer Disparities by Promoting Health in African American (AA) Beauty Salons
- What formative data would you like to have before you decide this is a good idea to pursue?
- When should this formative evaluation be done?
- Who will you talk with?
- What will you ask?
- What methods will you use to gather the data?
19. Formative Results Indicate
- Stylists are receptive to the idea and have shared preferences re training
- Customers are willing/interested in getting health information in the salons
- Observations gave insights about how to develop and deliver appropriate interventions
- Next questions: What interventions should we deliver? Are we delivering them as intended? Are they effective?
20. What Interventions Should We Deliver?
- For owners
  - Signs, posters, displays
- For stylists
  - Training workshops
  - Mirror stickers, displays
- For customers
  - Messages delivered via brochures
  - Messages delivered by stylists
  - Messages included in the salon environment
21. Outcome Evaluation
- End results or effects of the program for the intended audience
- May address primary and secondary audiences
- May be anticipated or unanticipated
- May be short- or long-term outcomes
22. Draw a Visual Representation of Your Intervention's Expected Effects
23. What Effects Do We Expect This Intervention to Produce?
- Mirror stickers/displays
  - Increase knowledge of messages
  - Increase talk with customers re messages
- Stylist training workshops
  - Increase knowledge about key messages
  - Increase self-efficacy to deliver messages
  - Increase readiness to deliver messages
  - Increase the number of messages delivered to customers
(These component-to-effect mappings are rendered as a simple logic-model sketch below.)
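One lightweight way to "draw" the expected-effects map from slides 22-23 is to encode it as a data structure and render it as text. The Python sketch below does exactly that, using the effects listed above; the rendering style is just one illustrative choice.

```python
# The slide's intervention components mapped to their expected effects
logic_model = {
    "Mirror stickers/displays": [
        "Increase knowledge of messages",
        "Increase talk with customers re messages",
    ],
    "Stylist training workshops": [
        "Increase knowledge about key messages",
        "Increase self-efficacy to deliver messages",
        "Increase readiness to deliver messages",
        "Increase the number of messages delivered to customers",
    ],
}

# Print a simple text rendering of the logic model
for component, effects in logic_model.items():
    print(component)
    for effect in effects:
        print(f"  -> {effect}")
```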
24. What Type of Outcome Evaluation Might Be Possible Given This Intervention?
- What outcome (e.g., primary outcome) can you expect from this intervention?
- What is realistic? How do you know it is realistic?
- What type of data collection methods would you want to use?
- What should the timing of the data collection be?
25. Are the Outcomes Realistic?
- What are possible short, interim, and long-term outcomes for this program?
- How powerful is this intervention?
- What EVIDENCE exists? What have others achieved with similar interventions, amounts of time, and resources?
26. What Are Some Possible Designs for Evaluating These Outcomes?
- An important balancing act: which design is most likely to rule out alternative explanations for the intervention results you expect and obtain, given
  - Resources
  - Time
  - Staff experience
  - Potential burden to participants
27. Design Options to Test Expected Effects of Stylist Trainings
- Post-test only (knowledge, self-efficacy, intentions, readiness) among participating stylists
- Pre/post-test among participating stylists
- Pre/post-test among participating stylists and observations in participating salons
- Pre/post-test among participating stylists and observations in matched participating and non-participating salons
- Trade-offs? Other options? (A sketch of the simplest pre/post analysis follows.)
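As an illustration of what the second option might involve at analysis time, here is a hypothetical Python sketch using a paired t-test on stylists' self-efficacy scores; the scores are invented for the example.

```python
from scipy import stats

# Hypothetical pre- and post-training self-efficacy scores (1-10)
# for the same eight participating stylists, in the same order
pre = [4, 5, 3, 6, 5, 4, 5, 6]
post = [6, 7, 5, 7, 6, 6, 7, 8]

# Paired t-test: did scores change from pre-test to post-test?
t_stat, p_value = stats.ttest_rel(post, pre)
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)

print(f"Mean change = {mean_change:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```

Note the trade-off the slide raises: with no comparison group, a significant pre/post change cannot rule out alternative explanations such as secular trends; the matched-salon design addresses this, at greater cost.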
28. Process Evaluation
- Essential for understanding how the program/intervention unfolded over time
- Essential for understanding the extent to which the program was delivered as originally planned, or how it was modified
- Clarifies how the program was received, by what subgroups, and to what extent
- Helps clarify negative outcomes
- Expands understanding of positive outcomes
- Documents costs, resources used, time, etc.
29. Process Questions for Stylist Training
- How do we know the stylist training was delivered as intended? Was it modified, and in what way?
- To what extent did the stylists receive and engage with the training?
- For whom was the training more or less effective? How do we know?
- What was the quality of the intervention delivered? How do we know?
- What were the costs of the intervention? (A session-log sketch follows.)
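Process data such as delivery fidelity and costs are easiest to report if they are logged per session. The sketch below shows one hypothetical way to structure such a log in Python; the field names and figures are illustrative, not from the BEAUTY project.

```python
import csv
from datetime import date

# One row per training session: what was delivered, to whom, at what cost
FIELDS = ["session_date", "salon_id", "stylists_attended",
          "modules_delivered", "modules_planned", "trainer_cost_usd"]

sessions = [
    {"session_date": date(2006, 3, 14), "salon_id": "salon_03",
     "stylists_attended": 5, "modules_delivered": 4,
     "modules_planned": 5, "trainer_cost_usd": 150},
    {"session_date": date(2006, 3, 21), "salon_id": "salon_07",
     "stylists_attended": 3, "modules_delivered": 5,
     "modules_planned": 5, "trainer_cost_usd": 150},
]

# Persist the log so costs and delivery can be reported later
with open("training_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(sessions)

# Fidelity: share of planned content actually delivered per session
for s in sessions:
    fidelity = s["modules_delivered"] / s["modules_planned"]
    print(f"{s['salon_id']}: fidelity = {fidelity:.0%}, cost = ${s['trainer_cost_usd']}")
```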
30. Planning for Process Evaluation
31. More on the BEAUTY Project
- The North Carolina BEAUTY and Health Project uses community-based participatory research (CBPR) principles and builds on two years of work in partnership with salons, stylists, and customers
- BEAUTY Advisory Board (January 2000 to present)
- Survey of licensed stylists (Linnan, Kim et al., Preventive Medicine, 2001)
- Observations in 10 beauty salons (Solomon, Linnan et al., Health Education & Behavior, 2005)
- Pilot intervention study in 2 salons (Linnan, Ferguson et al., Health Promotion Practice, 2005)
- Focus groups with salon customers (Kim, Linnan et al., in preparation)
32. Evaluation Challenges
- And what can you do in the planning phase to avoid breakdowns?
33. Practical Challenges of Program Evaluation
- Planner fails to build evaluation into the effort: use planning models
- Limited stakeholder involvement, or politics: use participatory planning and evaluation efforts
- Lack of adequate time or resources to evaluate: plan ahead for best results
- Multi-level interventions are complex to evaluate
- Changes occur slowly or do not last: be realistic about potential outcomes (short- and long-term)
- Sources: Solomon, 1987; Glasgow, Vogt, and Boles, 1999
34. Evaluation After the Fact: Some Questions to Consider
- How long will the program last? Can you measure short- and long-term effects? Periodic adjustments?
- Do you want to repeat or continue the program?
- Is there management support or public demand for (or criticism of) your program?
- Which program components are most important? To which stakeholders?
- Is there budget to support the evaluation?
- Who will review the evaluation results? When?
35. The Process of Evaluation
- Uncover/identify program goals and objectives
- Identify interests/needs of key stakeholders
- Determine the primary purpose(s) of the evaluation effort
- Clarify the specific evaluation questions to be answered, and prioritize them with input from key stakeholders
36. The Process of Evaluation
- With key stakeholders, collectively decide on the evaluation design, methods, and measurements that best fit the priority questions
- Collect and analyze data; check in with stakeholders about preliminary findings
- Prepare reports and disseminate results per previous arrangements with stakeholders
37. Do You Need to Be an Evaluation Expert?
38. Evaluation on a Shoestring Budget
- Develop partnerships to share the costs of the evaluation
  - University partners
  - Key stakeholders
- Special grants or funding to evaluate
- Leverage pieces of the evaluation
  - Pre-testing, formative research, outcome evaluations
39. Need Evaluation Help?
- The Center for Health Promotion and Disease Prevention at UNC-Chapel Hill has an Evaluation Unit that is considering offering evaluation help on a fee-for-service basis
- What are your needs and interests re evaluation?
- What are your preferences re methods of receiving help (phone consults, in-person, webinars, workshops, etc.)?
- If there were an hourly rate and travel expenses attached to these services, do you believe your organization would use this service in the next 6 months? The next 12 months?
40. Evaluation Resources
- Weiss, C. (1998). Evaluation (2nd ed.). Englewood Cliffs, NJ: Prentice Hall.
- Fetterman, Kaftarian, & Wandersman. (1996). Empowerment Evaluation. Thousand Oaks, CA: Sage.
- Tashakkori & Teddlie. (1998). Mixed Methodology. Thousand Oaks, CA: Sage.
- Windsor, Baranowski, Clark, & Cutter. (1994). Evaluation of Health Promotion, Health Education and Disease Prevention Programs. Mountain View, CA: Mayfield Publishing.
41. Evaluation Resources
- Green & Kreuter. (2004). Health Program Planning (4th ed.). New York, NY: McGraw-Hill.
- McKenzie & Smeltzer. (2001). Planning, Implementing and Evaluating Health Promotion Programs (3rd ed.). Needham Heights, MA: Allyn and Bacon.
- Patton, M. (1997). Utilization-Focused Evaluation (3rd ed.). Thousand Oaks, CA: Sage.
- Light, Singer, & Willett. (1990). By Design. Cambridge, MA: Harvard University Press.
- Steckler & Linnan. (2002). Process Evaluation for Public Health Interventions and Research. San Francisco, CA: Jossey-Bass.
42. Questions?
- Contact information:
  - Laura Linnan, ScD, CHES
  - Associate Professor, Dept. of Health Behavior and Health Education, and Director, Evaluation Unit, Center for Health Promotion and Disease Prevention
  - UNC Chapel Hill, Chapel Hill, North Carolina 27599-7440
  - linnan@email.unc.edu
  - (919) 843-8044