Title: Evaluation. Evidence-Based Public Health: Improving Practice at the Community Level
1 Evaluation
Evidence-Based Public Health: Improving Practice at the Community Level
- Ross Brownson
- Terry Leet
- Beth Baker
2 Learning Objectives
- Understand the basic components of program evaluation.
- Describe the differences and unique contributions of quantitative and qualitative evaluation.
- Understand the concepts of measurement validity and reliability.
- Understand the various types of study designs useful in program evaluation.
- Understand the types of data that are appropriate for answering different evaluation questions.
- Understand some of the steps involved in conducting qualitative evaluations.
3 (Diagram: the intersection of Community Needs/Values, Scientific Evidence, and Resources; adapted and modified from Muir Gray)
4 Evidence-based community health is a process of
- Engaging stakeholders
- Assessing what influences health, health behaviors, and community health (literature, local needs, academic theory)
- Developing programs based on assessment (science)
- Evaluating process, impact, and outcome
- Learning from our work and sharing it in ways that are accessible to ALL stakeholders
5 Define program evaluation
- The systematic examination and assessment of features of an initiative and its effects, in order to produce information that can be used by those who have an interest in its improvement or effectiveness.
- WHO European Working Group on Health Promotion Evaluation. Health promotion evaluation: recommendations to policymakers. Copenhagen: World Health Organization; 1998.
6 Why evaluate?
- Improve existing programs
- Measure effectiveness
- Demonstrate accountability
- Share effective strategies and lessons learned
- Ensure funding and sustainability
- Evaluation is a tool that can both measure and contribute to the success of your program.
7 Evaluation versus research
- Evaluation
  - Controlled by stakeholders
  - Flexible design
  - Ongoing
  - Used to improve programs
- Research
  - Controlled by investigator
  - Tightly controlled design
  - Specific timeframe
  - Used to further knowledge
8 Do you have
- A research/evaluation person on staff?
- Time and other resources?
- Staff to assist?
- Necessary skills?
From Mattessich, 2003
9 What are the most significant challenges you face in program evaluation?
- Program personnel may be threatened by the evaluation
- Need for personnel involvement vs. objectivity
- Comprehensive evaluation versus nothing at all
- The 10% rule: plan to devote roughly 10% of program resources to evaluation as you design and implement programs
10 In program planning, when should you begin planning an evaluation?
11 A useful framework
12 (Framework diagram; from CDC, 1999)
13 Several important questions
- How do qualitative and quantitative methods differ?
- What are sources of data?
- How might program evaluation differ from policy evaluation?
14 Logic Model
- PROGRAM PLANNING
  - Goal
  - Objective
  - Activities
- EVALUATION
  - Outcome
  - Impact
  - Process
15 Evaluation Framework
- Process evaluation looks at the Program: instructors? content? methods? time allotments? materials?
- Impact evaluation looks at Behavior: knowledge gain? attitude change? habit change? skill development?
- Outcome evaluation looks at Health: mortality? morbidity? disability? quality of life?
(Adapted from Green et al., 1980)
16 Types of Evaluation
- Process evaluation
- "Field of Dreams" evaluation
- shorter-term feedback on program implementation, content, methods, participant response, practitioner response
- what is working, what is not working
17 Types of Evaluation
- Process evaluation (cont.)
- a direct extension of action planning in the previous module
- uses quantitative or qualitative data
- also called formative evaluation
- data usually involve counts, not rates or ratios
18 Process
- FORMATIVE EVALUATION
- Is our program being implemented well? (program activities or action steps)
19 Process
- What is the role of the staff implementing the intervention(s) in developing and implementing the evaluation?
- What is the role of the community receiving the intervention(s) in developing and implementing the evaluation?
- If the staff is from the community, do you still need to get community input? Why or why not?
20 Types of Evaluation
- Considerations for process evaluation
- 1. Sources of data: program data
- 2. Limitations of data (completeness)
- 3. Time frame
- 4. Availability and costs
- Example: number of low-income men taking part in a worksite physical activity program
21 Types of Evaluation
- Impact evaluation
- long-term or short-term feedback on knowledge, attitudes, beliefs, behaviors
- uses quantitative or qualitative data
- also called summative evaluation
- probably more realistic endpoints for most public health programs and policies
22 Types of Evaluation
- Considerations for impact evaluation
- 1. Sources of data: surveillance or program data
- 2. Limitations of data (validity and reliability)
- 3. Time frame
- 4. Availability and costs
- Example: obesity rates by gender in Michigan
23 Types of Evaluation
- Outcome evaluation
- long-term feedback on health status, morbidity, mortality
- uses quantitative data
- also called summative evaluation
- often used in strategic plans
24 Types of Evaluation
- Considerations for outcome evaluation
- 1. Sources of data: routine surveillance data
- 2. Limitations of data (validity and reliability)
- 3. Time frame
- 4. Availability and costs
- European Health for All database
- Example: heart disease rates by county in Missouri
25 (Chart: heart disease rates in Missouri, 1994-2003, age-adjusted)
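"Age-adjusted" rates like those on the chart above typically come from direct standardization: each age group's rate is weighted by that group's share of a standard population, so counties with different age structures can be compared fairly. A minimal sketch; the age groups, counts, and standard weights below are hypothetical, invented for illustration:

```python
# Hypothetical age-specific data for one county: (deaths, population)
county = {
    "45-54": (30, 20_000),
    "55-64": (80, 15_000),
    "65+":   (250, 10_000),
}

# Hypothetical standard-population weights (must sum to 1.0)
standard_weights = {"45-54": 0.5, "55-64": 0.3, "65+": 0.2}

def age_adjusted_rate(data, weights, per=100_000):
    """Weight each age-specific rate by the standard population's share."""
    return sum(
        (deaths / pop) * per * weights[age]
        for age, (deaths, pop) in data.items()
    )

rate = age_adjusted_rate(county, standard_weights)
print(f"Age-adjusted rate: {rate:.1f} per 100,000")  # 735.0 per 100,000
```

The weighted sum removes differences that are due only to one county being older than another, which is why surveillance systems report adjusted rather than crude rates.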
26 Types of evaluation: Outcome
- Considerations for outcome evaluation
- 1. Sources of data (routine surveillance data)
- 2. Limitations of data (validity and reliability)
- 3. Time frame
- 4. Availability and costs
27 Is this intervention effective? (chart: rate over time)
28 Is this intervention effective? (chart: rate over time)
29 Program Evaluation Schematic
Study Population -> Eligibility (vs. Ineligibility) -> Participation (vs. No Participation) -> Randomization? -> Intervention Group / Control Group -> Outcome(s) measured in each group
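The randomization step in the schematic can be sketched in a few lines: eligible, consenting participants are shuffled and split into two arms. The participant IDs and the equal two-arm split are hypothetical choices for illustration:

```python
import random

def randomize(participants, seed=None):
    """Shuffle participants and split them into intervention and control arms."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    pool = list(participants)
    rng.shuffle(pool)
    half = len(pool) // 2
    return {"intervention": pool[:half], "control": pool[half:]}

arms = randomize(["p1", "p2", "p3", "p4", "p5", "p6"], seed=42)
print(arms)
```

Because assignment is random, the two arms should be comparable on average, so differences in the measured outcome(s) can more credibly be attributed to the intervention.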
30 Evaluation of new smoking policy
Reference: Stillman et al. JAMA 1990;264:1565-1569.
31 % People Smoking (chart)
Reference: Stillman et al. JAMA 1990;264:1565-1569.
32 Avg. Daily Cigarette Remnant Counts (chart)
Reference: Stillman et al. JAMA 1990;264:1565-1569.
33 Median Nicotine Vapor Concentrations (µg/m³) (chart)
Reference: Stillman et al. JAMA 1990;264:1565-1569.
34 % Employees Smoking (chart)
35 Concepts of validity and reliability and their importance
36 Measurement Issues
- Quality of measures
- High validity
- High reliability
37 Measurement Issues
- Evaluation threats
- Validity
- Is the instrument or design measuring exactly what was intended?
- Reliability
- Is the measurement being conducted consistently?
38 Measurement Issues
- Validity: the best available approximation to the truth
- Internal
- the extent of causality (the effects are really attributable to the program)
- External
- the extent of generalizability
- importance?
39 Measurement Issues
- Examples
- validity: self-reported rate of smoking among pregnant women compared with cotinine validation
- reliability: test-retest data on self-reported smoking rates among women
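Test-retest reliability for a yes/no item such as self-reported smoking is commonly summarized as percent agreement and Cohen's kappa (agreement corrected for chance). A small sketch; the ten paired responses below are hypothetical:

```python
def percent_agreement(first, second):
    """Share of respondents giving the same answer both times."""
    return sum(a == b for a, b in zip(first, second)) / len(first)

def cohens_kappa(first, second):
    """Chance-corrected agreement: (po - pe) / (1 - pe)."""
    n = len(first)
    po = percent_agreement(first, second)
    # Expected chance agreement from the marginal "yes" proportions
    p_yes1, p_yes2 = sum(first) / n, sum(second) / n
    pe = p_yes1 * p_yes2 + (1 - p_yes1) * (1 - p_yes2)
    return (po - pe) / (1 - pe)

test_answers   = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]  # 1 = reports smoking
retest_answers = [1, 1, 0, 0, 0, 0, 0, 1, 1, 0]  # same women, asked again

print(percent_agreement(test_answers, retest_answers))        # 0.8
print(round(cohens_kappa(test_answers, retest_answers), 2))   # 0.58
```

High raw agreement with low kappa would warn that much of the observed consistency could arise by chance, which is why both figures are usually reported.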
40 Measurement Issues
- The literature (or contacting researchers) may show you accepted methods
- Multiple statistical methods are available to report validity and reliability
- Evaluation instruments often need community contouring
- participatory methods may prevent use of existing instruments/questions
41 Study/Evaluation Designs
- Some KEYS to solid evaluation
- pre-/post-data
- comparison groups
- complete program records
- reliable and valid measures
- proper analytic techniques
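Two of the keys above, pre-/post-data and a comparison group, combine in one of the simplest analytic techniques: a difference-in-differences estimate, where the change in the comparison group stands in for what would have happened without the program. A sketch; the prevalence figures are hypothetical:

```python
def diff_in_diff(program_pre, program_post, comparison_pre, comparison_post):
    """Change in the program group minus change in the comparison group."""
    return (program_post - program_pre) - (comparison_post - comparison_pre)

# Hypothetical smoking prevalence (%) before and after a worksite program
effect = diff_in_diff(
    program_pre=24.0, program_post=18.0,       # program worksite fell 6 points
    comparison_pre=23.0, comparison_post=21.0, # comparison worksite fell 2 points
)
print(effect)  # -4.0 percentage points beyond the background trend
```

Subtracting the comparison group's change guards against crediting the program for a secular trend, one of the pitfalls a pre/post design without a comparison group cannot rule out.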
42 Qualitative data: used for community assessment and evaluation
- Qualitative data help to define issues/concerns and strengths from the community perspective
- What needs to be done?
- Existing or new data
- Needs, assets, attitudes
- Status of available health promotion programs/health care facilities
- Resources/physical structures in the community
43 Qualitative Methods
- Observation
- Audiovisual methods
- Spatial mapping
- Interviews
- Individual
- Group/focus group
From LeCompte and Schensul, 1999
44 Dissemination: qualitative and/or quantitative
- Report should include
- purpose of the evaluation
- stakeholders involved
- description of the program
- description of the methodology used
- the evaluation update and/or results and recommendations
- Don't wait until the end of the program
45 Dissemination
- Formal or informal
- Written or oral
- Newsletters, internet, community forums
46 Productive Use of Evaluation Consultants
- Have a program theory
- Intend to use the results
- Make expectations clear
- Develop a good advisory committee
- Consider every step collaborative
- Focus on information needs of users
- Budget enough time and money
- Develop standards for communication
- Embrace some ambiguity
From Mattessich, 2003
47 Organizational Issues/Summary
- How can I do evaluation when there's so much real work to do?
- Independent (outside) evaluation may be useful
- What to look for in a good evaluation
- It is useful to consider qualitative as well as quantitative methods