1
Evaluation
Evidence-Based Public Health: Improving Practice at the Community Level
  • Ross Brownson
  • Terry Leet
  • Beth Baker

2
Learning Objectives
  • Understand the basic components of program
    evaluation.
  • Describe the differences and unique contributions
    of quantitative and qualitative evaluation.
  • Understand the concepts of measurement validity
    and reliability.
  • Understand the various types of study designs
    useful in program evaluation.
  • Understand the types of data that are appropriate
    for answering different evaluation questions.
  • Understand some of the steps involved in
    conducting qualitative evaluations.

3
[Diagram: Community Needs and Values, Scientific Evidence, Resources]
(adapted and modified from Muir Gray)
4
Evidence-based community health is a process of
  • Engaging stakeholders
  • Assessing what influences health, health
    behaviors and community health (literature, local
    needs, academic theory)
  • Developing programs based on assessment (science)
  • Evaluating process, impact, and outcome
  • Learning from our work and sharing it in ways
    that are accessible to ALL stakeholders

5
Define program evaluation
  • The systematic examination and assessment of
    features of an initiative and its effects, in
    order to produce information that can be used by
    those who have an interest in its improvement or
    effectiveness.
  • WHO European Working Group on Health Promotion Evaluation. Health promotion
    evaluation: recommendations to policymakers. Copenhagen: World Health
    Organization, 1998.

6
Why evaluate?
  • Improve existing programs
  • Measure effectiveness
  • Demonstrate accountability
  • Share effective strategies and lessons learned
  • Ensure funding and sustainability
  • Evaluation is a tool that can both measure and
    contribute to the success of your program.

7
Evaluation versus research
  • Evaluation
  • Controlled by stakeholders
  • Flexible design
  • Ongoing
  • Used to improve programs
  • Research
  • Controlled by investigator
  • Tightly controlled design
  • Specific timeframe
  • Used to further knowledge

8
Do you have
  • A research/evaluation person on staff?
  • Time and other resources?
  • Staff to assist?
  • Necessary skills?

From Mattessich, 2003
9
What are the most significant challenges you face
in program evaluation?
  • Program personnel may be threatened by the
    evaluation
  • Need for personnel involvement vs. objectivity
  • Comprehensive evaluation versus nothing at all
  • 10% rule: devote roughly 10% of program resources to evaluation as you
    design and implement programs

10
In program planning when should you begin
planning an evaluation?
11
A useful framework
12
[Figure: CDC Framework for Program Evaluation in Public Health] From CDC, 1999
13
Several important questions
  • How do qualitative and quantitative methods differ?
  • What are sources of data?
  • How might program evaluation differ from policy evaluation?
14
Logic Model
  • PROGRAM PLANNING → EVALUATION
  • Goal → Outcome
  • Objective → Impact
  • Activities → Process

15
Evaluation Framework
Evaluation Types
  • Process → Program: instructors? content? methods? time allotments? materials?
  • Impact → Behavior: knowledge gain? attitude change? habit change? skill development?
  • Outcome → Health: mortality? morbidity? disability? quality of life?
(Adapted from Green et al., 1980)
16
Types of Evaluation
  • Process evaluation
  • Field of Dreams evaluation
  • shorter-term feedback on program implementation,
    content, methods, participant response,
    practitioner response
  • what is working, what is not working

17
Types of Evaluation
  • Process evaluation (cont)
  • direct extension of action planning in previous
    module
  • uses quantitative or qualitative data
  • also called formative evaluation
  • data usually involves counts, not rates or ratios

18
Process
  • FORMATIVE EVALUATION
  • Is our program being implemented well? (program activities or action steps)

19
Process
  • What is the role of the staff implementing the
    intervention(s) in developing and implementing
    the evaluation?
  • What is the role of the community receiving the
    intervention(s) in developing and implementing
    the evaluation?
  • If the staff is from the community do you still
    need to get community input? Why or why not?

20
Types of Evaluation
  • Considerations for process evaluation
  • 1. Sources of data
  • program data
  • 2. Limitations of data (completeness)
  • 3. Time frame
  • 4. Availability and costs
  • Example: the number of low-income men taking part in a worksite physical
    activity program

21
Types of Evaluation
  • Impact evaluation
  • long-term or short-term feedback on knowledge,
    attitudes, beliefs, behaviors
  • uses quantitative or qualitative data
  • also called summative evaluation
  • probably more realistic endpoints for most public
    health programs and policies

22
Types of Evaluation
  • Considerations for impact evaluation
  • 1. Sources of data
  • surveillance or program data
  • 2. Limitations of data (validity and reliability)
  • 3. Time frame
  • 4. Availability and costs
  • Example: obesity rates by gender in Michigan (a rate calculation is
    sketched below)
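A note on the arithmetic behind such an impact measure: surveillance data are usually summarized as prevalence per 100 (or per 100,000) rather than raw counts. The minimal Python sketch below shows the basic calculation; the survey records are invented for illustration and are not actual Michigan data.

from collections import defaultdict

# Illustrative survey records: (gender, obese?). Invented data, not real surveillance.
records = [
    ("female", True), ("female", False), ("female", True), ("female", False),
    ("male", True), ("male", False), ("male", False), ("male", True), ("male", True),
]

counts = defaultdict(lambda: [0, 0])      # gender -> [obese count, total respondents]
for gender, obese in records:
    counts[gender][0] += int(obese)
    counts[gender][1] += 1

for gender, (obese, total) in counts.items():
    print(f"{gender}: obesity prevalence {100 * obese / total:.1f} per 100 respondents")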

23
Types of Evaluation
  • Outcome evaluation
  • long-term feedback on health status, morbidity,
    mortality
  • uses quantitative data
  • also called summative evaluation
  • often used in strategic plans

24
Types of Evaluation
  • Considerations for outcome evaluation
  • 1. Sources of data
  • routine surveillance data
  • 2. Limitations of data (validity and reliability)
  • 3. Time frame
  • 4. Availability and costs
  • European Health for All database (WHO)
  • Example: heart disease rates by county in Missouri

25
[Figure: Heart disease rates in Missouri, 1994-2003 (age-adjusted)]
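The "age-adjusted" label means the rates have been standardized to a common age distribution so that areas with older populations are not unfairly compared. Below is a minimal Python sketch of direct age standardization, the usual method; the age strata, standard-population weights, and counts are invented for illustration and are not the Missouri data.

# Direct age standardization: weight each age-specific rate by a standard
# population's share of that age group, then sum. Invented numbers only.
strata = [
    # (age group, deaths, person-years, standard-population weight)
    ("35-44",  120,  800_000, 0.30),
    ("45-54",  340,  700_000, 0.25),
    ("55-64",  690,  500_000, 0.25),
    ("65+",   2100,  400_000, 0.20),
]

adjusted_rate = sum((deaths / pop) * weight for _, deaths, pop, weight in strata)

print(f"Age-adjusted rate: {adjusted_rate * 100_000:.1f} per 100,000")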
26
Types of Evaluation: Outcome
  • Considerations for outcome evaluation
  • 1. Sources of data (routine surveillance data)
  • 2. Limitations of data (validity and reliability)
  • 3. Time frame
  • 4. Availability and costs

27
Is this intervention effective?
[Chart: rate plotted over time]
28
Is this intervention effective?
[Chart: rate plotted over time]
29
Program Evaluation Schematic
Study Population → Eligibility (Ineligibility excluded)
Eligibility → Participation (No Participation excluded)
Participation → Randomization?
Randomization → Intervention Group → Outcome(s)
Randomization → Control Group → Outcome(s)
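Where randomization is feasible, the assignment step in this schematic is conceptually simple: each participating individual is allocated to the intervention or control group by chance. A minimal Python sketch follows; the participant IDs and fixed seed are illustrative assumptions, not part of the original slide.

import random

# Randomly split participating individuals into intervention and control groups.
participants = [f"P{i:03d}" for i in range(1, 21)]   # illustrative IDs

rng = random.Random(42)        # fixed seed so the allocation is reproducible
shuffled = participants[:]
rng.shuffle(shuffled)

half = len(shuffled) // 2
intervention_group = shuffled[:half]
control_group = shuffled[half:]

print("Intervention:", intervention_group)
print("Control:     ", control_group)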
30
Evaluation of new smoking policy
Reference: Stillman et al. JAMA 1990;264:1565-1569
31
People Smoking
Reference: Stillman et al. JAMA 1990;264:1565-1569
32
Avg. Daily Cigarette Remnant Counts
Reference: Stillman et al. JAMA 1990;264:1565-1569
33
Median Nicotine Vapor Concentrations (µg/m³)
Reference: Stillman et al. JAMA 1990;264:1565-1569
34
Employees Smoking
35
Concepts of validity and reliability and their
importance
36
Measurement Issues
  • Quality of measures
  • High validity
  • High reliability

37
Measurement Issues
  • Evaluation threats
  • Validity
  • Is the instrument or design measuring exactly
    what was intended?
  • Reliability
  • Is the measurement being conducted consistently?

38
Measurement Issues
  • Validity: the best available approximation to the truth
  • Internal
  • the extent of causality (the effects are really
    attributable to the program)
  • External
  • the extent of generalizability
  • importance??

39
Measurement Issues
  • Examples (both sketched in code below)
  • Validity: self-reported rate of smoking among pregnant women compared with
    cotinine validation
  • Reliability: test-retest data on self-reported smoking rates among women
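Both examples boil down to simple comparisons once the data are collected: reliability compares a measure against itself at two time points, and validity compares it against a reference standard (here, cotinine). The Python sketch below uses invented smoking indicators; percent agreement and sensitivity are shown for brevity, whereas practice often reports statistics such as Cohen's kappa.

# Invented data for illustration: 1 = reported/confirmed smoker, 0 = non-smoker.
test     = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]   # self-report, time 1
retest   = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]   # self-report, time 2 (reliability check)
cotinine = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]   # biochemically confirmed status (validity check)

# Reliability: how consistently does the instrument measure over time?
agreement = sum(a == b for a, b in zip(test, retest)) / len(test)

# Validity: how well does self-report identify cotinine-confirmed smokers?
true_positives = sum(1 for s, c in zip(test, cotinine) if s and c)
sensitivity = true_positives / sum(cotinine)

print(f"Test-retest agreement: {agreement:.0%}")
print(f"Sensitivity of self-report vs. cotinine: {sensitivity:.0%}")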

40
Measurement Issues
  • The literature (or contacting researchers directly) may point you to
    accepted methods
  • Multiple statistical methods available to report
    validity and reliability
  • Evaluation instruments often need community
    contouring
  • participatory methods may prevent use of existing
    instruments/questions

41
Study/Evaluation Designs
  • Some KEYS to solid evaluation (several are combined in the sketch after
    this list)
  • pre- and post-data
  • comparison groups
  • complete program records
  • reliable and valid measures
  • proper analytic techniques
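Several of these keys come together in the simplest quantitative design: compare the pre-to-post change in the program group against the change in a comparison group over the same period. A minimal Python sketch follows; the rates are invented, and a real analysis would also assess statistical significance and confounding.

# Pre/post rates (per 100) for a program group and a comparison group. Invented numbers.
program    = {"pre": 32.0, "post": 24.0}
comparison = {"pre": 31.0, "post": 29.5}

program_change    = program["post"] - program["pre"]          # change in program group
comparison_change = comparison["post"] - comparison["pre"]    # secular trend

# Difference-in-differences: change beyond what the comparison group experienced.
net_effect = program_change - comparison_change

print(f"Program change:    {program_change:+.1f} per 100")
print(f"Comparison change: {comparison_change:+.1f} per 100")
print(f"Net program effect: {net_effect:+.1f} per 100")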

42
Qualitative data - used for community assessment
and evaluation
  • Qualitative data help to define issues/concerns and strengths from the
    community perspective
  • What needs to be done?
  • Existing or new data
  • Needs, assets, attitudes
  • Status of available health promotion
    programs/health care facilities
  • Resources/physical structures in the community

43
Qualitative Methods
  • Observation
  • Audiovisual methods
  • Spatial mapping
  • Interviews
  • Individual
  • Group/focus group

From LeCompte and Schensul, 1999
44
Dissemination: Qualitative and/or Quantitative
  • Report should include
  • purpose of the evaluation
  • stakeholders involved
  • description of the program
  • description of the methodology used
  • the evaluation update and/or results and
    recommendations
  • Don't wait until the end of the program

45
Dissemination
  • Formal or informal
  • Written or oral
  • Newsletters, internet, community forums

46
Productive Use of Evaluation Consultants
  • Have a program theory
  • Intend to use the results
  • Make expectations clear
  • Develop good advisory committee
  • Consider every step collaborative
  • Focus on information needs of users
  • Budget enough time and money
  • Develop standards for communication
  • Embrace some ambiguity

From Mattessich, 2003
47
Organizational Issues/Summary
  • How can I do evaluation when there's so much real work to do?
  • Independent (outside) evaluation may be useful
  • What to look for in a good evaluation
  • Useful to consider qualitative as well as
    quantitative methods