Title: Evaluation 201: Measures, Data Collection and Analysis
1. Evaluation 201: Measures, Data Collection and Analysis
2. The speaker in this session has no relevant financial relationship with the manufacturer of any commercial product and/or provider of commercial services discussed in this CME activity. The speaker will not discuss or demonstrate pharmaceuticals and/or medical devices that are not approved by the FDA and/or medical or surgical procedures that involve an unapproved or "off-label" use of an approved device or pharmaceutical.
3. Presentation Objectives
- At the completion of the session, participants will be able to:
  - Identify ways to measure progress on your project goals and objectives
  - Select tools and strategies for collecting information you need to evaluate your program
  - Describe data collection, management and analytic techniques
4. Defining Evaluation
- Evaluation is action research, intended to provide information that is useful for:
  - Program development and improvement
  - Program replication
  - Resource allocation
  - Policy decisions
5. The Evaluation Cycle
- Plan program and evaluation
- Implement program and begin to collect evaluative data
- Review data: Are you doing what you planned? Having the intended effect?
- Adjust program; refine evaluation
6. Considerations in Planning and Implementing Your Evaluation
- Scientific Standards
- Protection of Human Subjects
- Client privacy/confidentiality
- IRB (Institutional Review Board)
- HIPAA (Health Insurance Portability and Accountability Act)
- Simple, Realistic, Focused
- Plan evaluation as you plan the intervention
- Use your logic model as a guide
7. The Logic Model
8. Types of Evaluation
- Process
  - Is the program being implemented the way it was designed?
- Outcome
  - Is the program having the intended effect?
9. An adequate evaluation plan includes both process and outcome
- Talking about your program effectively requires
information about both what you are doing and
what difference it is making.
10. Process Evaluation Information Needs
- Describe the program and implementation, who participates in the program, what services are received.
- Information such as number served, patient characteristics, number of contacts with a program, number of trainings, number of referrals, patient satisfaction.
11. Process Documentation
- On the logic model, activities and outputs list much of the important process information that needs to be collected.
- Process documentation usually involves counting services provided or received.
- Typical process documentation forms include program intakes or admission forms, sign-in sheets, records of contacts, case notes.
- Often, process documentation involves information that programs need to collect in order to provide adequate service.
12. Outcome Evaluation Information Needs
- Detect whether the intervention made a difference, what changes can be measured (knowledge, attitude, behavior, health status, incidence, prevalence).
- Longer-term outcomes may need to be assessed using shorter-term indicators.
13. Keeping Your Outputs, Outcomes and Indicators Straight
14. Outcomes and Indicators Examples
15. Comparison Information
- A randomly assigned control group is the gold standard, but it is not always feasible or affordable.
- Consider other possibilities:
  - Local comparison group
  - Convenience sample
  - Community, state or national data
  - Absolute standard
  - Change over time (pre-post tests or assessments), as sketched below
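To make the pre-post idea concrete, here is a minimal sketch in Python, assuming hypothetical before/after knowledge-test scores (the scores and variable names are invented for illustration, not program data):

```python
# Pre-post ("change over time") comparison sketch using hypothetical scores.
from statistics import mean

# Hypothetical knowledge-test scores for five participants
pre_scores = [52, 60, 45, 70, 58]
post_scores = [68, 72, 50, 80, 66]

# Change over time: difference between each post and pre score
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
print(f"Mean pre score:  {mean(pre_scores):.1f}")
print(f"Mean post score: {mean(post_scores):.1f}")
print(f"Mean change:     {mean(changes):+.1f}")
```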
16. Data Collection Methods
17. Data Collection Methods, cont.
18. Data Collection Methods, cont.
19. Finding the Right Tools
- Tools need to measure the right construct.
- Simple
- Realistic
- Used consistently
- In a useful form
- Tools must be appropriate for the target population (in terms of age, culture, language, literacy, other issues).
- Tools must be easy to administer in the setting.
20. Finding Existing Tools
- Similar programs
- Professional literature
- Published measures
- Internet
21. Existing Tools
- When existing tools are used, they must be readily available, affordable, and supported by the author.
- Ideally, tools should be well established (valid, reliable and standardized).
22. Designing Your Own Tools
- Adapt an existing tool to be more appropriate for the target population.
- Review the literature.
- Talk to other grantees.
- Talk to others with ideas about what you should ask about: experts, staff, recipients of services.
- Pilot test tools with representatives of the target population.
23. Using Qualitative Data in Evaluation
- Gain insight into feelings, attitudes, opinions and motivations.
- Study selected issues in depth and detail.
- Gather the broadest response possible without predetermined categories.
- Gain rich information about a small number of people and cases.
- Put a human face on the program.
24. Data Collection
- The sophistication of data collection should be appropriate for the scale of the project.
- Plan data collection up front, including who, what and when.
- Have a system in place for tracking participants (particularly if follow-up is planned).
- Identify the staff person responsible for data handling.
- Consistency
- Protect participant confidentiality.
- Do not collect information that you will not use.
25. Data Management
- Budget for expenses associated with data entry and analysis.
- Have a strategy for data management up front. Begin data entry immediately.
- Remember that data analysis should follow directly from the questions you are trying to answer about your intervention.
- Know what comparative information may be available.
26. Data Analysis
- Data may be essentially nominal or numeric. Different procedures are used for each.
- Computer analysis is not essential.
- Computer analysis may be simpler than you think (for example, Excel).
- Often, programs need primarily descriptive analysis: percentages, averages, graphs (see the sketch after this list).
- Comparisons may be made among groups within the data, between participants and non-participants, or over time.
- Tests of statistical significance are not always necessary or appropriate.
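The slide points to Excel; as one hedged alternative, the same descriptive analysis (an average and a percentage) can be done in a few lines of Python with pandas. The records, column names, and values below are hypothetical:

```python
# Descriptive analysis sketch: an average and a percentage (hypothetical data).
import pandas as pd

# Hypothetical program records, one row per participant
df = pd.DataFrame({
    "age": [4, 7, 2, 5, 9, 3],
    "completed_visit": [True, True, False, True, False, True],
})

print(f"Average age: {df['age'].mean():.1f} years")
# The mean of a boolean column is the share of True values
print(f"Completed a visit: {df['completed_visit'].mean() * 100:.0f}%")
```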
27. Data Analysis - Quantitative
- Note: Procedures are similar for more than two groups.
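The table of quantitative procedures from this slide did not survive extraction. As a hedged sketch of two common two-group comparisons it likely covered, the code below runs a t-test on a numeric outcome and a chi-square test on a nominal one, using invented data:

```python
# Two-group comparison sketch (hypothetical data).
from scipy import stats

# Numeric outcome: compare group means with an independent-samples t-test
group_a = [68, 72, 50, 80, 66, 74]
group_b = [55, 61, 48, 70, 59, 63]
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

# Nominal outcome: compare counts with a chi-square test
# Rows = groups; columns = counts of (enrolled, not enrolled)
table = [[30, 10],
         [20, 20]]
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square: chi2 = {chi2:.2f}, p = {p:.3f}")
```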
28. Data Analysis - Qualitative
- Data Reduction
  - Manageable chunks
- Data Display
  - Assemble and organize
- Conclusions/Verification
  - Making meaning
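As a small illustration of data reduction and display, the sketch below tallies theme codes that might be assigned to focus-group excerpts; the themes themselves are invented, not drawn from any real program:

```python
# Qualitative data reduction/display sketch (hypothetical theme codes).
from collections import Counter

# Data reduction: each focus-group excerpt has been coded with a theme
coded_excerpts = ["transportation", "cost", "language", "cost",
                  "transportation", "trust", "cost", "language"]

# Data display: organize the codes into a simple frequency table
for theme, count in Counter(coded_excerpts).most_common():
    print(f"{theme:15s} {count}")
```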
29. Understanding Your Findings
- Several factors may affect your results:
  - History: things that happen in your community outside of your project
  - Passage of time (maturation): people naturally mature and change over time
  - Selection: who has complete information and who is skipped or missed
30. Understanding Your Findings: Data Quality
- Representativeness
- Completeness
- Comprehensiveness
- Cleanliness
31. Understanding Your Findings
- Satisfied with integrity of the data
- Findings make sense
- Findings are consistent
- Program staff and target population understand
findings
32. Using Your Findings
- Improve services
- Advocate for service population
- Obtain funding
- Support replication
- Market services or organization
- Promote policy change
33. Presenting Your Findings
- Provide context, including limitations
- Simple messages work best
- Match detail to the audience
- Use bullet points and visual aids (charts,
graphs, pictures)
34. Some Times When You May Need Help From an Evaluator
- No one on your team is comfortable working with databases
- Your evaluation plan calls for a comparison group
- Your evaluation questions require sophisticated statistical analysis
- Your program has changed
- You are having difficulty finding or developing tools
- You feel overwhelmed!
35. Getting Help From an Evaluator
- Specific evaluation training and applied research experience
- Experience in a human service setting
- Professional perspective and methodological orientation
- Interpersonal skills
- Self-interest (i.e., can he/she put yours first?!)
36. Healthy Tomorrows Partnership for Children Program Example
- Note: The Prevention First Program is fictional.
37. Example: Sarah and the Prevention First Program
- Sarah is the program director of Prevention First
  - A large multi-agency collaborative
- The community has:
  - High mobility
  - Low income
  - Limited/no English
  - Various immigrant groups
38. Example: Sarah and the Prevention First Program
- The program intends to:
  - Bring together the diverse resources and expertise present in the collaborative
  - Facilitate the use of preventive health care by this community
  - Increase public awareness of the many free, non-emergency health and dental services available in the community
39. Prevention First Goals and Objectives
- Goals
  - Immigrant families will understand the importance of prevention.
  - Immigrant families will use preventive health services.
- Objectives
  - Within the first 6 months of the project, we will conduct a focus group with immigrant parents to explore possible barriers to the use of prevention services.
  - By the end of year 1, we will have made presentations to staff of at least 4 agencies serving immigrant families to promote preventive health services and encourage referrals.
  - By the end of year 1, participating immigrant families will schedule and complete an increased number of well-child visits over baseline.
40. Prevention First Logic Model
41. Prevention First Process Documentation
42. Prevention First Outcomes and Indicators
43. Prevention First Tools
44. Prevention First Data Collection Overview
45. Prevention First Analytic Questions
46. Prevention First Data
47. Prevention First Frequency Distribution: Country of Origin and English Language Skills
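The frequency-distribution table from this slide is not reproduced here. As a hedged sketch of how such a table might be built with pandas, the countries, skill levels, and column names below are assumptions:

```python
# Frequency distribution sketch (hypothetical family records).
import pandas as pd

families = pd.DataFrame({
    "country_of_origin": ["Mexico", "Somalia", "Mexico",
                          "Vietnam", "Somalia", "Mexico"],
    "english_skills": ["limited", "none", "fluent",
                       "limited", "limited", "none"],
})

# Counts and percentages for each variable
for col in ["country_of_origin", "english_skills"]:
    print(families[col].value_counts())
    print(families[col].value_counts(normalize=True) * 100)
```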
48. Prevention First Cross-tabulation: Enrollment in Health Care Coverage by Country of Origin
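Again, the underlying table is lost; a minimal cross-tabulation of the kind the title describes could be produced with pandas as follows (the values and column names are invented):

```python
# Cross-tabulation sketch (hypothetical family records).
import pandas as pd

families = pd.DataFrame({
    "country_of_origin": ["Mexico", "Somalia", "Mexico",
                          "Vietnam", "Somalia", "Mexico"],
    "enrolled_in_coverage": [True, False, True, True, False, False],
})

# Rows = country of origin; columns = enrollment status; cells = counts
print(pd.crosstab(families["country_of_origin"],
                  families["enrolled_in_coverage"]))
```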
49. Prevention First Graph: Participating Families
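The graph itself is not reproduced; a simple bar chart of participating families over time, which the title suggests, might be drawn like this (the quarters and counts are hypothetical):

```python
# Bar-chart sketch: participating families by quarter (hypothetical counts).
import matplotlib.pyplot as plt

quarters = ["Q1", "Q2", "Q3", "Q4"]
families_served = [12, 19, 27, 34]

plt.bar(quarters, families_served)
plt.xlabel("Project quarter")
plt.ylabel("Participating families")
plt.title("Prevention First: Participating Families")
plt.savefig("participating_families.png")
```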
50. Resources
- AAP Community Pediatrics website evaluation resources: http://www.aap.org/commpeds/resources/evaluation.html (see handout)
- AAP web-based evaluation teleconference slides and audio: http://www.aap.org/commpeds/resources/teleconferences.html
- AAP Evaluation Guidebook, Part 1: Designing Your Evaluation
- Under construction: AAP Evaluation Guidebook, Part 2: Putting Your Evaluation Plan to Work, for release in Fall 2008.
51. Let's Talk Evaluation!
- Questions?
- Comments?
- Experiences to Share?