Evaluation 201: Measures, Data Collection and Analysis

Transcript and Presenter's Notes

1
Evaluation 201: Measures, Data Collection and
Analysis
  • Holly S. Ruch-Ross, ScD

2
The speaker in this session has no relevant
financial relationship with the manufacturer of
any commercial product and/or provider of
commercial services discussed in this CME
activity. The speaker will not discuss or
demonstrate pharmaceuticals and/or medical
devices that are not approved by the FDA and/or
medical or surgical procedures that involve an
unapproved or "off-label" use of an approved
device or pharmaceutical.
3
Presentation Objectives
  • At the completion of the session, participants
    will be able to:
  • Identify ways to measure progress on your
    project goals and objectives.
  • Select tools and strategies for collecting the
    information you need to evaluate your program.
  • Describe data collection, management, and
    analytic techniques.

4
Defining Evaluation
  • Evaluation is action research, intended to
    provide information that is useful for:
  • Program development and improvement
  • Program replication
  • Resource allocation
  • Policy decisions

5
The Evaluation Cycle
  • START: Plan program and evaluation
  • Implement program and begin to collect
    evaluative data
  • Review data: Are you doing what you planned?
    Having the intended effect?
  • Adjust program; refine evaluation
  • Return to the start and repeat the cycle
6
Considerations in Planning and Implementing Your
Evaluation
  • Scientific Standards
  • Protection of Human Subjects
    - Client privacy/confidentiality
    - IRB (Institutional Review Board)
    - HIPAA (Health Insurance Portability and
      Accountability Act)
  • Simple, Realistic, Focused
  • Plan evaluation as you plan the intervention
  • Use your logic model as a guide

7
The Logic Model
8
Types of Evaluation
  • Process
  • Is the program being implemented the way it was
    designed?
  • Outcome
  • Is the program having the intended effect?

9
An adequate evaluation plan includes both process
and outcome
  • Talking about your program effectively requires
    information about both what you are doing and
    what difference it is making.

10
Process Evaluation Information Needs
  • Describe the program and its implementation:
    who participates in the program and what
    services are received.
  • Information such as number served, patient
    characteristics, number of contacts with the
    program, number of trainings, number of
    referrals, and patient satisfaction.

11
Process Documentation
  • On the logic model, the activities and outputs
    list much of the important process information
    that needs to be collected.
  • Process documentation usually involves counting
    services provided or received (see the sketch
    after this list).
  • Typical process documentation forms include
    program intakes or admission forms, sign-in
    sheets, records of contacts, case notes
  • Often, process documentation involves information
    that programs need to collect in order to provide
    adequate service
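
As a concrete illustration of counting services, here is a
minimal sketch in Python with pandas, assuming a hypothetical
contact-log CSV with columns participant_id and service_type
(neither the file nor the column names come from the
presentation):

    # Minimal sketch: tally process data from a hypothetical contact log.
    # The file name and column names are illustrative assumptions.
    import pandas as pd

    contacts = pd.read_csv("contact_log.csv")

    # Number of unique participants served
    print("Participants served:", contacts["participant_id"].nunique())

    # Number of contacts, overall and by type of service
    print("Total contacts:", len(contacts))
    print(contacts["service_type"].value_counts())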

12
Outcome Evaluation Information Needs
  • Detect whether the intervention made a
    difference and what changes can be measured
    (knowledge, attitude, behavior, health status,
    incidence, prevalence).
  • Longer-term outcomes may need to be assessed
    using shorter-term indicators.

13
Keeping Your Outputs, Outcomes and Indicators
Straight
14
Outcomes and Indicators Examples
15
Comparison Information
  • A randomly assigned control group is the gold
    standard, but it is not always feasible or
    affordable.
  • Consider other possibilities:
  • Local comparison group
  • Convenience sample
  • Community, state or national data
  • Absolute standard
  • Change over time (pre-post tests or
    assessments; see the sketch after this list)
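
A minimal sketch of the pre-post option, using a paired t-test
in Python with scipy on made-up scores (the data and the choice
of test are illustrative assumptions, not part of the
presentation):

    # Minimal sketch: change over time for the same participants,
    # using hypothetical pre/post scores and a paired t-test.
    from statistics import mean
    from scipy import stats

    pre = [62, 70, 55, 68, 74, 60]    # scores before the program (made up)
    post = [71, 75, 60, 80, 79, 66]   # scores for the same people after

    t_stat, p_value = stats.ttest_rel(post, pre)
    print(f"Mean change: {mean(post) - mean(pre):.1f} points")
    print(f"Paired t = {t_stat:.2f}, p = {p_value:.3f}")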

16
Data Collection Methods
17
Data Collection Methods, cont
18
Data Collection Methods, cont
19
Finding the Right Tools
  • Tools need to measure the right construct.
  • Tools should be:
    - Simple
    - Realistic
    - Used consistently
    - In a useful form
  • Tools must be appropriate for the target
    population (in terms of age, culture, language,
    literacy, other issues).
  • Tools must be easy to administer in the setting.

20
Finding Existing Tools
  • Similar programs
  • Professional literature
  • Published measures
  • Internet

21
Existing Tools
  • When existing tools are used, they must be
    readily available, affordable, and supported by
    the author.
  • Ideally, tools should be well-established (valid,
    reliable and standardized).

22
Designing Your Own Tools
  • Adapt an existing tool to be more appropriate
    for the target population.
  • Review the literature.
  • Talk to other grantees.
  • Talk to others with ideas about what you should
    ask: experts, staff, recipients of services.
  • Pilot test tools with representatives of the
    target population.

23
Using Qualitative Data in Evaluation
  • Gain insight into feelings, attitudes, opinions
    and motivations.
  • Study selected issues in depth and detail.
  • Gather the broadest response possible without
    predetermined categories.
  • Gain rich information about a small number of
    people and cases.
  • Put a human face on the program.

24
Data Collection
  • The sophistication of data collection should be
    appropriate for the scale of the project.
  • Plan data collection up front, including who,
    what and when.
  • Have a system in place for tracking
    participants, particularly if follow-up is
    planned (see the sketch after this list).
  • Identify the staff person responsible for data
    handling.
  • Collect data consistently.
  • Protect participant confidentiality.
  • Do not collect information that you will not use.
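
One possible shape for a tracking system, sketched in Python
with pandas; the fields and dates are hypothetical, and any
spreadsheet or database could serve the same purpose:

    # Minimal sketch: a participant-tracking table that flags who still
    # needs follow-up. Field names and dates are illustrative assumptions.
    import pandas as pd

    tracking = pd.DataFrame({
        "participant_id": [101, 102, 103],
        "enrolled": pd.to_datetime(["2008-01-15", "2008-02-03", "2008-02-20"]),
        "follow_up_done": [True, False, False],
    })

    # Participants whose follow-up is still outstanding
    due = tracking[~tracking["follow_up_done"]]
    print(due[["participant_id", "enrolled"]])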

25
Data Management
  • Budget for expenses associated with data entry
    and analysis.
  • Have a strategy for data management up front.
    Begin data entry immediately.
  • Remember that data analysis should follow
    directly from the questions you are trying to
    answer about your intervention.
  • Know what comparative information may be
    available.

26
Data Analysis
  • Data may be essentially nominal or numeric.
    Different procedures are used for each.
  • Computer analysis is not essential.
  • Computer analysis may be simpler than you think
    (for example, Excel)
  • Often, programs need primarily descriptive
    analysis: percentages, averages, graphs (see the
    sketch after this list).
  • Comparisons may be made among groups within the
    data, between participants and non-participants,
    or over time.
  • Tests of statistical significance are not always
    necessary or appropriate.
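
A minimal sketch of that descriptive analysis in Python with
pandas (the file and column names are illustrative assumptions;
a spreadsheet such as Excel can produce the same numbers):

    # Minimal sketch: percentages and averages, the descriptive analysis
    # most programs need. File and column names are hypothetical.
    import pandas as pd

    data = pd.read_csv("participants.csv")

    # Percentage distribution of a nominal variable
    print(data["insurance_status"].value_counts(normalize=True) * 100)

    # Average of a numeric variable, overall and by group
    print("Mean visits:", data["well_child_visits"].mean())
    print(data.groupby("site")["well_child_visits"].mean())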

27
Data Analysis - Quantitative
¹ Procedures are similar for more than two groups.
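
The slide's table of procedures is not in the transcript; as a
rough companion, here is a hedged sketch of two common two-group
tests in Python with scipy: a t-test for numeric data and a
chi-square test for nominal data (all numbers are made up).

    # Minimal sketch: two-group comparisons with made-up data.
    from scipy import stats

    # Numeric outcome: t-test comparing two groups
    group_a = [2, 3, 1, 4, 2]   # e.g., visit counts (hypothetical)
    group_b = [1, 1, 2, 0, 1]
    t_stat, p_numeric = stats.ttest_ind(group_a, group_b)

    # Nominal outcome: chi-square test on a 2x2 table of counts
    table = [[30, 10],   # group A: insured, uninsured (hypothetical)
             [18, 22]]   # group B
    chi2, p_nominal, dof, expected = stats.chi2_contingency(table)

    print(f"t-test p = {p_numeric:.3f}; chi-square p = {p_nominal:.3f}")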
28
Data Analysis - Qualitative
  • Data Reduction: breaking the data into
    manageable chunks
  • Data Display: assembling and organizing the
    reduced data
  • Conclusions/Verification: making meaning (a
    tallying sketch follows below)
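
A minimal tallying sketch in Python, assuming responses have
already been hand-coded into hypothetical theme labels
(reduction), so the counts can be displayed and interpreted:

    # Minimal sketch: display reduced qualitative data by counting
    # theme codes. The codes themselves are illustrative assumptions.
    from collections import Counter

    coded_responses = [
        "cost", "language", "cost", "transportation",
        "language", "cost", "fear_of_system",
    ]

    for theme, n in Counter(coded_responses).most_common():
        print(f"{theme}: {n}")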

29
Understanding Your Findings
  • Several factors may affect your results:
  • History: things that happen in your community
    outside of your project
  • Passage of time (maturation): people naturally
    mature and change over time
  • Selection: who has complete information and who
    is skipped or missed

30
Understanding Your Findings: Data Quality
  • Representativeness
  • Completeness
  • Comprehensiveness
  • Cleanliness

31
Understanding Your Findings
  • You are satisfied with the integrity of the data
  • The findings make sense
  • The findings are consistent
  • Program staff and the target population
    understand the findings

32
Using Your Findings
  • Improve services
  • Advocate for service population
  • Obtain funding
  • Support replication
  • Market services or organization
  • Promote policy change

33
Presenting Your Findings
  • Provide context, including limitations
  • Simple messages work best
  • Match detail to the audience
  • Use bullet points and visual aids (charts,
    graphs, pictures)

34
Times When You May Need Help From an Evaluator
  • No one on your team is comfortable working with
    databases
  • Your evaluation plan calls for a comparison group
  • Your evaluation questions require sophisticated
    statistical analysis
  • Your program has changed
  • You are having difficulty finding or developing
    tools
  • You feel overwhelmed!

35
Getting Help From an Evaluator
  • Specific evaluation training and applied research
    experience
  • Experience in a human service setting
  • Professional perspective and methodological
    orientation
  • Interpersonal skills
  • Self-interest (i.e., can he/she put your
    interests first?)

36
Healthy Tomorrows Partnership for Children
Program Example
  • Note: The Prevention First Program is fictional.

37
Example: Sarah and the Prevention First Program
  • Sarah is the program director of Prevention
    First, a large multi-agency collaborative.
  • The community has:
  • High mobility
  • Low income
  • Limited/no English
  • Various immigrant groups

38
Example: Sarah and the Prevention First Program
  • The program intends to:
  • bring together the diverse resources and
    expertise present in the collaborative
  • facilitate the use of preventive health care by
    this community
  • increase public awareness of the many free,
    non-emergency health and dental services
    available in the community.

39
Prevention First Goals and Objectives
  • Goals:
  • Immigrant families will understand the importance
    of prevention.
  • Immigrant families will use preventive health
    services.
  • Objectives:
  • Within the first 6 months of the project, we will
    conduct a focus group with immigrant parents to
    explore possible barriers to the use of
    prevention services.
  • By the end of year 1, we will have made
    presentations to staff of at least 4 agencies
    serving immigrant families to promote preventive
    health services and encourage referrals.
  • By the end of year 1, participating immigrant
    families will schedule and complete an increased
    number of well-child visits over baseline.

40
Prevention First Logic Model
41
Prevention First Process Documentation
42
Prevention First Outcomes and Indicators
43
Prevention First Tools
44
Prevention First Data Collection Overview
45
Prevention First Analytic Questions
46
Prevention First Data
47
Prevention First Frequency Distribution: Country
of Origin and English Language Skills
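
The table itself is not in the transcript; a hypothetical sketch
of producing such a frequency distribution in Python with pandas
(the file and column names are assumptions):

    # Minimal sketch: frequency distributions for two nominal variables.
    import pandas as pd

    data = pd.read_csv("prevention_first.csv")  # hypothetical file

    print(data["country_of_origin"].value_counts())
    print(data["english_language_skills"].value_counts(normalize=True) * 100)
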
48
Prevention First Cross-tabulation: Enrollment in
Health Care Coverage by Country of Origin
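
The cross-tabulation itself is not in the transcript; a
hypothetical sketch with pandas (the file and column names are
assumptions):

    # Minimal sketch: cross-tabulate coverage enrollment by country.
    import pandas as pd

    data = pd.read_csv("prevention_first.csv")  # hypothetical file
    xtab = pd.crosstab(data["country_of_origin"],
                       data["enrolled_in_coverage"])
    print(xtab)
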
49
Prevention First Graph: Participating Families
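
The graph itself is not in the transcript; a hypothetical sketch
with matplotlib, using made-up quarterly counts:

    # Minimal sketch: bar chart of participating families over time.
    import matplotlib.pyplot as plt

    quarters = ["Q1", "Q2", "Q3", "Q4"]
    families = [12, 25, 34, 41]   # made-up counts

    plt.bar(quarters, families)
    plt.title("Prevention First: Participating Families")
    plt.ylabel("Number of families")
    plt.savefig("participating_families.png")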
50
Resources
  • AAP Community Pediatrics website evaluation
    resources:
    http://www.aap.org/commpeds/resources/evaluation.html
    (see handout)
  • AAP web-based evaluation teleconference: slides
    and audio at
    http://www.aap.org/commpeds/resources/teleconferences.html
  • AAP Evaluation Guidebook, Part 1: Designing Your
    Evaluation
  • Under construction: AAP Evaluation Guidebook,
    Part 2: Putting Your Evaluation Plan to Work,
    for release in Fall 2008

51
Let's Talk Evaluation!
  • Questions?
  • Comments?
  • Experiences to Share?