Title: Qualitative Methods for Health Program Evaluation
1. Qualitative Methods for Health Program Evaluation
- CHSC 433
- Module 5/Chapter 12
- L. Michele Issel, PhD
- UIC School of Public Health
2. "Different kinds of problems require different types of data." - Patton, 1997
3. Objectives
- List the major qualitative designs
- List at least one pro and con for each of the major qualitative designs
- Provide an outline of how qualitative data analyses are done
4. Beyond the Paradigm Debate
- The history of science favored quantitative (empiricist) approaches, deductive hypothesis testing, and logical positivism
- Current science favors understanding based on rigorous methods
5. Use Qualitative Methods When
- You want to minimize research manipulation by studying the natural field setting
- The program aims at individual outcomes (when the program aims at outcomes common across individuals, use quantitative methods)
6. Key Characteristics
- The use of non-numeric data, such as narratives, pictures, music
- The use of subjective, experiential, naturalistic inquiry to explain phenomena
- Use of inductive, iterative analysis
- Holistic and contextual concerns
- Attention to each individual's uniqueness
7. Functions of Qualitative Methods (adapted from Green & Lewis, 1986)
- 1. Develop and delineate program elements
- 2. Boost the power of quantitative designs
- 3. Broaden the observational field
- 4. Analyze processes and cases to understand why or how the program worked
- 5. Generate a program or intervention theory
- 6. Use instead of quantitative methods
8. Underlying Perspectives
- Phenomenology - experiences and meanings
- Ethnography - culture
- Critical analysis - communication and power
- Grounded theory - discovery of theory
- Content analysis - manifest meanings in the written word
9. Perspective -> Question
- Phenomenology: What does it mean to the person?
- Ethnography: What are the norms and values (culture)?
- Critical analysis: How has power shaped it?
- Grounded theory: What are the relationships (theory)?
- Content analysis: What themes are in the text?
10. Major Types of Qualitative Methods
- Participant observation
- Case studies
- In-depth Interviews
- Focus groups
- Open-ended survey questions
11. Participant Observation
- Collect data while acting as a member of the group
- Make narrative notes and memos about processes, events, and people observed
- Use key informants to verify data analysis
12. Case Studies
- Define what counts as a case (organization, program, person)
- Use a variety of types of raw data generated by or about the case: memoranda, observations, surveys, interviews, etc.
13. Case Study
- Key benefit for use in planning and evaluation: allows for understanding of context as an influence on the program or participant
14. In-depth Interviews
- Use open-ended questions with key individuals (participants, key informants)
- Use probes to clarify and explore issues or topics alluded to by the respondent or by earlier data analysis
- Use a tape recorder and transcripts of the interviews
15. In-depth Interviews
- Key benefit for use in planning and evaluation: provides rich insights into personal thoughts, values, meanings, and attributions
16. Focus Groups
- A carefully selected group of individuals who participate in a guided discussion about a specific topic
- Use a facilitator and a recorder
17. Focus Groups
- Key benefits for use in planning and evaluation: inexpensive given the amount and type of data; yields collective views rather than individual views
18. Observations
- Non-participatory and participatory techniques can be used
- Observers need training on what will be observed and how the observations will be recorded
- Data collection methods vary
  - Audio-visual recording
  - Field notes
  - Logs
19. Observations
- Key benefits for use in planning and evaluation: can identify sequences of causes and effects; may identify new behaviors or events
20. Open-ended Survey Questions
- Use open-ended questions placed at the end of a quantitative survey
- Unable to use probes for clarification
- Handwriting and spelling can make interpretation difficult
21. Sampling for Naturalistic Inquiry
- Small purposive samples
  - Select for a specific characteristic
- Theoretical sampling
  - Select based on what ought to matter
- Sampling for category saturation
  - Select until no new information is gained from participants (see the sketch below)
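A minimal sketch of the saturation idea, purely illustrative and not from the lecture: treat each interview as the set of categories it yielded and stop sampling once several consecutive interviews add nothing new. The function name, the three-interview window, and the example categories are all assumptions.

```python
# Hypothetical illustration: stop recruiting once several consecutive
# interviews contribute no new categories to the coding scheme.

def reached_saturation(coded_interviews, window=3):
    """Return True if the last `window` interviews added no new categories."""
    seen = set()
    new_per_interview = []
    for categories in coded_interviews:          # each item: set of categories coded in one interview
        new_per_interview.append(len(set(categories) - seen))
        seen |= set(categories)
    if len(new_per_interview) < window:
        return False                             # too few interviews to judge
    return sum(new_per_interview[-window:]) == 0 # nothing new recently

# Example: the fourth and later interviews add nothing new, so saturation is reached.
interviews = [
    {"access", "cost"},
    {"cost", "stigma"},
    {"access", "transportation"},
    {"cost"},
    {"stigma", "access"},
    {"cost", "access"},
]
print(reached_saturation(interviews))  # True
```

In practice the saturation judgment is analytic rather than mechanical; a tally like this only supports the decision to stop sampling.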
22. Data Analysis
- Coding and interpreting the data
- To count or not to count
23. Coding Terminology
- Category - a classification of concepts in the data
- Dimension - implies a continuum
- Property - attributes or characteristics of a category
- Constant comparison - the process used to develop categories; involves comparing new data with existing categories
- Codable - a unit of data to be categorized
24. Analysis Procedures
- Identify codable units of data
- Understand the meaning
- Discover categories
- Name categories
- Discover properties and dimensions of the categories
- Generate an explanation (a minimal coding sketch follows below)
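As a rough illustration of the terms and steps on slides 23-24 (not the course's own procedure), the sketch below walks codable units through constant comparison: each new unit is compared with existing categories and either joins one or starts a new one. The keyword-overlap rule standing in for analyst judgment, the function names, and the sample units are all invented for illustration.

```python
# Hypothetical sketch of constant comparison: compare each codable unit with
# existing categories; if none fits, create (and name) a new category.

def fits(unit: str, category_examples: list[str]) -> bool:
    """Crude stand-in for analyst judgment: shared words with prior examples."""
    unit_words = set(unit.lower().split())
    return any(unit_words & set(example.lower().split()) for example in category_examples)

def constant_comparison(codable_units: list[str]) -> dict[str, list[str]]:
    categories: dict[str, list[str]] = {}          # category name -> example units
    for unit in codable_units:
        for name, examples in categories.items():
            if fits(unit, examples):               # compare new unit with existing categories
                examples.append(unit)
                break
        else:                                      # no existing category fits: start a new one
            categories[f"category_{len(categories) + 1}"] = [unit]
    return categories

units = [
    "clinic hours conflict with my work schedule",
    "my work schedule makes appointments hard",
    "the bus ride takes too long",
]
print(constant_comparison(units))
# {'category_1': [first two units], 'category_2': [the transportation unit]}
```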
25. Scientific Rigor: Trustworthiness (Lincoln & Guba, 1985)
- Credibility (parallels internal validity)
- Transferability (parallels external validity)
- Dependability (parallels reliability)
- Confirmability (parallels objectivity)
26. Credibility
- Have confidence in the truth of the findings by:
  - Investing sufficient time and triangulating
  - Using outsiders for insights (peer debriefing)
  - Refining working hypotheses with negative cases
  - Checking findings against raw data
  - Using participant feedback
27. Transferability
- Applicability of findings to other contexts and respondents
- Provide thick (detailed, comprehensive) descriptions so others can assess the possibility of transfer
28. Dependability
- Would find the same results if the study were repeated
- Leave an audit trail that others can follow to see that the findings are supported by the data
29. Confirmability
- Findings come from the respondents, not the researcher
- Leave an audit trail (the same one used for dependability)
30. To Count or Not to Count
- Number of participants who mentioned a category
- Number of times a category was mentioned throughout the study (both tallies are sketched below)
- Issues: neither helps with interpretation of meanings, and both can misrepresent the scope of the sentiment
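If counting is used, the two tallies named on this slide are simple to compute once the data are coded. The sketch below is a hypothetical illustration; the participant IDs and categories are invented.

```python
# Hypothetical illustration of the two common ways of counting coded categories.
from collections import Counter

# Each participant's list of coded category mentions (a category can repeat).
coded = {
    "P1": ["cost", "cost", "access"],
    "P2": ["stigma"],
    "P3": ["cost", "access", "access"],
}

# 1. Number of participants who mentioned each category at least once.
participants_per_category = Counter(
    cat for mentions in coded.values() for cat in set(mentions)
)

# 2. Number of times each category was mentioned throughout the study.
mentions_per_category = Counter(
    cat for mentions in coded.values() for cat in mentions
)

print(participants_per_category)  # Counter({'cost': 2, 'access': 2, 'stigma': 1})
print(mentions_per_category)      # Counter({'cost': 3, 'access': 3, 'stigma': 1})
```

Neither tally says anything about what the category meant to participants, which is the slide's caution about counting.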
31. From Data to Description
- Categories, as typologies, are the rudiments of a theory
- Category dimensions and properties are essential
- Linkages between categories form the theory
32. Data Presentation
- Use descriptions of context to show transferability
- Use tables showing category development to show dependability and confirmability
- Use participants' words to show confirmability
- Use diagrams of relationships among categories
33. Realities of Data Analysis
- Messy, confusing, repetitive
- Iterative category development
- Overwhelming quantities of data
- Conflicting interpretations of data by peers and participants
- Manifest versus implied meanings cloud data analysis
- Investigator biases
34. Cost of Data Collection
- Interview time
- Travel to interviewee
- Reading and listening
- Transcription time
  - 1 hour of interview takes about 3 hours to transcribe (estimate sketched below)
- Data analysis time
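A back-of-the-envelope estimate using the slide's 3-to-1 transcription rule of thumb; the number of interviews and the one-hour interview length are assumptions for illustration, and travel, reading, and analysis time would come on top.

```python
# Hypothetical cost estimate using the slide's rule of thumb:
# 1 hour of interview takes about 3 hours to transcribe.
n_interviews = 20            # assumed number of interviews
interview_hours = 1.0        # assumed length per interview, in hours
transcription_ratio = 3.0    # transcription hours per interview hour (from the slide)

interview_time = n_interviews * interview_hours
transcription_time = interview_time * transcription_ratio
print(f"Interviewing: {interview_time:.0f} h, transcribing: {transcription_time:.0f} h")
# -> Interviewing: 20 h, transcribing: 60 h (before travel, reading, and analysis time)
```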
35. Evaluation Caveats
- Integrate with quantitative data
- Use participant feedback (credibility) as key to acceptance of findings
- Stories are more powerful than numbers and make the numbers more human
36. Qualitative Methods Across the Pyramid
Each qualitative method has potential usefulness for programs at each level of the Pyramid.