Program Evaluation - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
Program Evaluation
  • A systematic effort to describe the status of a
    program
  • Extent to which program objectives are achieved

2
Uses of Health Program Evaluation
  • Insight
  • - Needs
  • - Barriers
  • - Activities
  • Improvement
  • - Social mobilization
  • - Inter-sectoral Coordination
  • - Implementation
  • - Client Conveniences
  • Affordability
  • Accessibility
  • Availability
  • - Cost-Benefit
  • Outcomes / Impact
  • - Skills development
  • - Behavioral change
  • - Level of success in achieving objectives
  • - Accountability

3
Types of Evaluation
  • Needs Assessment
  • To identify
  • Goals
  • Products
  • Problems
  • Conditions

4
Types of Evaluation Contd
  • Formative (Process) Evaluation
  • To improve developing or ongoing program
  • Role as helper/advisor/planner
  • Progress in achievements
  • Potential problems/needs for improvements in
  • Program Management
  • Inter-sectoral coordination
  • Social mobilization
  • - Implementation
  • - Outcomes

5
Types of Evaluation Contd
  • Summative (Coverage) Evaluation
  • (To help decide ultimate fate)
  • Summary statement about
  • Program's achievements
  • Unanticipated outcomes
  • Comparison with other programs

6
Sample Size
  • Factors
  • - Purpose of study
  • - Population size
  • - Level of precision (sampling error)
  • - The confidence / Risk level
  • - Degree of variability
  • - Appropriate for the analysis

7
Sample Size Contd
  • Strategies
  • - Using a census - small population
  • - Using sample size of a similar study
  • - Using published tables / software
  • - Using formulas
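
The formula-based strategy above can be sketched with Cochran's formula for estimating a proportion, plus a finite-population correction for the census/small-population case. This is a generic illustration, not taken from the slides; the function name and default values (p = 0.5, 5% margin of error, 95% confidence) are assumptions.

```python
import math

def cochran_sample_size(p=0.5, e=0.05, z=1.96, N=None):
    # Cochran's base sample size for estimating a proportion p
    # with margin of error e at the confidence level implied by z.
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)
    # Finite-population correction when the population size N is known
    # (relevant to the "small population" strategy above).
    if N is not None:
        n0 = n0 / (1 + (n0 - 1) / N)
    return math.ceil(n0)

print(cochran_sample_size())        # unknown large population
print(cochran_sample_size(N=1000))  # population of 1000, corrected downward
```

Using p = 0.5 maximizes p(1 - p) and therefore gives the most conservative (largest) sample size when the true proportion is unknown.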

8
Health Program Evaluation - Quantitative Research
Methods
  • Approach
  • - Measures the reaction of a great many
    people to a limited set of questions
  • - Comparison and statistical aggregation of
    the data
  • - Broad, generalizable set of findings
    presented succinctly and parsimoniously.

9
Health Program Evaluation Qualitative Research
Methods
  • Participant Observation
  • Key Informant Interviews
  • Open ended Interview
  • Focus Group Discussions
  • Pile sort

10
Health Program Evaluation - Qualitative Research
Methods Contd
  • Findings
  • - Presented alone / in combination with
    quantitative data
  • - Validity and reliability depend on the
    methodological skills, sensitivity, and
    integrity of the researchers
  • - Skillful interviewing - more than just asking
    questions.
  • - Content analysis - more than just reading to
    see what's there.
  • - Generate useful and credible findings through
    observation, interviewing and content
    analysis
  • How?
  • - Discipline, knowledge, training, practice,
    creativity, hard work

11
Data Processing
  • Raw field notes should be corrected, edited,
    and typed
  • Tape recordings need to be transcribed and
    corrected
  • Texts produced by field workers should not be
    changed to make them more readable

12
Data Reduction
  • Process of selecting, focussing, simplifying,
    abstracting and transforming data from field
    notes and transcripts
  • Researcher retains some data chunks, pulls out
    some and gets an idea of story to tell

13
Analysis Steps
  • Free listing
  • Domain Evolution
  • Coding
  • Tabulation
  • Summarizing
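
As a toy illustration of the coding, tabulation, and summarizing steps above, the following sketch tabulates free-listed responses that have already been mapped to analyst-assigned codes. The response data and code labels here are invented for illustration only.

```python
from collections import Counter

# Hypothetical coded responses: each inner list is one respondent's
# free-listed answers after the coding step has mapped them to codes.
coded_responses = [
    ["distance", "cost"],
    ["cost", "awareness"],
    ["distance", "awareness", "cost"],
]

# Tabulation: count how often each code appears across all respondents.
tabulation = Counter(code for resp in coded_responses for code in resp)

# Summarizing: report codes in descending order of frequency.
for code, freq in tabulation.most_common():
    print(f"{code}: {freq}")
```

In practice the coding step itself is the skilled, labor-intensive part; the tabulation and summary are mechanical once codes are assigned.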

14
Quotable Quotes
  • Give a vivid, meaningful flavor which is far
    more convincing than pages of summarized
    numbers
  • - These should not be distractors
  • - Should not take the reader away from the
    real issues at hand

15
  • This is an unprecedented event where all people
    irrespective of caste, creed and religion take
    part in PPI program on the same day (NID)
    throughout the country
  • Health worker (150) Burdwan
  • He (my husband) told me that everybody is going
    for polio drops. Then why should we be left out ?
    After all, everybody is not a fool
  • Utilizer (1422) Delhi
  • We have not at all immunized our son. My husband
    was very stubborn. He said those who are
    immunized are also getting this disease (polio)
    and whatever happens let it happen. He has not
    allowed me to get the child immunized
  • Non utilizer (630) Hyderabad

16
Data Display
  • This is an organized, compressed assembly of
    information that permits conclusion drawing and
    action
  • Matrices, graphs, charts and networks are used

17
Qualitative Vs Quantitative Research - the
dichotomy
18
Qualitative Vs. Quantitative
  • It is not qualitative Vs. quantitative but
    qualitative and quantitative
  • Qualitative methods are rapid, exploratory and
    hypothesis generating
  • Can be used for impact evaluation research
  • Allow the researcher to palpate the unique
    cross-cultural features

19
Multi-Centric Evaluation Studies
  • Issues
  • Common understanding of the program
  • Common understanding of aims and
    objectives of evaluation exercise
  • Standardization of research instruments
  • Standardization of protocol implementation at
    various sites
  • Regional variation in program implementation

20
Multi-Centric Evaluation Studies Contd
  • Steps
  • - Cohesive network of partners
  • - Multi-disciplinary team of investigators
  • - Piloting of instruments at different sites
  • - Development of common understanding
  • - Training of research teams
  • - Multiple layers of quality assurance measures
21
Network Dynamics
  • Sustaining the Network
  • During active period
  • Recognized individual efforts to excel
    - Co-opted as extended CCT members
  • Recognized PMC efforts to excel
  • - Made regional centers
  • Communication Channels (phone, fax, e-mail)
  • Pilot of instruments (ownership)
  • During interface
  • IndiaCLEN / INCLEN meetings, workshops
  • Feedback on completed reports

22
Network Dynamics Contd
  • Quality Assurance Measures
  • National orientation workshop (PIs)
  • Regional orientation workshops (PIs, RAs)
  • PMCs, Regional Centers
  • Quality checks of interviews, schedules, tapes
  • Central Coordinating Office
  • Random checks of recordings, interviewing
    procedure, transcripts, translations
  • Regional Coordinators, CCT members
  • Site visits / Supervision of FGDs
  • Method triangulation using
  • In-depth interviews, FGDs

23
Quality Assurance Measures
  • Development of Interview Schedules
  • Consistency Checks / VALIDITY
  • Method Triangulation
  • Data Triangulation
  • Data Interpretation
  • Partner Medical Colleges
  • Regional CEUs
  • Central Coordinating Office

24
Capacity Building
  • National Level
  • Leadership transfer to coordinate MI project
  • - Thiruvananthapuram
  • State Level
  • PMCs upgraded to Regional Centers
  • Upgrading of physical facilities
  • Ten investigators INDEPENDENTLY took up
    evaluation of national programs at
    state/district level
  • Network Partners
  • Research - individual/collaborative
  • Resource persons - local/regional/national
  • Extended CCT members

25
Interaction with Program Managers / Policy Makers
  • Program Evaluation A partnership exercise
  • Developing objectives, instruments
  • Dissemination of Findings
  • Support other program related activities

26
Dynamics of Establishing Partnerships with Policy
Makers
  • Evaluators express their opinions explicitly
  • - based on evidence gathered
  • - consistent quality assurance measures
  • - limitations of study accepted up front
  • - politics of evaluation - remains a scientific
    endeavor
27
Dynamics of Establishing Partnerships with Policy
Makers
  • Results are to be presented in a manner
    perceived as VALID, RELIABLE, and
    FEASIBLE TO IMPLEMENT
  • Working in a strict TIME SCHEDULE for timely
    fine-tuning of strategies
  • Program Evaluators have to establish CREDIBILITY
    with Program Managers

28
How can Evaluation Data be Used?
  • Program managers
  • Redefining aims and objectives
  • Modifying or fine tuning strategies (process)
  • Sustainability (including fatigue factor)
  • Judge the worth (impact)
  • Expense / cost
  • Interaction with other activities

29
How can Evaluation Data be Used?
  • Education
  • Generalisability
  • Unique features (success/failure)
  • Determinants of provider and client behavior

30
IndiaCLEN Program Evaluation Network Activities
31
Studies Completed (1997-2001)
Title Year Funding Source
Pulse Polio Immunization Program (PPI-1) 1997- 98 USAID/INCLEN
Pulse Polio Immunization Program (PPI-2) 1997- 98 USAID/INCLEN
Intensive Pulse Polio Immunization Program (IPPI-3) 2000 USAID/INCLEN
Family Health Awareness Campaign (FHAC-1) 1999 USAID/INCLEN
Barriers in Polio Eradication (Moradabad, UP) 2000 WHO/SEARO
Family Health Awareness Campaign (FHAC-2) 2000 (Coverage Process) USAID/INCLEN
32
Forthcoming Studies
Title Year Funding Source
Integrated Disease Surveillance Program Barriers in Surveillance Activities in three States (pilot-FGDs) 2001 World Bank
Vitamin-A and Iron Folate Supplementation Program 2001- 02 MI/IDRC
Safe Injection Practices (Coverage Process) 2001- 02 World Bank
Evaluation of AFP Surveillance 2002 USAID/INCLEN
Family Health Awareness Campaign (Coverage) 2002 USAID/INCLEN
33
[Map of network sites: Delhi, Lucknow, Nagpur, Chennai, Vellore, Thiruvananthapuram]
34
(No Transcript)
35
(No Transcript)
36
Agenda Item No.14- Conduction of Family Health
Awareness Campaign A brief (15 minutes)
presentation on evaluation of FHAC round 2000 was
made by Dr. N.K. Arora, IndiaCLEN, AIIMS. Addl.
Secretary Project Director (NACO) said that the
shortcomings observed in evaluation of the
campaign should be taken into consideration while
preparing action plans for the next round of FHAC
in the year 2001. After discussion (one hour 15
minutes) with the State Project Directors, it was
decided that
Letter No.T.11014/2/2001-NACO dated 05.07.2001
37
IndiaCLEN Program Evaluation Network
  • VISION
  • Facilitate development
  • and implementation of
  • people-friendly, effective

38
IndiaCLEN Program Evaluation Network Investigators
CCT Members N. K. Arora M. Lakshman Kiran Goswami Sneh Rewal R.M. Pandey K. Anand K.K. Ganguly Naveet Wig Leena Sinha S. Vivek Adhish N. Chaudhuri H.K. Kumbnani Thomas Mathew Sandip K. Ray S.L. Chadha Rema Devi K.C. Malhotra R. Sankar Sunita Shanbhag Ballabhgarh S.K. Kapoor Praveen Kumar M.K. Taneja R.C. Agarwal Rohtak A.D. Tiwari B.K. Nagla Mohinder Singh Kangra T.D. Sharma Swaran Lata K.L. Ghai G.L. Jaryal Lucknow R.C. Ahuja Vikas Chandra J.V. Singh A.K. Srivastava Kanpur V.N. Tripathi Joginder Singh R.P. Singh Agra Deoki Nandan S.K. Mishra S.P. Agnihotri Dibrugarh Faruque Ahmed Swapna D. Kakoty Mayashree Konwar Bijit Bhattacharya Mir Shahadat Ali Pranab Jyoti Bhuyan Guwahati Chiranjeeb Kakoty Sajjad Ahmed Alaka Bhattacharyya Imphal E. Yaima Singh T.H. Achouba Singh R.K. Narendra Tiasunup Pongener Umatula Agartala Partha Bhattacharjea Aizwal L. Lalhrekima Jodhpur S.L. Solanki Suman Bhansali Afzal Hakim Y.R. Joshi Kota Raghuveer Singh Gopal Bunkar Hans Raj
39
Jaipur Anurag Sarna Rajesh Jain Hemant Jain Bhopal Sheela S. Bhambal A.K. Upadhyaya R.K.S. Kushwaha U.K. Dubey Bilaspur Vijay Tiwari D.N. Mishra Ajay Gurudiwan Ashok Tiwari Gwalior A.G. Shingeweker A. Shingeweker Berhampur B.C. Das D.M. Satapathy G.S. Patnaik T. Sahu Cuttack S.C. Jena S.K. Sahu K. Misra Sambhalpur O.P. Panigarhi H.P. Acharya S.C. Panda Nagpur A.K. Niswade Sanjay Zodpey Sanjay Deshpande Suresh Ughade Prashant Langewar Mumbai Alka Jadhav Nagaonkar Nitin Deshpande Shubhangi Upadhye Chitra Nayak Vijayawada S. Narasimha Reddy T.S.S. Manidhar A. Rama Prasad C. Usha Rani Hyderabad B.V.N. B. Rao C. Bala Krishna J. Ravi Kumar Tirupati K. Raghava Prasad N.A. Chetty G. Raviprabhu Gulbarga B. Mallikarjun R.R. Rampure B.N. Patil Shreeshail Ghooli Bangalore Shivananda Gopal Premalata Bangalore A.S. Mohammad Lalita Bhatti R.M. Christopher Chennai R. Sathianathan A. Vengatesan R.K. Padmanaban S. Karthikeyan Vellore Kurien Thomas O.C. Abraham Mary Kurien Madurai C. Kamaraj M. Eswaran T. Rajagopal Thiruvananthapuram M. Narendranathan P.S. Indu J. Padmamohan S.M. Nair Kannur Jeesha C. Haran T.P. Mubarack Sani M. Jayakumary Calicut M. Ramla Beegum C.R. Saju N.M. Sebastien
40
Program Evaluations - Relevance to Policy
  • ACADEMIA can play an important role in
    influencing the National Policy
  • - multi-disciplinary teams
  • Evaluations are not done in VACUUM, should be
    Policy Relevant
  • - central, state, district level
  • RECOGNIZE Policy Makers and Other Stakeholders
    as partners

41
Models of Program Evaluation
  • Goal oriented evaluation
  • Aimed to assess the progress and the
    effectiveness of innovations/ interventions.
  • Decision oriented evaluation
  • Aimed to facilitate intelligent judgements by
    decision makers.
  • Responsive evaluation
  • Aimed to depict program process and the value
    perspectives of key people.
  • Evaluation Research
  • Focused on explaining effects, identifying
    causes of effects, and generating
    generalizations about program effectiveness.
  • Goal free evaluation
  • To assess program effects based on criteria
    apart from the program's own conceptual
    framework, especially on the extent to which real
    client needs are met.
  • Advocacy - adversary evaluation
  • Evaluation should derive from the argumentation
    of contrasting points of view.
  • Utilization - oriented evaluation
  • Structured to maximize the utilization of its
    findings by specific stakeholders and users.

42
Design Effect
  • Ratio of variance with cluster sampling to
    variance with simple random sampling
  • Var (simple random sampling) = p(1 - p) / n
  • Var (cluster sampling) = Σ(pi - p)² / k(k - 1)
  • Design effect = [Σ(pi - p)² / k(k - 1)] ×
    [n / p(1 - p)]
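
The ratio above can be sketched directly in code, assuming a binary outcome with one observed proportion per cluster and a total sample size n. The function name and the sample figures in the usage line are illustrative, not from the slides.

```python
def design_effect(cluster_props, n):
    # cluster_props: observed proportion pi in each of k clusters.
    # n: total sample size across all clusters.
    k = len(cluster_props)
    p = sum(cluster_props) / k  # overall proportion (equal-size clusters)
    # Var (cluster sampling) = sum of (pi - p)^2 over k(k - 1)
    var_cluster = sum((pi - p) ** 2 for pi in cluster_props) / (k * (k - 1))
    # Var (simple random sampling) = p(1 - p) / n
    var_srs = p * (1 - p) / n
    return var_cluster / var_srs

# Illustrative: 4 clusters with coverage 80%, 60%, 70%, 90% and n = 200.
print(design_effect([0.8, 0.6, 0.7, 0.9], 200))
```

A design effect above 1 indicates that clustering inflates the variance relative to simple random sampling, so the required sample size should be multiplied by this factor.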

43
Health Program Evaluation - Quantitative Research
Methods
  • Approach
  • - Measures the reaction of a great many
    people to a limited set of questions
  • - Comparison and statistical aggregation of
    the data
  • - Broad, generalizable set of findings
    presented succinctly and parsimoniously.

44
Summary
  • Qualitative methods aim to make sense of, or
    interpret, phenomena in terms of the meanings
    people bring to them
  • Qualitative research may define preliminary
    questions which can then be addressed in
    quantitative studies
  • A good qualitative study will address a clinical
    problem through a clearly formulated question,
    using more than one research method
    (triangulation)
  • Analysis of qualitative data can and should be
    done using explicit, systematic, and
    reproducible methods
45
Development of Program Objectives (Program
Evaluators)
  • Lessons of success and failure
  • Wider application of program strategies
  • Determinants of client behavior
  • Impact on other health systems
  • National and international interest in the
    later part of the program