1
Health Program Planning and Evaluation: Overview and Feasibility
  • CHSC 433
  • Module 1/Chapter 1
  • UIC School of Public Health
  • L. Michele Issel, PhD, RN

2
Learning Objectives: What you ought to be able to
do by the end of this module
  1. Articulate key factors that constitute the
    context in which health programs are developed
    and evaluated.
  2. Describe the historical background of program
    evaluation.
  3. Appreciate evaluation as an aid to program
    decision making.
  4. Describe the major phases of the Planning and
    Evaluation Cycle.
  5. Explain the relationship of the Pyramid to the
    Planning and Evaluation Cycle.

3
Outline of this presentation
  • Terminology
  • Evaluation definitions
  • History and types of evaluation
  • Why evaluate
  • When not to evaluate
  • Standards and principles of evaluation
  • Public health pyramid

4
Terminology (Write your definitions; some are in the
text)
  • Aggregate
  • Ecological model
  • Enabling services
  • Evaluation
  • Planning
  • Population
  • Population services
  • Program
  • Project
  • Research
  • Service

5
Evaluation is…
  • Systematic application of research procedures to
    assess the conceptualization, design,
    implementation, and utility of intervention
    programs (Rossi & Freeman)
  • Assessing the quality and effect of programs

6
Evaluation is… (continued)
  • Using research methods to measure effectiveness
    against a set of standards for the purpose of
    decision making (Carol Weiss)

7
History of Evaluation
  • Began in the field of education
  • Strengthened during the 1960s by the emphasis on
    social programs and on determining their effect
    on society
  • Further strengthened during the 1990s by the
    emphasis on outcomes measurement and quality
    improvement

8
Generations of Evaluations
  • 1900s: Technical, testing
  • To 1960: Descriptive, based on program objectives
  • 1960s: Judgement, merit, and value as focus
  • 1980 and beyond: Negotiation, responsive,
    pluralistic

9
Pluralist Perspectives Led to Diverse Types of
Evaluation
  • Utilization-focused
  • Goal-free
  • Theory-driven
  • Participatory
  • Outcome-focused
  • Value-based

10
Why Evaluate? Explicit Reasons
  • Monitor program implementation
  • Determine level of need
  • Make resource allocation decisions
  • Determine effect of local conditions
  • Improve quality of current efforts
  • Generate new knowledge

11
Why Evaluate? Implicit Reasons
  • Form of social and organizational control
  • As a symbolic action
  • Delaying and skewing decisions
  • Facilitate program termination
  • Incentive for planning

12
Typically, Evaluations Question
  • The need for the program
  • The program conceptualization
  • The program operations
  • The outcomes from the program
  • The cost and efficiency of the program

13
Questions Lead to Foci of Evaluations
  • Activities of the program
  • Inputs and resources used
  • Processes and interactions involved in doing or
    providing the program
  • Outputs as services
  • Impact as change in program participants
  • Outcomes as change in target population toward
    goal achievement
  • Efficiency as results per effort

14
Steps in Conducting Evaluations
  • 1. Assess feasibility of doing the evaluation
    (Evaluability Assessment)
  • 2. Develop evaluation questions
  • 3. Select evaluation design and methods
  • 4. Collect evaluation data
  • 5. Analyze evaluation data
  • 6. Disseminate findings

15
Deciding to Evaluate Depends on
  • Presence of contraindications
  • Context of evaluation
  • Feasibility

16
Contraindications to Evaluating
  • The program is very popular
  • The program is meaningless but has public support
  • Changes in the program would be expensive or
    dangerous
  • Too costly to do the evaluation
  • The program has no clear orientation

17
Contraindications (continued)
  • There are no questions about the program
  • People cannot agree on program objectives
  • Impossible to evaluate because of technical
    problems
  • Program sponsor or staff uncooperative or
    resistant

18
Context of Evaluations
  • Environments
  • Social norms and values
  • Attitudes toward evaluations and evaluators
  • Policies and politics
  • Program quality
  • Science
    • About evaluation
    • About the health problem, population, and
      intervention

19
Standards and Principles
  • The professional evaluation association has
    established both standards for high-quality
    evaluations and principles for conducting
    evaluations.
  • These are values that guide the practice of
    evaluation.

20
4 Standards for Evaluation
  • The utility standards are intended to ensure that
    an evaluation will serve the information needs of
    intended users.
  • The feasibility standards are intended to ensure
    that an evaluation will be realistic, prudent,
    diplomatic, and frugal.

21
Standards (continued)
  • The propriety standards are intended to ensure
    that an evaluation will be conducted legally,
    ethically, and with due regard for the welfare of
    those involved in the evaluation, as well as
    those affected by its results.

22
Standards (continued)
  • The accuracy standards are intended to ensure
    that an evaluation will reveal and convey
    technically adequate information about the
    features that determine worth or merit of the
    program being evaluated.

23
Principles for Evaluation
  • Guiding Principles for Evaluators: A Report from
    the AEA Task Force on Guiding Principles for
    Evaluators
  • http://www.eval.org/EvaluationDocuments/aeaprin6.html

24
Resulting Principles for Evaluation
  • Given the diversity of interests and employment
    settings represented on the Task Force, it is
    noteworthy that Task Force members reached
    substantial agreement about the following five
    principles. The order of these principles does
    not imply priority among them; priority will vary
    by situation and evaluator role.

25
Principle: Systematic Inquiry
  • Evaluators conduct systematic, data-based
    inquiries about whatever is being evaluated.

26
Principle: Competence
  • Evaluators provide competent performance to
    stakeholders.

27
Principle: Integrity/Honesty
  • Evaluators ensure the honesty and integrity of
    the entire evaluation process.

28
Principle: Respect for People
  • Evaluators respect the security, dignity and
    self-worth of the respondents, program
    participants, clients, and other stakeholders
    with whom they interact.

29
Principle: Responsibilities for General and
Public Welfare
  • Evaluators articulate and take into account the
    diversity of interests and values that may be
    related to the general and public welfare.

30
Good Enough Evaluation
  • A sufficient and acceptable evaluation
  • But the minimum necessary to provide answers to
    only the most important questions, using the
    least amount of effort
  • NOT the perfect, ideal evaluation

31
The Public Health Pyramid
32
Public Health Pyramid
  • Useful for:
    • Keeping a current project within the big
      picture.
    • Remembering to address aggregates and
      populations, not just individuals.
    • Framing analyses of contextual elements that
      may be supporting or hindering the program.