Logic Models and Evaluation - PowerPoint PPT Presentation

About This Presentation
Title: Logic Models and Evaluation

Description: University of Wisconsin-Extension, Program Development and Evaluation


Transcript and Presenter's Notes



1
  • Logic Models and Evaluation

2
Check for Validity of Model
  • What are the assumptions for the model?
  • Are the majority of efforts and resources
    reflected in the model? Is anything missing?
  • Is it feasible?
  • Is the program logic sound? Will the parts add up
    to the whole? Are there gaps?
  • Is it meaningful?

3
Programs fail for three reasons (Rossi, 1985,
quoted in Shadish, Cook, and Leviton, 1991)
  • Theory of change failure: The program
    conceptualization and design are not capable of
    addressing the problem or generating the desired
    outcomes, no matter how well implemented.
  • Translation failure: The theory is sound, but its
    translation into practice or action is not.
  • Implementation failure: The program is not
    properly implemented.

4
Logic Models Guide Evaluation
  • Provide the program description that guides our
    evaluation process
  • Help us match the evaluation to the program
  • Help us know what and when to measure
  • Are you interested in process and/or outcomes?
    Formative or summative evaluation?
  • Help us focus on key, important information
  • Help us prioritize: where will we spend limited
    evaluation resources? What do we really need to
    know?

5
Logic model in evaluation
  • PLANNING: Start with the end in mind. What do you
    want to know? How will you know it?
  • EVALUATION: Check and verify.
6
Evaluation Considerations
  • Stage of program development influences the
    starting point, types of evaluation questions, and
    feasibility
  • What you want to know and can reliably measure
    must be appropriate for the program's phase of
    development

7
Logic model and common types of evaluation
Types of evaluation
  • Needs/asset assessment: What are the
    characteristics, needs, and priorities of the
    target population? What are potential barriers and
    facilitators? What is most appropriate to do?
  • Process evaluation: How is the program
    implemented? Are activities delivered as intended?
    With fidelity of implementation? Are participants
    being reached as intended? What are participant
    reactions?
  • Outcome evaluation: To what extent are desired
    changes occurring? Are goals met? Who is
    benefiting or not benefiting, and how? What seems
    to work? Not work? What are unintended outcomes?
  • Impact evaluation: To what extent can changes be
    attributed to the program? What are the net
    effects? What are the final consequences? Is the
    program worth the resources it costs?
8
Match evaluation questions to the program
Logic model components: INPUTS (program investments),
ACTIVITIES (activities, participation), OUTCOMES
(short-, medium-, and long-term)
  • Evaluation questions: What questions do you want
    to answer? E.g., accomplishments at each step,
    expected causal links, unintended consequences or
    chains of events set into motion
  • Indicators: What evidence do you need to answer
    your questions?
9
What do you (and others) want to know about the
program? Parent education example:
  • INPUTS: Staff, money, partners, research
  • ACTIVITIES: Assess parent education programs;
    design and deliver an evidence-based program of 8
    sessions; facilitate support groups
  • PARTICIPATION: Parents of 3-10 year olds attend
  • SHORT-TERM OUTCOMES: Parents increase knowledge of
    child development; better understand their own
    parenting style; gain skills in new ways to
    parent; gain confidence in their abilities;
    identify appropriate actions to take
  • MEDIUM-TERM OUTCOMES: Parents use effective
    parenting practices
  • LONG-TERM OUTCOMES: Reduced stress; improved
    child-parent relations; strong families
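The inputs-to-outcomes chain above can be made concrete as a small data
structure. Below is a minimal, illustrative Python sketch; the variable
name, field names, and tier groupings are assumptions drawn from this
slide.

# Minimal sketch: the parent education logic model as a plain dictionary,
# so each component can later be paired with questions and indicators.
parent_ed_logic_model = {
    "inputs": ["staff", "money", "partners", "research"],
    "activities": [
        "assess parent education programs",
        "design and deliver evidence-based program of 8 sessions",
        "facilitate support groups",
    ],
    "participation": ["parents of 3-10 year olds attend"],
    "outcomes": {
        "short_term": [
            "parents increase knowledge of child development",
            "parents better understand their own parenting style",
            "parents gain skills in new ways to parent",
            "parents gain confidence in their abilities",
            "parents identify appropriate actions to take",
        ],
        "medium_term": ["parents use effective parenting practices"],
        "long_term": [
            "reduced stress",
            "improved child-parent relations",
            "strong families",
        ],
    },
}

# Walking the outcome tiers in reverse mirrors "start with the end in mind".
for tier, outcomes in reversed(list(parent_ed_logic_model["outcomes"].items())):
    print(tier, "->", "; ".join(outcomes))

Keeping the model in a structured form like this makes it easier to attach
evaluation questions and indicators to each component during planning.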
10
Possible evaluation questions
11
Identify indicators
  • How will you know it when you see it?
  • What will be the evidence?
  • What are the specific indicators that will be
    measured?
  • Often expressed as numbers (#) and percents (%)
  • Can have qualitative indicators as well as
    quantitative indicators

12
Parent Education Example: Evaluation questions and
indicators
Logic model (as above): inputs (staff, money,
partners, research); activities (develop parent ed
curriculum, deliver series of 8 interactive sessions,
facilitate support groups); participation (parents of
3-10 year olds); outcomes (increased knowledge and
skills, changed parenting practices, reduced stress,
improved child-parent relations, strong families).
EVALUATION QUESTIONS and INDICATORS
  • Inputs: What amount of money and time were
    invested? Indicators: staff time and funds used;
    number of partners involved
  • Activities: How many sessions were held? How
    effectively? Number and quality of support groups?
    Indicators: number of sessions held; quality
    criteria met
  • Participation: Who and how many attended or did
    not attend? Did they attend all sessions? Support
    groups? Were they satisfied? Why or why not?
    Indicators: number and percent attending per
    session; certificates of completion
  • Short-term outcomes: To what extent did knowledge
    and skills increase? For whom? Why? What else
    happened? Indicators: number and percent
    demonstrating increased knowledge/skills;
    additional outcomes
  • Medium-term outcomes: To what extent did behaviors
    change? For whom? Why? What else happened?
    Indicators: number and percent demonstrating
    changes; types of changes
  • Long-term outcomes: To what extent is stress
    reduced? To what extent are relations improved?
    Indicators: number and percent demonstrating
    improvements; types of improvements
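One way to operationalize the pairing above is to key both questions and
indicators by logic model component. This is an illustrative Python
sketch; the component names and wording are paraphrased assumptions from
this slide.

# Illustrative sketch: an evaluation plan keyed by logic model component.
evaluation_plan = {
    "inputs": {
        "questions": ["What amount of money and time were invested?"],
        "indicators": ["staff time and funds used", "# of partners"],
    },
    "activities": {
        "questions": ["How many sessions were held? How effectively?"],
        "indicators": ["# of sessions held", "quality criteria met"],
    },
    "participation": {
        "questions": ["Who and how many attended? All sessions? Satisfied?"],
        "indicators": ["#/% attending per session", "certificates of completion"],
    },
    "short_term_outcomes": {
        "questions": ["To what extent did knowledge and skills increase?"],
        "indicators": ["# and % demonstrating increased knowledge/skills"],
    },
    "medium_term_outcomes": {
        "questions": ["To what extent did behaviors change? For whom? Why?"],
        "indicators": ["# and % demonstrating changes", "types of changes"],
    },
    "long_term_outcomes": {
        "questions": ["To what extent are stress reduced and relations improved?"],
        "indicators": ["# and % demonstrating improvements"],
    },
}

# Print the plan as a simple checklist for data collection.
for component, plan in evaluation_plan.items():
    print(component)
    for q in plan["questions"]:
        print("  Q:", q)
    for i in plan["indicators"]:
        print("  I:", i)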
13
Logic model with indicators for Outputs and Outcomes
OUTPUTS
  • Program implemented. Indicators: number of
    workshops held; quality of workshops
  • Targeted educators attend. Indicators: number and
    percent of educators attending
OUTCOMES
  • Educators learn. Indicators: number and percent
    who increase knowledge; number and percent
    reporting increased learning, and amount of
    increase
  • Educators apply new techniques. Indicators: number
    and percent who practice new techniques
  • Student skills increase
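Indicators of the "number and percent" form reduce to simple counts over
participant records. A minimal Python sketch, assuming hypothetical yes/no
fields and invented sample data:

# Minimal sketch: compute "number and percent" indicators for the
# educator-workshop example from hypothetical participant records.
records = [
    {"attended": True,  "increased_knowledge": True,  "practices_new_techniques": True},
    {"attended": True,  "increased_knowledge": True,  "practices_new_techniques": False},
    {"attended": True,  "increased_knowledge": False, "practices_new_techniques": False},
    {"attended": False, "increased_knowledge": False, "practices_new_techniques": False},
]

def number_and_percent(records, field):
    """Return (count, percent of all records) for a yes/no indicator field."""
    count = sum(1 for r in records if r[field])
    percent = 100.0 * count / len(records) if records else 0.0
    return count, percent

for field in ("attended", "increased_knowledge", "practices_new_techniques"):
    n, pct = number_and_percent(records, field)
    print(f"{field}: {n} ({pct:.0f}%)")
# With the sample data above: attended 3 (75%), increased_knowledge 2 (50%),
# practices_new_techniques 1 (25%).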
14
Typical Output Indicators
  • Amount of products and services delivered
  • Number and type of customers/clients served
  • Timeliness of service provision
  • Accessibility and convenience of service
  • Location, hours of operation, staff availability
  • Accuracy, adequacy, relevance of assistance
  • Customer satisfaction

E.g., # of clients served, # of consultations, # of
workshops held, # of attendees, quality of service
15
Typical Process Indicators
  • Fidelity of implementation
  • Level of program support
  • Timelines adhered to
  • Level of preparedness

E.g., % implementing with fidelity, % indicating
access to resources and support, % of services
delivered in a timely manner, % indicating they are
prepared for practice/application
16
Typical Outcome Indicators
  • Extent of change in learning, interest,
    attitudes, motivation
  • Extent of change in behavior, practice, policy,
    decision making

E.g., % increase in awareness, % increase in
learning, # enrolling in STEM courses, # pursuing
STEM careers
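An outcome indicator such as "% increase in learning" is typically
computed from pre/post scores. A minimal Python sketch with invented
sample scores:

# Illustrative sketch: percent increase in learning from hypothetical
# pre/post knowledge scores, plus the # and % of participants who improved.
pre_scores  = [10, 12, 15, 9, 14]
post_scores = [14, 15, 15, 13, 18]

pairs = list(zip(pre_scores, post_scores))
improved = [post > pre for pre, post in pairs]

avg_pre = sum(pre_scores) / len(pre_scores)
avg_post = sum(post_scores) / len(post_scores)
percent_increase = 100.0 * (avg_post - avg_pre) / avg_pre

print(f"Average score: {avg_pre:.1f} -> {avg_post:.1f} "
      f"({percent_increase:.0f}% increase in learning)")
print(f"Participants improving: {sum(improved)} of {len(pairs)} "
      f"({100.0 * sum(improved) / len(pairs):.0f}%)")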
17
Hierarchy of effects
[Figure: hierarchy of program effects, culminating in
social improvements]
Source: Bennett and Rockwell, 1995, Targeting
Outcomes of Programs
18
Choosing a Design
  • At what stage is the program?
  • What should be measured?
  • What are your evaluation questions?
  • What are the needs of the prospective users and
    stakeholders of the evaluation?
  • What is feasible given the program, expected
    participants, and the funds available for
    evaluation?

19
Methods of data collection
  • SOURCES OF INFORMATION
  • Existing data: program records, attendance logs,
    program materials, etc.
  • Program participants
  • Others: key informants, nonparticipants,
    proponents, critics, staff, collaborators,
    funders, etc.
  • DATA COLLECTION METHODS
  • Survey
  • Interview
  • Knowledge and skills (K&S) assessment
  • Observation
  • Focus group
  • Case study
  • Document review
  • Expert or peer review
