Title: Outline
1. Outline: Part I
- Definitions
- Pseudo-evaluation vs. legitimate evaluation
- Formative vs. summative evaluation
- Necessary skills
- Planning an evaluation
- Planning a Formative evaluation
- Planning a Summative evaluation
- Cost-benefit analysis
- An experimenting society
- Words of warning
2. Definition
- "Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object." -- William Trochim (Cornell University)
3. Pseudo-evaluation vs. legitimate evaluation
- Pseudo-evaluations
- Legitimate evaluations
- Evaluation usually occurs in a political context.
- Be careful not to engage in the first type (pseudo-evaluation)!
- Doing so may facilitate inappropriate decisions
- It will also damage your professional reputation
4. Pseudo-evaluations: a taxonomy
- Here are some procedures to watch out for (E. A. Suchman, 1967)
- Eyewash: emphasis on surface appearances
- Whitewash: attempts to cover up known failures
5. Pseudo-evaluations: a taxonomy
- Submarine: political use of evaluation to destroy a programme
- Posture: ritualistic evaluation to satisfy a funding requirement, without real interest in, or intention to use, its findings
- Postponement: using the need for evaluation to delay action
6. Legitimate Evaluations: Four Criteria
- Here are four criteria to help you recognize a legitimate evaluation:
- Utility
- Feasibility
- Propriety
- Technical adequacy
7. Four criteria for legitimate evaluations
- 1. Utility: will someone be able to use it?
- As Robson says, the purpose of an evaluation is "not to prove, but to improve" (2002, p. 209).
8. Four criteria for legitimate evaluations
- 2. Feasibility: will you have the resources, time, and co-operation you need? If not, don't do the evaluation.
- It won't achieve anything useful
- It may damage your professional reputation
- Feasibility is especially an issue in formative evaluation, where results may be needed for program planning.
- Remember the engineer's maxim:
- Good, fast, cheap. Pick any two.
9. Four criteria for legitimate evaluations
- 3. Propriety: only do an evaluation if you can do it fairly and ethically.
- No submarines
- Acceptable outcome measures
- Say no if you believe the course of action has already been decided on, and a decision maker just wants cover.
10. Four criteria for legitimate evaluations
- 4. Technical adequacy: if you are satisfied on the first three issues, carry out the evaluation with technical skill and sensitivity.
- How can you tell whether you have the technical skill?
- What do you have to think about in planning?
- What are the relevant skills?
- We'll consider these issues below
11. What to think about in planning
- Reasons for evaluating
- Why is the evaluation being done?
- Who should have access to the information obtained?
- What value will results have?
- Will action be taken?
- Will someone not want results published?
12. What to think about in planning
- Interpretation
- Is the nature of the evaluation agreed upon by those involved?
- Outcome measures
- What type of change is good, or bad?
13. What to think about in planning
- Subject
- What kind of information do you need?
- Evaluators
- Who will gather the information?
- Who will analyze the data and write the report?
14. What to think about in planning
- Methods
- What method is appropriate given the questions?
- Can you develop your method in the time allowed?
- Is your method acceptable to those involved?
(Service providers and consumers.)
15. What to think about in planning
- Time
- What time is available? Is this sufficient?
- Permissions and control
- Necessary permissions obtained?
- Is participation voluntary?
- Who decides what goes into the report?
16. What to think about in planning
- Use
- Who decides how the evaluation will be used?
- Will those involved (providers, consumers) see the report in a modifiable draft version?
- Is the form of the report appropriate for the intended audience (style, length, statistics)?
17. An evaluation culture
- These ideas are based on Donald Campbell's (1969) concept of an "experimenting society", and Trochim's related concept of an "evaluation culture"
- To learn more about Trochim's ideas, see http://www.socialresearchmethods.net/kb/evalcult.php
18. An evaluation culture
- An evaluation culture works and improves because it is:
- Action-oriented
- Teaching-oriented
- Diverse, inclusive, participatory, responsive, and fundamentally non-hierarchical
- Humble, self-critical
19. An evaluation culture
- An evaluation culture works and improves because it is:
- Interdisciplinary
- Truth-seeking, forward-looking
- Ethical and democratic
20. Words of warning
- Keep it simple
- Avoid complex designs and data analysis
- Think defensively
- Anything that can go wrong, will go wrong.
- Try to anticipate potential problems and plan
how you will deal with them.
21. Words of warning
- Change will always have sponsors and critics.
- People's lives may be radically changed on the basis of your findings:
- Jobs may be on the line
- Careers may be advanced or slowed
- A program may be expanded or cut back
22. Words of warning
- There will be many stakeholders: politicians, administrators, deliverers, targets, unions, taxpayers.
- It is unlikely that the interests of all these groups will coincide.
23. Outline: Part II
- Formative and Summative evaluation defined
- Elements of a Formative evaluation
- Elements of a Summative evaluation
- Evaluation strategies
- Scientific-Experimental Paradigms
- Management-oriented systems models
- Qualitative-Anthropological models
- Participant-oriented models
- Necessary Skills
24. Two Types of Evaluation
- Formative evaluation
- Helps in the development of a program or service.
- Summative evaluation
- Assesses the effects and effectiveness of the program
- Covers all effects, not just those intended
25. Formative Evaluation: Elements
- Questions about the process being evaluated
- Structured conceptualization
- Logic model
- Process evaluation
- Implementation evaluation
26. Formative Evaluation: Elements
- 1. Structured conceptualization: helps stakeholders define the program, targets, and desired outcomes.
- Stakeholders: who are they?
- Outcomes: how do you plan to measure them?
27. Formative Evaluation: Elements
- 2. A logic model makes explicit the steps that are expected to produce the desired change. It is often shown as a flow chart or map (a minimal sketch follows below).
- A good logic model may reveal hidden assumptions about how the intervention will work.
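To make the idea concrete, here is a minimal, hypothetical sketch in Python of a logic model represented as an ordered chain from inputs to activities, outputs, and outcomes. The tutoring program and every entry in it are invented for illustration; a real logic model would be built with the stakeholders.

```python
# A minimal, hypothetical sketch of a logic model as a plain data structure.
# The program ("after-school tutoring") and all entries are invented examples.

from collections import OrderedDict

logic_model = OrderedDict([
    ("inputs",     ["funding", "tutors", "classroom space"]),
    ("activities", ["recruit tutors", "run weekly tutoring sessions"]),
    ("outputs",    ["number of sessions delivered", "number of students attending"]),
    ("outcomes",   ["improved homework completion", "higher test scores"]),
])

# Printing the chain makes the assumed causal steps (and hidden assumptions,
# e.g. "students who attend actually do the exercises") easier to question.
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```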
28. Formative Evaluation: Elements
- 3. Process evaluation: What alternative procedures are available for delivery of the program?
- 4. Implementation evaluation: Is the program being delivered the way it is supposed to be? Are there unexpected consequences?
29. Summative Evaluation
- Outcome evaluation
- Did the program cause demonstrable effects on predefined outcome measures?
- Impact evaluation
- Broader: assesses the overall effects, intended and unintended, of a program
30. Summative Evaluation
- Cost-benefit analysis
- Questions about efficiency
- Standardizes outcomes in terms of dollar costs and dollar benefits (see the sketch below)
- Important when you have to choose how to spend limited amounts of money
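As an illustration of what standardizing outcomes in dollars looks like arithmetically, here is a sketch comparing two invented programs by net benefit and benefit-cost ratio. All program names and figures are assumptions made up for the example, not data from the lecture.

```python
# Hypothetical illustration of standardizing outcomes in dollars.
# The programs and all figures are invented for the example.

programs = {
    "Program A": {"cost": 200_000, "benefit": 260_000},
    "Program B": {"cost": 200_000, "benefit": 310_000},
}

for name, p in programs.items():
    net_benefit = p["benefit"] - p["cost"]   # dollars gained after paying the cost
    bc_ratio = p["benefit"] / p["cost"]      # benefit per dollar spent
    print(f"{name}: net benefit = ${net_benefit:,}, benefit-cost ratio = {bc_ratio:.2f}")

# With a limited budget, this comparison points toward spending the money on Program B.
```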
31. Cost-Benefit Analysis
- To do cost-benefit analysis you need to know (in addition to program cost):
- (a) the magnitude of the benefits a program produces, and
- (b) that the program produced these benefits.
- These things can only be learned through an experimental design.
32. Cost-Benefit Analysis
- Some issues to consider before you do CBA
- Opportunity cost
- Present value of money
- Fairness
- Complexity
33. CBA and Opportunity Cost
- CBA expresses values in dollars.
- This reveals opportunity cost: if you do X with your money, you cannot do Y with the same money.
- Some values are difficult to express in dollars. E.g., what is the value of having mail delivery in rural areas?
- How do you express non-market values in dollars?
34. CBA: Present Value
- CBA works with the Present Value (PV) of money.
- Future outcomes are uncertain.
- Inflation alters the value of money: e.g., the PV of $1m received in 50 years, at 5% inflation, is about $87,000.
35. CBA: Present Value
- $100 of benefit today is worth more in Present Value than $100 of benefit 5 years from now (see the sketch below).
- This makes sense, but it biases program evaluation away from long-term outcomes.
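A short sketch of the discounting arithmetic behind these Present Value figures, assuming the standard formula PV = FV / (1 + r)^n with annual compounding at 5%; the function name present_value is illustrative, not from the lecture.

```python
# Present Value sketch using the standard discounting formula
# PV = FV / (1 + r)**n, assuming annual compounding at rate r.

def present_value(future_value: float, rate: float, years: int) -> float:
    """Discount a future amount back to today's dollars."""
    return future_value / (1 + rate) ** years

# The slide's example: $1,000,000 received 50 years from now at 5% per year.
print(round(present_value(1_000_000, 0.05, 50)))   # ~87,204, i.e. roughly $87,000

# $100 of benefit today vs. $100 of benefit 5 years from now.
print(round(present_value(100, 0.05, 5), 2))       # ~78.35, less than 100
```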
36. CBA: Fairness
- CBA compares benefits and costs without regard to who benefits and who pays the costs. Is that fair? Is it unavoidable?
- For example, people who live in the city subsidize mail delivery to people who live in the country. Is that fair? CBA doesn't answer that question.
37. CBA: Complexity
- Most social problems, and many problems in the private sector, are complex.
- They have many interacting causes, so establishing cause may be difficult.
- Any program is likely to make only a small difference.
- But it still makes sense to quantify the value of a program, to see if we could spend our money to better effect.
38. What are the relevant skills? (Robson, 2002)
- Writing a proposal
- Clarifying purposes of an evaluation
- Identifying, organizing, and working with an evaluation team
- Choosing design and data-collection techniques
- Interviewing
- Questionnaire construction and use
39. What are the relevant skills?
- Observation
- Management of complex information systems
- Data analysis
- Report-writing
- Encouraging people to use the findings
- Sensitivity to political concerns