Monitoring? Evaluation? Impact Evaluation? Appreciating and Taking Advantage of the Differences


1
Monitoring? Evaluation? Impact Evaluation?
Appreciating and Taking Advantage of the
Differences
  • Workshop at the Cairo conference on Impact
    Evaluation
  • 29 March 2009

Burt Perrin, La Masque, 30770 Vissec, FRANCE
Burt_at_BurtPerrin.com / +33 4 67 81 50 11
2
Alternative title
  • Putting the "and" back in M&E

3
Plan for the workshop
  • Participative approach: small group exercises,
    your real-world examples, general discussion
  • Consider differences between monitoring and
    evaluation
  • Strengths and limitations of each
  • Use and misuse of performance indicators
  • How to use monitoring and evaluation approaches
    appropriately and in a complementary fashion
  • What is impact evaluation and where does it fit
    in?

4
What do we mean by Monitoring, and by Evaluation?
5
Monitoring: the concept and common definitions
  • Tracking progress in accordance with previously
    identified objectives, indicators, or targets
    (plan vs. reality)
  • RBM, performance measurement, performance
    indicators
  • In French: suivi (following up) vs. contrôle
    (checking/control)
  • Some other uses of the term
  • Any ongoing activity involving data collection
    on performance (usually internal, sometimes seen
    as self-evaluation)

6
Evaluation: some initial aspects
  • Systematic, data based
  • Often can use data from monitoring as one source
    of information
  • Can consider any aspect of a policy, programme,
    project
  • Major focus on assessing the impact of the
    intervention (i.e. attribution, cause)
  • E-valua-tion

7
Frequent status of M&E
8
Ideal situation: Monitoring and Evaluation
complementary
(diagram: Monitoring and Evaluation)
9
Monitoring and Evaluation
  • Evaluation
  • Generally episodic, often external
  • Can question the rationale and relevance of the
    program and its objectives
  • Can identify unintended as well as planned
    impacts and effects
  • Can address how and why questions
  • Can provide guidance for future directions
  • Can use data from different sources and from a
    wide variety of methods
  • Monitoring
  • Periodic, using data routinely gathered or
    readily obtainable, generally internal
  • Assumes appropriateness of programme, activities,
    objectives, indicators
  • Tracks progress against small number of targets/
    indicators (one at a time)
  • Usually quantitative
  • Cannot indicate causality
  • Difficult to use for impact assessment

10
MONITORING, EVALUATION AND IMPACT ASSESSMENT
Results chain
  • Inputs: investments (resources, staff) and
    activities
  • Outputs: products
  • Outcomes: immediate achievements of the project
  • Impact: long-term, sustainable changes
Monitoring: what has been invested, done and
produced, and how are we progressing towards the
achievement of the objectives?
Evaluation: what occurred and what has been
achieved as a result of the project?
Impact assessment: what long-term, sustainable
changes have been produced (e.g. the contribution
towards the elimination of child labour)?
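As an illustrative aside (not part of the original slides), the mapping above can be written down as a small Python sketch. The level descriptions restate the slide; the RESULTS_CHAIN and COVERAGE names, and the simplification of which function looks at which levels, are my own assumptions.

```python
# Illustrative only: the results-chain levels and their descriptions, as on the
# slide, plus a simplified view of which function typically asks about which level.
RESULTS_CHAIN = {
    "inputs":   "investments (resources, staff) and activities",
    "outputs":  "products",
    "outcomes": "immediate achievements of the project",
    "impact":   "long-term, sustainable changes",
}

COVERAGE = {
    "monitoring": ["inputs", "outputs"],      # what was invested, done, produced?
    "evaluation": ["outputs", "outcomes"],    # what was achieved as a result?
    "impact assessment": ["impact"],          # what long-term change was produced?
}

for function, levels in COVERAGE.items():
    print(f"{function}:")
    for level in levels:
        print(f"  {level}: {RESULTS_CHAIN[level]}")
```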
11
Evaluation vs. Research
  • Research
  • Primary objective: knowledge generation
  • Evaluation
  • Reference to a particular type of situation
  • Utilisation in some form is an essential component
  • But evaluation makes use of research
    methodologies

12
Monitoring data: quantitative only, or also
qualitative?
  • Some/most guidelines specify quantitative only
  • Some nominally allow qualitative information, but ...

(Blank quarterly reporting template: Indicator | Q1 | Q2 | Q3 | Q4 | Yr)
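A minimal sketch, under my own assumptions, of how a monitoring record could keep a short qualitative note next to each quarterly figure so that the narrative is not squeezed out of the template. The MonitoringRecord class, its fields, and the example entries are hypothetical, not from the workshop.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MonitoringRecord:
    """One cell of a quarterly template, with room for a qualitative note."""
    indicator: str          # hypothetical indicator name
    quarter: str            # "Q1".."Q4" or "Yr"
    value: Optional[float]  # the quantitative figure, if one exists
    note: str = ""          # qualitative context a numbers-only template drops

records = [
    MonitoringRecord("beneficiaries reached", "Q1", 120,
                     note="count boosted by a one-off registration drive"),
    MonitoringRecord("beneficiaries reached", "Q2", None,
                     note="data collection suspended; flooding in the district"),
]

for r in records:
    shown = r.value if r.value is not None else "n/a"
    print(f"{r.indicator} [{r.quarter}]: {shown} ({r.note})")
```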
13
Performance Indicators
A consideration of their limitations and
potential for misuse
  • See, for example
  • Burt Perrin, Effective Use and Misuse of
    Performance Measurement, American Journal of
    Evaluation, Vol. 19, No. 3, pp. 367-369, 1998.
  • Burt Perrin, Performance Measurement: Does the
    Reality Match the Rhetoric? American Journal of
    Evaluation, Vol. 20, No. 1, pp. 101-114, 1999.

14
Common flaws, limitations, and misuse of
performance indicators - 1
  • Goal displacement
  • Terms and measures interpreted differently
  • Distorted or inaccurate data
  • Meaningless and irrelevant data
  • Cost shifting vs. cost savings
  • Critical subgroup differences hidden

15-19
(Slides 15-19: no transcript available)
20
Common flaws, limitations, and misuse of
performance indicators -2
  • Do not take into account the larger
    context/complexities
  • Limitations of objective-based approaches to
    evaluation
  • Useless for decision making and resource
    allocations
  • Can result in less focus on innovation,
    improvement and outcomes

21
The process of developing indicators should
include
  • Involvement of stakeholders
  • Development, interpretation and revision of
    indicators
  • Allocation of time and resources to the
    development of indicators
  • Provision of training and expertise
  • Thinking about potential forms of misuse in
    advance
  • Pretesting, testing, review and revision

22
Using indicators appropriately: some basic
strategic considerations
  • First, do no harm
  • Meaningful and useful at the grassroots: the
    programme, staff, local stakeholders
  • NOT linked to budget allocations or managerial
    rewards
  • Use only when it makes sense, e.g. Mintzberg,
    Pollitt/OECD
  • Standardised programmes with recurrent
    products/services
  • Established programmes with a basis for
    identifying meaningful indicators and targets
  • NOT for tangible individual services
  • NOT for non-tangible ideal services

23
Using indicators appropriately: strategic
considerations 2
  • Use indicators as indicators
  • At best a window on reality, not reality itself
  • To raise questions rather than to provide the
    answer
  • Different levels (e.g. input, activities,
    outputs, outcomes where it makes sense)

24
Using indicators appropriately: strategic
considerations 3
  • Focus on results vs. busy-ness
  • Performance information vs. performance data
  • Descriptive vs. numerical indicator
  • Performance MANAGEment vs. MEASUREment
  • (original intent diverted from management to
    control)
  • Periodically review the overall picture: ask if
    the data makes sense, identify questions arising
  • Indicators as part of a broad evaluation strategy

25
Using indicators appropriately: operational
considerations
  • Look at subgroup differences
  • Indicators/targets indicating direction vs.
    assessing performance
  • If the latter, don't set up the programme for failure
  • Dynamic vs. static
  • Never right the first time
  • Constantly reassess validity and meaningfulness
  • Pre-test, pre-test, pre-test
  • Update and revise
  • Provide feedback and assistance as needed

26
Using indicators appropriately - reporting
  • More vs. less information in reports
  • Performance story vs. list of numbers
  • Identify limitations; provide qualifications
  • Combine with other information
  • Request/provide feedback
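One way to read "performance story vs. list of numbers" is sketched below under my own assumptions: each reported figure travels with an interpretation and an explicit qualification. The performance_story function and the example values are hypothetical.

```python
def performance_story(indicator, value, target, interpretation, limitations):
    """Assemble a short narrative entry instead of a bare number (illustrative)."""
    return "\n".join([
        f"Indicator: {indicator}",
        f"Result: {value} against a target of {target}",
        f"What this suggests: {interpretation}",
        f"Qualifications: {limitations}",
    ])

print(performance_story(
    indicator="participants completing the course",
    value=85,
    target=100,
    interpretation="uptake is close to plan despite a late start in two districts",
    limitations="attendance registers were incomplete in Q2, so the figure is an estimate",
))
```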

27
Evaluation
28
A strategic approach to evaluation
  • Raison d'être of evaluation
  • Social betterment
  • Sensemaking
  • More generally, raison d'être of evaluation
  • To be used!
  • Improved policies, programmes, projects,
    services, thinking

29
Monitoring and Evaluation
  • Monitoring
  • Periodic, using data routinely gathered or
    readily obtainable
  • Assumes appropriateness of programme, activities,
    objectives, indicators
  • Tracks progress against small number of targets/
    indicators (one at a time)
  • Usually quantitative
  • Cannot indicate causality
  • Difficult to use for impact assessment
  • Evaluation
  • Generally episodic
  • Can question the rationale and relevance of the
    program and its objectives
  • Can identify unintended as well as planned
    impacts and effects
  • Can provide guidance for future directions
  • Can address how and why questions
  • Can use data from different sources and from a
    wide variety of methods

30
Future orientation - Dilemma
  • The greatest dilemma of mankind is that all
    knowledge is about past events and all decisions
    about the future.
  • The objective of this planning, long-term and
    imperfect as it may be, is to make reasonably
    sure that, in the future, we may end up
    approximately right instead of exactly wrong.

31
Questions for evaluation
  • Start with the questions
  • Choice of methods to follow
  • How to identify questions
  • Who can use evaluation information?
  • What information can be used? How?
  • Different stakeholders, different questions
  • Consider responses to hypothetical findings
  • Develop the theory of change (logic model)
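To make the last bullet concrete, here is a hypothetical, deliberately simplified theory of change for an imaginary training programme, with each assumed link turned into an evaluation question. The programme, the THEORY_OF_CHANGE links, and the evaluation_questions helper are illustrative assumptions, not content from the workshop.

```python
# Hypothetical, simplified theory of change: each link in the chain rests on an
# assumption, and each assumption suggests an evaluation question.
THEORY_OF_CHANGE = [
    ("training delivered", "skills acquired",
     "participants attend and the content matches their needs"),
    ("skills acquired", "practices change",
     "participants have the resources and authority to apply new skills"),
    ("practices change", "incomes improve",
     "market conditions allow better practices to pay off"),
]

def evaluation_questions(links):
    """Turn each assumed link into a 'why / so what' style question."""
    for cause, effect, assumption in links:
        yield f"Does '{cause}' actually lead to '{effect}'? (assumes: {assumption})"

for question in evaluation_questions(THEORY_OF_CHANGE):
    print(question)
```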

32
The three key evaluation questions
  • What's happening?
  • (planned and unplanned, little or big at any
    level)
  • Why?
  • So what?

33
Some uses for evaluation
  • Programme improvement
  • Identify new policies, programme directions,
    strategies
  • Programme formation
  • Decision making at all levels
  • Accountability
  • Learning
  • Identification of needs
  • Advocacy
  • Instilling evaluative/questioning culture

34
Different types of evaluation
  • Ex-ante vs. ex-post
  • Process vs. outcome
  • Formative vs. summative
  • Descriptive vs. judgemental
  • Accountability vs. learning (vs. advocacy vs.
    pro-forma)
  • Short-term actions vs. long-term thinking
  • Etc.

35
Results chain
Inputs → Processes → Outputs → Reach → Outcomes → Impact
36
Intervention logic model
37
Generic logic model (simplified)
38
Generic logic model in context
39-41
(Slides 39-41: no transcript available)
42
Making evaluation useful - 1
  • Be strategic
  • E.g. start with the big picture and identify
    questions arising
  • Focus on priority questions and information
    requirements
  • Consider needs, preferences, of key evaluation
    users
  • Don't be limited to stated/intended effects
  • Don't try to do everything in one evaluation

43
Making evaluation useful - 2
  • Primary focus: how evaluation can be relevant and
    useful
  • Bear the beneficiaries in mind
  • Take into account diversity, including differing
    world views, logics, and values
  • Be an (appropriate) advocate
  • Don't be too broad
  • Don't be too narrow

44
How else can one practice evaluation so that it
is useful?
  • Follow the Golden Rule
  • There are no golden rules. (European
    Commission)
  • Art as much as science
  • Be future oriented
  • Involve stakeholders
  • Use multiple and complementary methods,
    qualitative and quantitative
  • Recognize differences between monitoring and
    evaluation

45
To think about
  • Constructive approach, emphasis on learning vs.
    punishment
  • Good practices (not just problems)
  • Take into account complexity theory, systems
    approach, chaos theory
  • Synthesis, knowledge management
  • Establishing how/if the intervention in fact is
    responsible for results (attribution or cause)

46
Impact evaluation/assessment: what does this mean?
  • OECD/DAC definition of impact: "Positive and
    negative, primary and secondary long-term effects
    produced by a development intervention, directly
    or indirectly, intended or unintended."
  • Development objective: "Intended impact
    contributing to physical, financial,
    institutional, social, environmental, or other
    benefits to a society, community, or group of
    people via one or more development interventions."
  • But beware! "Impact" and "impact assessment" are
    frequently used in very different ways.

47
Determining attribution: some alternative
approaches
  • Experimental/quasi-experimental designs
    (randomisation)
  • Eliminate rival plausible hypotheses
  • Physical (qualitative) causality
  • Theory of change approach
  • reasonable attribution
  • Contribution vs. cause
  • Contribution analysis
    (use the simplest approach that gives the needed
    level of confidence)

48
Some considerations for meaningful impact
evaluation
  • Need information about inputs and activities as
    well as about outcomes
  • Check, don't assume, that what is mandated in
    (Western) capitals is what actually takes place
    on the ground
  • Check: are data sources really accurate?
  • Dealing with responsiveness: a problem or a
    strength?
  • Internal vs. external validity

49
Some questions about impact evaluation
  • What is possible with multiple interventions?
  • Changing situation
  • Strategies/policies vs. projects
  • Time frame?

50
Monitoring and Evaluation in Combination
51
How Monitoring and Evaluation can be complementary
  • Ongoing monitoring
  • Can identify questions, issues for (in-depth)
    evaluation
  • Can provide data for evaluation
  • Evaluation
  • Can identify what should be monitored in the
    future

52
Monitoring vs. Evaluation
  • Start with the purpose and question(s)
  • E.g. control vs. learning/improvement
  • Identify information requirements (for whom, how
    it would be used ...)
  • Articulate the theory of change
  • Use most appropriate method(s) given the above
  • Some form of monitoring approach? and/or
  • Some form of evaluation?
  • Do not use monitoring when evaluation is most
    appropriate and vice versa
  • Consider costs (financial, staff time) and
    timeliness
  • Monitoring is usually (but not always!) less
    costly and quicker
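A rough sketch, under my own assumptions, of the decision logic on this slide: start from the purpose and question, then choose the lighter instrument that can actually answer it. The suggest_approach function and its criteria are an illustrative simplification, not a rule from the workshop.

```python
def suggest_approach(question_type, needs_causality, budget_is_tight):
    """Very rough heuristic echoing the slide: monitoring for routine tracking,
    evaluation when 'why', 'so what' or causal questions must be answered."""
    if needs_causality or question_type in ("why", "so what"):
        return "evaluation (monitoring data can still feed into it)"
    if question_type == "what" and budget_is_tight:
        return "monitoring (usually, but not always, cheaper and quicker)"
    return "monitoring, reviewed periodically to surface questions for evaluation"

print(suggest_approach("what", needs_causality=False, budget_is_tight=True))
print(suggest_approach("why", needs_causality=True, budget_is_tight=False))
```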

53
Mon. and Eval. in combination
  • Multi-method approach to evaluation usually most
    appropriate; can include monitoring
  • Generally, monitoring is most appropriate as part
    of an overall evaluation approach
  • E.g. use evaluation to expand upon the "what"
    information from monitoring, and to address "why"
    and "so what" questions
  • Strategic questions → strategic methods
  • Seek minimum amount of information that addresses
    the right questions and that will actually be
    used
  • Tell the performance story
  • Take a contribution analysis approach

54
Contribution Analysis (Mayne, "Using performance
measures sensibly")
  1. Develop the results chain
  2. Assess the existing evidence on results
  3. Assess the alternative explanations
  4. Assemble the performance story
  5. Seek out additional evidence
  6. Revise and strengthen the performance story
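Mayne's six steps are usually applied iteratively. The sketch below is an illustration under my own assumptions (the contribution_analysis function, the evidence-gap list and the stopping rule are hypothetical); it simply cycles through the steps until the listed gaps are closed or the allotted rounds run out.

```python
# Mayne's six contribution-analysis steps, as listed on the slide.
STEPS = [
    "develop the results chain",
    "assess the existing evidence on results",
    "assess the alternative explanations",
    "assemble the performance story",
    "seek out additional evidence",
    "revise and strengthen the performance story",
]

def contribution_analysis(evidence_gaps, max_rounds=3):
    """Cycle through the six steps until no evidence gaps remain (illustrative)."""
    rounds = 0
    while evidence_gaps and rounds < max_rounds:
        rounds += 1
        print(f"Round {rounds}: open evidence gaps: {evidence_gaps}")
        for step in STEPS:
            print(f"  - {step}")
        evidence_gaps = evidence_gaps[:-1]  # pretend each round closes one gap
    if evidence_gaps:
        return "story still contested; plan further evaluation work"
    return "performance story judged credible enough"

print(contribution_analysis(["baseline data missing", "no comparison group"]))
```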

55
Conclusion
  • Go forward, monitor and evaluate and help to
    make a difference.
  • Thank you for your participation / Merci pour
    votre participation

Burt Perrin Burt_at_BurtPerrin.com