Title: In-depth Evaluation of R&D Programs: How Could It Be Accountable?
Slide 1: In-depth Evaluation of R&D Programs: How Could It Be Accountable?
Symposium on International Comparison of the Budget Cycle in Research, Development and Innovation Policies, Madrid (Spain), 3-4 July 2008 (OECD/GOV/PGC/SBO)
- Seung Jun Yoo, Ph.D.
- R&D Evaluation Center
- KISTEP, KOREA
Slide 2: Contents
1. Overview of Current Evaluation
2. Architecture of In-depth Evaluation
3. Procedure of In-depth Evaluation
4. Evaluation with Accountability
5. Challenges and Discussion
Slide 3: Overview of Current Evaluation (1)
- All R&D programs (191) evaluated every year!
  - specific evaluation (mainly using checklists): 27 programs
  - self/meta evaluation: 164 programs
  - in-depth evaluation: 4 horizontal programs (pilot run)
    - climate-change-related R&D programs
    - university centers of excellence R&D programs
    - infrastructure (facility/equipment) R&D programs
    - genome research R&D programs
Slide 4: Overview of Current Evaluation (2)
- Efficiency & effectiveness of evaluation?
  - evaluating 191 programs every year?
  - the efficiency and effectiveness of the evaluation itself is questionable, considering the characteristics of R&D programs
  - too heavy an evaluation load on evaluators, program managers, researchers, etc.
  - not enough time to prepare and perform evaluation for all R&D programs and to communicate with stakeholders (which might yield poor accountability)
Slide 5: Architecture of In-depth Evaluation (1)
- Main players
  - decision makers for R&D evaluation and budget allocation: MOSF, NSTC (National Science & Technology Council)
  - evaluation supporting group: KISTEP (evaluators)
  - R&D programs of each ministry: MIFAFF, MEST, MOE, MIK, MW
Slide 6: Architecture of In-depth Evaluation (2)
- Evaluation and budget allocation cycle
  [Diagram: evaluation group formed → survey/analysis → in-depth evaluation → input for R&D budget allocation → programs/projects implemented → feedback to (re)plan and/or improve the program]
Slide 7: Architecture of In-depth Evaluation (3)
- Budget process
  [Diagram: 5-year plan and ministry budget ceilings (NSTC); program budget; 1st budget review, then 2nd budget review with evaluation results (Ministry of Strategy & Finance, MOSF); Budget Committee of the National Assembly (Dec.)]
Slide 8: Procedure of In-depth Evaluation (1)
- 7-month schedule (suggested!)
  - program(s) selected by a selection committee based on special issues, etc. (month 0)
- In-depth evaluation procedure for the selected program(s):
  - month 1: form the evaluation group, gather program data, study the target R&D program(s), identify major evaluation points
  - month 2: develop a logic model (with system dynamics, etc.)
  - months 3-4: perform in-depth analysis (relevance, efficiency, effectiveness, program design & delivery, etc.)
Slide 9: Procedure of In-depth Evaluation (2)
- month 5: interviews (researchers, program managers, etc.)
- month 6: report interim evaluation results (MOSF, department(s))
- month 7: report final evaluation results and recommendations
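As a minimal illustration (not part of the original deck), the suggested 7-month schedule can be expressed as month offsets from the selection date (month 0), from which a calendar of milestones follows; the function name and structure here are purely illustrative:

```python
# Illustrative sketch of the suggested 7-month in-depth evaluation
# schedule, keyed by month offset from program selection (month 0).
SCHEDULE = {
    1: "form evaluation group, gather data, identify evaluation points",
    2: "develop logic model (system dynamics, etc.)",
    3: "in-depth analysis (relevance, efficiency, effectiveness)",
    4: "in-depth analysis, continued (program design & delivery)",
    5: "interviews (researchers, program managers)",
    6: "interim evaluation report (MOSF, departments)",
    7: "final evaluation report and recommendations",
}

def milestone_months(start_year: int, start_month: int) -> dict:
    """Map each scheduled step to a calendar (year, month), wrapping years."""
    plan = {}
    for offset, activity in SCHEDULE.items():
        m = start_month - 1 + offset          # zero-based month index
        plan[(start_year + m // 12, m % 12 + 1)] = activity
    return plan

# A program selected in October would deliver its final report seven months later.
plan = milestone_months(2007, 10)
```

This makes the fixed pacing of the procedure explicit: the final report lands exactly seven calendar months after selection, which is what limits how many programs can be evaluated in depth per budget cycle.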
Slide 10: Evaluation with Accountability (1): Responsibility 1
- A balance between quantitative and qualitative evaluation is important
  - a systematic approach to qualitative evaluation is challenging
  - program goals vs. projects implemented vs. outputs
- Enough time for evaluation is essential (7-month schedule)
  - to achieve the goal of evaluation with accountability
  - give program managers enough time to cope with the evaluation process
  - suitable only for a limited number of programs
Slide 11: Evaluation with Accountability (2): Responsibility 2
- Qualitative assessment is needed to achieve the purpose of evaluation
  - measuring simple counts of publications and patents?
  - publications: impact factor (1-2 yrs), citation index (more than 3 yrs)
  - patents: commercial purpose → technology value evaluation
- For selected projects with excellent performance, consistent funding is required regardless of the program evaluation!
Slide 12: Evaluation with Accountability (3): Acceptability 1
- Understand the characteristics of the program well and share them with stakeholders
  - performance indicators are useful tools for getting stakeholders' agreement (researchers, program managers, MOSF, etc.)
  - to set up an evaluation strategy and evaluation points!
  - especially important for acceptability and for improving program delivery
Slide 13: Evaluation with Accountability (4): Acceptability 2 (Understand & Change!)
- Communication with stakeholders
  - interviews with stakeholders (researchers, program managers, MOSF) are important for increasing accountability
  - the evaluation strategy is better shared at the beginning
  - the number of interviews also matters: lack of understanding of the evaluation is a key inhibitor of accountability!
  - interviews at major steps, such as the evaluation strategy, the survey of weak/strong points of the program, reporting interim evaluation results, etc.
Slide 14: Challenges and Discussion (1): Understand, Change & Improve!
- Stakeholders should understand their program(s); otherwise they become rigid and too defensive about keeping things unchanged
- A systematic way to understand diverse aspects of programs: goals, contents, projects, design & delivery, etc.
- Sharing of program information to change and improve (with all stakeholders)
Slide 15: Challenges and Discussion (2): Scientific & Socio-economic Interest
- Technology impact evaluation for socio-economic understanding
- Results of technology-level evaluation are also useful
Slide 16: Challenges and Discussion (3): Communication → Consultation
- Communication among stakeholders (Ministry/Agency, researchers, MOSF, KISTEP, etc.)
- For better evaluation practices, communication should be transformed into consultation
Slide 17: Muchas gracias! (Thank you very much!)