1 Comprehensive evaluation
- Balance between Research Quality and Relevance (The Dutch Models)
- Jack Spaapen
- Coimbra Group HSIS
- Dublin, 19 September 2008
2 Polynesian Visual Art
3 Research impact framework (AHRC)
- Interactions between research and society
- Non-linear approach
- Metrics alone are not enough
- Metrics, impact assessment and quality assessment ↔ knowledge exchange
4 Problems evaluating the humanities / social sciences / multi-, inter- and transdisciplinary (MIT) research
- Bibliometrics not adequate for evaluating research quality → bad scores in evaluation procedures
- Current indicators for societal relevance (patents, contracts) not very useful for the humanities and other fields
- Lack of indicators for important communications to broader audiences, but new metrics for socio-cultural studies (NL)
- General direction seems to be traditional metrics only (Australia: RQF → ERA; RAE in the UK), but the Netherlands and other countries are looking for more comprehensive methods
5 Evaluating research quality under pressure
- Peer review in trouble with new developments: MIT research, socio-economic relevance, referee fatigue
- Bibliometrics: main focus on ISI journals
- Lack of indicators for important communications to broader audiences
- General direction still seems to be traditional metrics only (Australia: RQF → ERA; RAE in the UK), but the Netherlands and other countries are looking for more comprehensive methods
6 Struggle for comprehensive evaluation systems
- Dimension 1: metrics dominated by the research practices of the natural and biomedical sciences, inadequate for many fields
- Dimension 2: growing necessity to be relevant for economy and society
- Dimension 3: attuning scientific quality and societal relevance in evaluation
- Dimension 4: policy makers want simple metrics for reallocation purposes
7 Many solutions are being tried
- UK: Research Councils (AHRC, ESRC), plus the debate about the RAE
- Australia: RQF
- France: INRA
- Norway: research councils
- Denmark: radar graph
- Canada: HSSFC (focus on impacts and performance)
- HERA
8 Development of new evaluation systems
- Growing tension between policy makers / government and the research community about how to account for research (criteria, indicators, metrics, but also too many evaluations, and their consequences)
- Growing tension between so-called scientific quality and societal relevance
9 Two debates
- Current national evaluation system: SEP (2003-2009)
- ERiC: Evaluating Research in Context
10 SEP (2003-2009)
- Self-evaluation report by the research unit: review of past performance and forward look (SWOT)
- Site visit report focuses on 4 criteria:
- quality (output, international position)
- relevance (to policy, industry and society)
- research management
- accountability
- Evaluation both retrospective and prospective; the accent is on the latter
- External site visits every 6 years, with a mid-term evaluation every three years
11 Humanities, social sciences and many others are critical
- Criteria and indicators not geared to the humanities, social sciences, technical disciplines
- No instruments to evaluate societal relevance
- 2005: Academy councils (Humanities and Social Sciences) issued the report "Judging Research on its Merits"
- 2006: Advisory Council for Science and Technology Policy, "Alfa stralen"
- 2007: Meta Evaluation Committee, "Trust, but Verify"
12 ERiC project → relevance
- Joint effort of the Academy, the Research Council, the university association, and others
- Support institutions with the evaluation of the societal quality / impact of research
- Develop criteria and indicators, and a methodology, for assessment
- Suggest how to integrate these methods in the new SEP (2009-2015)
13 Four common steps identified
- The mission of the research group or institute is the starting point of the evaluation
- Identify productive interactions with the social context: industry, policy, society at large
- Data gathering: focus on the research group's performance in the various social domains, including stakeholder analysis → comprehensive profile of the research group
- Feedback and forward look
14 ERiC evaluation principles
- Comprehensive evaluation: focus on both scientific quality and relevance
- Contextual: identify mission, involve stakeholders in indicators / benchmarking
- Combine quantitative and qualitative data
- Forward-looking: focus on improving, learning and coaching instead of judging
15 Example: REPP (Research Embedment and Performance Profile) table graph
16 Example of evaluation of societal quality: radar graph (concise format)
17 Example of evaluation of societal quality: radar graph (extended format)
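The radar graphs on these last slides plot a research group's scores on several societal-relevance criteria around a circle, so strengths and gaps are visible at a glance. As a minimal sketch (not part of the original deck), the following Python/matplotlib snippet shows how such a chart can be drawn; the five criteria and the scores are hypothetical placeholders, not ERiC's actual indicators.

```python
# Minimal sketch of a societal-quality radar graph.
# Criteria and scores are hypothetical, not ERiC's actual indicators.
import numpy as np
import matplotlib.pyplot as plt

criteria = ["Policy impact", "Industry uptake", "Public outreach",
            "Professional use", "Stakeholder involvement"]  # hypothetical
scores = [3.5, 2.0, 4.0, 3.0, 2.5]                           # hypothetical, 0-5 scale

# One axis per criterion; repeat the first point to close the polygon.
angles = np.linspace(0, 2 * np.pi, len(criteria), endpoint=False).tolist()
angles += angles[:1]
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, linewidth=1.5)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(criteria)
ax.set_ylim(0, 5)
ax.set_title("Societal quality profile (hypothetical data)")
plt.show()
```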