Introduction: the policy context



2
The Research Excellence Framework: issues for
discussion
24 March 2009
Graeme Rosenberg, REF Manager
3
The REF framework
  • Outputs: quality of outputs assessed through a
    combination of bibliometrics and expert review
  • Impact: impact of research assessed through a
    portfolio of evidence
  • Environment: quality of the research environment
    assessed through narrative and indicators;
    engagement with users assessed through narrative
    and indicators
4
Q1. Selection of staff
  • Some concerns have been expressed about
    institutional selectivity, with a desire for
    either
  • all (research-active or academic) staff to be
    included, OR
  • selected staff to be assessed, set alongside a
    full staff profile
  • The underlying issue: the principle of selective
    assessment versus the complete picture

5
Q2. Who gets the credit?
  • Institutions
    - Avoids the transfer-market effect, but
      depresses an already low turnover rate
    - Assesses track record rather than future
      potential
    - Rewards development of new talent
    - Practical challenges of attribution
  • Researchers
    - Dynamism, but also transfer-market effects
    - Assesses future potential rather than track
      record

6
Q3. Bibliometrics: top-down model
  • Citation indicators are produced by associating
    outputs in WoS/SCOPUS with HEIs through
    address/affiliation data, and with fields based
    on journal categories
  • Implications for the REF
    - All (indexed) outputs are included; HEIs get
      the credit
    - Outputs are not linked to staff
    - How to combine these indicators with expert
      review?
  • Implications for HEIs
    - Potentially very little effort, but would HEIs
      insist on checking the data?

7
Q3. Bibliometrics: bottom-up model
  • For a defined group of staff, papers are
    identified and citation indicators produced
  • Implications for the REF
    - Flexibility to select staff and papers, to fit
      with other aspects of the REF
    - Should panels interpret aggregate indicators,
      or use citation rates per paper to inform
      expert review?
  • Implications for institutions
    - Selecting staff
    - Providing publications data (linked to staff
      and to WoS/SCOPUS)

8
Q4. Expert review of outputs
  • Expert review will remain the sole or dominant
    method of assessing outputs in many disciplines
  • Potential for panels to sample outputs
  • Should there be fewer than 4 outputs per person?

9
Q5. Impact
  • Metrics can be informative but cannot capture
    the full range of impacts; qualitative
    approaches will be needed
  • There are challenges relating to time-lags,
    attribution and corroboration
  • How best to involve users in the assessment
    process?

10
Q5. Impact model 1
  • Outputs: portfolio of all types of research;
    assessed through bibliometrics and expert
    review; profiled against criteria for
    excellence, including academic and user
    significance
  • Impact: range of impacts during the assessment
    period; assessed through narrative, indicators
    and exemplars; profiled or single-point scale?
  • Environment: quality of environment,
    sustainability and engagement; assessed through
    narrative and indicators
11
Q5. Impact model 2
  • Outputs: portfolio of all types of research;
    assessed through bibliometrics and expert
    review; profiled against criteria for academic
    excellence
  • Impact: examples of earlier work leading to
    current impact; assessed through impact
    statements and case studies; profiled against
    criteria for impact (while meeting a threshold
    for originality and rigour)
  • Environment: quality of environment,
    sustainability and engagement; assessed through
    narrative and indicators
12
Q6. Panels: current structure
[Diagram: main panels; 50-60 expert panels;
submissions]
13
Q6. Panel structure: possible alternative?
[Diagram: submissions under an alternative panel
structure]
14
Q7. Consistency between panels
  • How much variation should there be in terms of:
  • Eligibility criteria and outputs per person
    (especially between subjects using bibliometrics
    and expert review)
  • Weightings between the elements
  • Indicators of impact and environment
  • Working methods
  • Anything else?

15
Q8. Data collection
  • In principle, data about staff, PGRs and income
    are available via HESA, but there are
    limitations
  • Staff records do not include names, and census
    dates need to be considered
  • PGR and income data are not linked to staff (are
    these more usable if assessed at a broader
    level?)
  • We plan to phase out RAS
  • The REF should use more standardised indicators
    across panels, and more structured narratives
    (RA5)

16
Q9. Internal research management
  • Research information systems are currently
    being developed to meet multiple internal and
    external requirements
  • Is there scope for rationalising external
    requirements, or for agreeing common definitions
    and formats?
  • What more can the REF and institutions do to
    increase internalisation of the REF?

17
Breakout groups
  • READING ROOM, Chair: Ian Marshall
  • BURLINGTON ROOM, Chair: Julia Goodfellow
  • LECTURE HALL, Chair: Peter Saraga
  • COUNCIL ROOM, Chair: Janet Finch