Transcript and Presenter's Notes

Title: European Research Council


1
The European Research Council
Developing the ERC Monitoring, Assessment and Evaluation Strategy
Anne Mallaband, Unit S1, DG RTD
Information day, 16 September 2008
2
Context of the ERC Monitoring, Assessment and
Evaluation Strategy (MAE)
  • The Commission's obligations to monitor and
    evaluate its funding programmes
  • the Financial Regulations and Implementing Rules,
  • the Communication on Evaluation and on Evaluation
    Standards and Good Practices,
  • the legal texts of FP7 and the Specific
    Programmes
  • These elements are the baseline for the
    development of the ERC MAE strategy
  • In addition, the ERC MAE strategy contains
    elements which take into account the
    specificities of the ERC structure and the
    implementation of the Ideas programme
  • The aim is to generate a broad and integrated
    understanding of the ERC's performance and
    impacts.

3
Aims of the ERC Support actions
  • To support the ongoing ERC monitoring and
    evaluation work
  • To provide appropriate and timely quantitative
    and qualitative information for assessing
    outcomes and impacts of interventions
  • To contribute to the future strategy and policy
    development of the ERC.

4
Monitoring, Assessment and Evaluation (MAE)
Framework
5
Topics for ERC Support actions (2009)
  • Topics proposed should be relevant and developed
    in the specific context of the ERC
  • The type of activity, scope of intervention and
    geographical coverage should be clearly set out
  • The criteria for evaluation, as set out in the
    Work Programme, are:
  • OBJECTIVES AND IMPACT
  • QUALITY AND EFFECTIVENESS
  • RESOURCES
  • Methodologies should be effective and innovative
  • The budget for 2009 is EUR 2.5 million

6
Topics for ERC Support actions (2009)
  • Science Management and efficiency
  • Emerging research areas
  • Institutional and individual excellence
  • Changing structures and policies
  • Funding complementarities

7
Science Management and efficiency
  • Indicative Evaluation questions for
    consideration
  • To what extent has the scientific strategy been
    realised in the implementation of the Ideas
    programme?
  • What defines research excellence? Are measures
    such as publications in scholarly journals and
    citation analysis applicable in all domains? What
    is the impact of the development of open access
    for frontier research?
  • Is the peer review system ensuring that frontier
    research and high risk projects are supported?
    How can its effectiveness be measured? What is
    the perception of the different stakeholders in
    the ERC peer review system? Is the peer review
    system sustainable in the EU? What are the
    alternative options to peer review?

8
Emerging research areas
  • Indicative Evaluation questions for
    consideration
  • Which innovative methodologies can be used for
    tracking ongoing novelties in research?
  • To what extent has new scientific knowledge been
    created from ERC funded activity? Can the return
    on investment be measured to give indicative
    "successes" from ERC funded high risk research?
  • What other funding schemes are available for high
    risk research in Europe (e.g. block funding,
    endowments, international prizes etc.) and what
    impact might this have on ERC schemes?
  • To what extent is the ERC reinforcing the
    dynamism and attractiveness of Europe in emerging
    research areas for the best researchers from both
    European and third countries and for industrial
    investment? How can this be measured?
  • Are there research topics / geographical clusters
    of ERC funded research emerging / evident in the EU?

9
Institutional and individual excellence
  • Indicative Evaluation questions for
    consideration
  • Is there a sustained impact of the ERC on the
    strategies of leading EU institutions and how can
    this be measured?
  • Has the attractiveness of Europe for the best
    researchers from both European and third
    countries been realised (retain, recruit,
    repatriate) and sustained?
  • To what extent has EU competition regarding
    research quality had an effect in terms of
    institutional and/or individual excellence?
  • What is the impact of a Europe-wide competitive
    funding structure on the quality of research
    undertaken? (e.g. Is there evidence to suggest
    that research institutions have taken strategic
    decisions in concentrating resources in certain
    niche disciplines and /or building up critical
    mass?)

10
Changing structures and policies
  • Indicative Evaluation questions for
    consideration
  • How can changes in research structures as a
    result of the ERC be measured? (situation before
    vs. situation to date: prioritisation,
    institutional rules regulating research careers
    and practices, evaluation capacity, programming
    methodology)
  • Is there evidence for "clusters of excellence" /
    research strengths and specialisations across the
    EU and how will this impact on national and EU
    research policies?
  • To what extent has the ERC contributed to
    structuring effects of the European Research Area?

11
Funding complementarities
  • Indicative Evaluation questions for
    consideration
  • To what extent have the ERC funding schemes
    complemented and added value to national
    programmes?
  • To what extent have relevant interactions between
    ERC and other funding bodies (including the US or
    Research Foundations) developed?
  • What research synergies and complementarities are
    evident between the Ideas programme and other
    Research Framework Programme activities?

12
Methodologies and Design of the ERC Support
actions
  • Are the methodologies proposed effective and
    innovative?
  • Are the approaches for data collection and
    analysis and issues relating to data storage and
    integration addressed?
  • Are the counterfactual, intervention and control
    groups considered when appropriate?
  • Does the methodology draw on global best
    practice?
  • Does the consortium have the necessary knowledge,
    competence and experience for the activities?

13
ERC Support action Time planning
  • Closing date 12 November 2008
  • Proposals sent to reviewers end November 2008
  • Panel meets early February 2009
  • Notification to applicants late February 2009

14
ERC Co-ordination and Support Actions
Proposal preparation and application procedures
Jane Shiel, Call Co-ordinator
15
CSA application procedure
  • Proposals submitted using the Electronic Proposal
    Submission Service (EPSS) only.
  • Must pre-register proposal to obtain access
  • Deadline is 12 November, but allow plenty of time
    to pre-register and submit the proposal. The
    deadline is enforced with no exceptions.
  • Follow instructions regarding formats, fonts,
    spacing and respect page limits.
  • Proposal submission instructions can be found at
    http://cordis.europa.eu/fp7

16
CSA application procedure
  • Application forms comprise two parts:
  • PART A: administrative information, contact
    details, proposal summary, funding requested.
  • PART B: details of the work to be carried out,
    in the form of a template with a list of headings
    to be followed by the applicant. The structure
    corresponds to the evaluation criteria. Submitted
    as a PDF file.
  • NB: the working language of the panel is English.

17
CSA application procedure: Contents of Part A
  • Section A1 gives a snapshot of the proposal and
    is filled in by the Co-ordinator.
  • A2 is filled in by the participant(s) (not
    subcontractors) and includes contact details for
    each.
  • A3 concerns the budget and is filled in by the
    Co-ordinator.

18
CSA application procedure: Contents of Part B
  • The PART B template provides the structure:
  • Cover page
  • Table of contents
  • Proposal description
      • Objectives and impacts
      • Quality and effectiveness
      • Resources
  • Tables in annex
      • Work package list
      • Deliverables list
      • Work package description
      • Summary of staff effort
      • Milestones
      • Ethical issues table (where relevant)

19
CSA application procedure: EPSS and the Co-ordinator
  • The Co-ordinator uses EPSS to:
  • Register interest in submitting a proposal
  • Set up and modify the consortium by adding or
    removing participants
  • Complete Part A concerning the proposal in general
    and the Co-ordinator's own administrative details
  • Download the Part B template and, when it is
    ready, upload the finished Part B
  • Submit the complete proposal (Parts A and B)

20
CSA application procedure: EPSS and the partners
  • Partners in the proposal use EPSS to:
  • Complete their own sections of A forms
    (participant details)
  • Download the Part B template in order to assist
    the co-ordinator in preparing the proposal
    (partners cannot submit the proposal!)
  • View the whole proposal

21
CSA application procedure: Part B
  • Important!
  • The information provided in Part B should be
    sufficiently comprehensive to allow the peer
    reviewers to assess the scientific excellence of
    the proposal according to the evaluation
    criteria.
  • Respect the page limits. Evaluators will be
    instructed to ignore additional material.
  • Make sure the proposal is actually submitted. The
    Submit button starts the process, but any error
    messages must be read and resolved; they block
    submission until rectified.
  • The proposal may be modified until the deadline,
    but not after it.

22
CSA Evaluation Procedure
  • Proposals are checked for eligibility: submitted
    by the deadline? complete? within scope? (see the
    sketch after this list)
  • One-stage review
  • Peer Review Panel (up to 8 expert reviewers)
  • Reviewers are selected by the DIS on the basis of
    excellence
  • Subject to rules on confidentiality and conflict
    of interest
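
The eligibility screen above boils down to three yes/no checks. The fragment below is only a hypothetical Python sketch of that logic; the Proposal fields, the assumed cut-off time and the is_eligible helper are invented for illustration and are not part of EPSS or the official procedure.

from dataclasses import dataclass
from datetime import datetime

# Assumed cut-off for the call; the exact time of day is an assumption here.
CALL_DEADLINE = datetime(2008, 11, 12, 17, 0)

@dataclass
class Proposal:
    submitted_at: datetime   # timestamp recorded at submission
    has_part_a: bool         # administrative forms present
    has_part_b: bool         # work description (PDF) present
    in_scope: bool           # within the scope of the Work Programme topics

def is_eligible(p: Proposal) -> bool:
    """Eligible only if submitted on time, complete (Parts A and B) and in scope."""
    on_time = p.submitted_at <= CALL_DEADLINE
    complete = p.has_part_a and p.has_part_b
    return on_time and complete and p.in_scope

# Example: a complete, in-scope proposal submitted the day before the deadline.
print(is_eligible(Proposal(datetime(2008, 11, 11, 9, 30), True, True, True)))  # True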

23
CSA Evaluation Criteria
  • Three criteria:
  • Objectives and impact
  • Is the proposal in line with the requirements of
    the Work Programme?
  • Will the project have a substantial impact on the
    ERC's strategic objectives?
  • Quality and Effectiveness
  • Do the evaluators estimate that the proposed
    methodology will be effective in reaching the
    goals of the project?
  • Resources
  • Are the personnel, experience of team members,
    equipment etc. appropriate, effective and justified?
  • Each criterion is marked out of five. The panel
    score may not be the average of the individual
    scores; the panel will discuss and reach a
    consensus (see the sketch after this list).
  • Reviewers will also decide whether there are
    ethical or security issues which need to be
    addressed.
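
As an illustration only of the scoring scheme described above (each criterion marked out of five, with a panel consensus that need not equal the arithmetic mean of the individual marks), the sketch below uses invented reviewer marks; none of the names or numbers come from an actual evaluation.

from statistics import mean

# The three criteria from the Work Programme; the marks below are fictional.
CRITERIA = ("Objectives and impact", "Quality and effectiveness", "Resources")

individual_marks = {
    "Objectives and impact":     [4, 4, 3, 5],
    "Quality and effectiveness": [3, 4, 4, 4],
    "Resources":                 [4, 5, 4, 4],
}

for criterion in CRITERIA:
    marks = individual_marks[criterion]
    print(f"{criterion}: marks {marks}, mean {mean(marks):.2f}")

# The consensus score is agreed in discussion, so it is recorded as an input
# from the panel rather than computed from the individual marks.
consensus_scores = {
    "Objectives and impact": 4,
    "Quality and effectiveness": 4,
    "Resources": 4,
}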

24
CSA evaluation timetable
  • Closing date 12 November 2008
  • Proposals sent to reviewers end November 2008
  • Panel meets early February 2009
  • Notification to applicants late February 2009