Transcript and Presenter's Notes

Title: Space Grant 20th Year Evaluation


1
Space Grant 20th Year Evaluation
  • Program Performance and Results Report Reviewer
    Training
  • Atlanta, GA
  • October 27th, 2008

2
Agenda
  • Reviewer Role
  • Scoring Rubric
  • Guiding Principles
  • Rubric Areas
  • Scoring
  • Strengths/Weaknesses
  • Special Considerations
  • Summary

3
Reviewers
  • Reviewers are invited or selected by NASA
    Headquarters because of their ability to make an
    expert judgment based on available data.
  • Reviewers are...
  • Space Grant Directors
  • NASA Headquarters Personnel
  • Field Center Personnel
  • Former Space Grant Directors
  • Other individuals invited by NASA

4
Reviewers
  • The Reviewer role is...
  • To apply knowledge of the Space Grant program to
    make an independent, unbiased assessment of the
    assigned consortia.

5
Reviewers
6
Reviewers
  • Develop a working understanding of the NASA
    Education Outcomes
  • Contribute to the development of the Science,
    Technology, Engineering, and Mathematics (STEM)
    workforce in disciplines needed to achieve NASA's
    strategic goals (Employ and Educate).
  • Attract and retain students in STEM disciplines
    through a progression of educational
    opportunities for students, teachers, and faculty
    (Educate and Engage).
  • Build strategic partnerships and linkages between
    STEM formal and informal education providers that
    promote STEM literacy and awareness of NASA's
    mission (Engage and Inspire).

7
What is a Rubric?
  • A tool that defines and communicates criteria to
    assess performance.
  • Standardizes assessment in areas where a great
    deal of subjective judgment is required.
  • The reviewer makes a judgment based on the
    outlined criteria.

8
Methodology
  • An expert panel was identified to develop the
    rubric. The base panel included three Space
    Grant Program content experts and one measurement
    professional.
  • The scoring rubrics are based on and directly
    aligned with the guidelines.
  • Consensus on the Final Rubric was reached among
    all panel members.

9
Scoring Categories
Categories (Qualitative Judgment) Scale (Quantitative Judgment)
Missing 0
Poor 2, 1
Good 5, 4, 3
Excellent 7, 6
Not Rated NR
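The mapping above between the quantitative scale and the qualitative categories is mechanical, so it can be expressed as a short function. The following Python sketch is illustrative only and is not part of the PPR review system; the function name and the choice of language are assumptions.

    def score_category(score):
        """Map a quantitative rubric score to its qualitative category,
        following the Categories/Scale table above."""
        if score == "NR":  # Not Rated: no consortium-specific elements
            return "Not Rated"
        score = int(score)
        if score == 0:
            return "Missing"
        if score in (1, 2):
            return "Poor"
        if score in (3, 4, 5):
            return "Good"
        if score in (6, 7):
            return "Excellent"
        raise ValueError("Rubric scores are NR or an integer from 0 to 7")

For example, score_category(5) returns "Good" and score_category("NR") returns "Not Rated".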
10
Sample Rubric
Evaluation Topic (e.g. Consortium Management, Higher Education, Research Infrastructure)
Associated CMIS Data: list of specific CMIS data table(s), as appropriate
0 Missing: The consortium did not address this required element.
1-2 Poor: There is inconclusive evidence indicating that the consortium is meeting the goals of the evaluation topic, and/or evidence is inconclusive because of contradictions between the data sources.
3-5 Good: Evidence indicates that the consortium is meeting the goals of the evaluation topic. There is consistency between the data sources or there are minor inconsistencies.
6-7 Excellent: There is conclusive evidence indicating that the consortium is excelling at meeting the goals of the evaluation topic. The evidence is conclusive because of the consistency between all data sources.
In the consortium-specific rubrics, the option
NR is available and represents No Rating. This
means that there were no consortium-specific
elements to rate.
11
Guiding Principles
  • Five Guiding Principles
  • Alignment
  • Rigor
  • Context
  • Consistency
  • Results

12
Guiding Principles
  • Alignment
  • The PPR Report and data demonstrate alignment
    with the Legislation, Program Objectives, and
    NASA programmatic guidance.
  • The Reviewer judges how well the consortium
    delineates the state needs and aligns its
    programs with the Space Grant legislation,
    national program objectives, and NASA
    programmatic guidance

13
Guiding Principles
  • Rigor
  • The PPR Report articulates its purpose, SMART
    goals and objectives. It articulates a clear
    understanding of what the consortium was trying
    to accomplish and how its activities will be
    assessed.
  • The Reviewer judges how well the consortium
    articulates its purpose, goals and objectives,
    and its assessment and evaluation plans.

14
Guiding Principles
  • Context
  • Context refers to having an understanding of the
    resources the consortium dedicates to an area.
  • Context also refers to understanding the level of
    resources a consortium has based on its grant
    type (Page 20 and 21 of the PPR Guidelines)
  • The Reviewer judges how well the consortium
    justifies the portion of its resources allocated
    to each program element.

15
Guiding Principles
  • Consistency
  • The CMIS data, where appropriate, validate the
    results reported in the PPR Report. Significant
    inconsistencies might indicate that PPR Report
    statements are questionable.
  • The Reviewer judges the degree of consistency
    between the PPR Report analysis and the CMIS
    data.

16
Guiding Principles
  • Results
  • The PPR Report and CMIS data give evidence that
    the consortium is making important achievements.
    The consortium is able to demonstrate tangible
    results.
  • The Reviewer judges the results achieved relative
    to the resources allocated to each program
    element.

17
Guiding Principles
  • The Guiding Principles create a Foundation for
    each reviewer.
  • This foundation enables the reviewer to make
    consortium specific judgments that are
    independent of other consortia.

18
Guiding Principles
19
Rubric Areas
  • The Rubric is designed with the same format as
    the Program Performance and Results Report.
  • Each element of the PPR Report is unique.
    Because of this uniqueness, a rubric is
    customized for each element.

20
Rubric Types
  • Each programmatic element has three rubric types
  • Description
  • Core Criteria (The number of criteria varies by
    outcome)
  • Impact/Results or Evidence of Success

21
Rubric Areas
  • Executive Summary and Consortium Impact
  • Foreword
  • Consortium Management
  • Description
  • Core Criteria
  • Strategic Plan, Consortium Structure/Network
    (Internal), Diversity, Consortium Operations,
    Resource Management, Collaborations and
    Partnerships Outside the Consortium
  • Impact/Results

22
Rubric Areas
  • NASA Education Outcome 1
  • Fellowships/Scholarship Program
  • Research Infrastructure
  • Higher Education
  • NASA Education Outcome 1 National Program
    Emphases
  • Diversity
  • Workforce Development
  • Longitudinal Tracking
  • Minority Serving Institutions

23
Rubric Areas
  • NASA Education Outcome 2
  • Precollege Programs
  • NASA Education Outcome 3
  • Public Service Program

24
Scoring Process
  • Review the rubric for the section of the PPR
    Report being assessed.
  • Read the PPR Report section being assessed.
  • Consider CMIS data and other data sources
    associated with the section being assessed.
  • Using the rubric, make a qualitative judgment on
    whether the consortium is excellent, good, or
    poor.
  • After a qualitative judgment is made on the level
    of the consortium, make a quantitative judgment
    on what integer score to assign to the consortium
    within that level.
  • Close the loop by re-assessing your rating in
    light of both the qualitative and quantitative
    judgments; this is the italicized statement
    within each rubric qualitative area. (A sketch of
    this consistency check follows this list.)
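As a rough illustration of how the two judgments fit together (this sketch is not part of the review site; the names and data structure are assumptions), the qualitative level chosen first constrains which integer scores are available second, and closing the loop amounts to confirming that the two judgments agree:

    # Integer scores available within each qualitative level,
    # per the Categories/Scale table shown earlier.
    LEVEL_SCORES = {
        "Excellent": (6, 7),
        "Good": (3, 4, 5),
        "Poor": (1, 2),
        "Missing": (0,),
    }

    def record_rating(level, score):
        """Close the loop: confirm the integer score is consistent with
        the qualitative level before recording the rating."""
        if score not in LEVEL_SCORES[level]:
            raise ValueError(f"{score} is not a valid score for a {level} rating")
        return {"level": level, "score": score}

For example, record_rating("Good", 4) succeeds, while record_rating("Good", 7) raises an error and prompts the reviewer to revisit one of the two judgments.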

25
Scoring Process
1. Qualitative Judgment
2. Quantitative Judgment
3. Close the Loop
26
Comments
  • Statement Guidelines
  • Maintain Self-Anonymity
  • Avoid Referencing Individuals by Name
  • State Complete Thoughts
  • Make Specific, Concise Comments
  • Maintain Objectivity in Positive and Negative
    Comments

27
Data
  • CMIS Data may be a Starting Point
  • The CMIS Data may not be representative of all
    data that are presented in the PPR Report.
  • A consortium may cite data that are outside the
    realm of the variables included in the CMIS
    database. These data should be considered in
    addition to any available CMIS data.

28
Reviewer Expertise
  • Poor or Good? Good or Excellent?
  • It is possible that a consortium in any PPR
    Report area being judged has characteristics of
    poor, good, and/or excellent performance.
  • The expertise of the reviewer is the deciding
    factor in these cases. The reviewer makes a
    judgment, based on the preponderance of the
    available evidence, as to whether the consortium
    is excellent, good, or poor.

29
Not Rated
  • NR?
  • It is possible that the consortium-specific
    elements were not a focus of the consortium. As
    noted in the PPR Report Guidelines, the
    consortium is to state explicitly in the
    description if an element was not applicable.
  • If the Description provides an explicit
    statement that an element was not a focus, the
    consortium-specific rubric will be rated as NR.

30
Not Rated
  • Is a Consortium's Evaluation Harmed by NRs?
  • No. NRs will not be included in the assessment
    compilations of criteria and impact/results (a
    short sketch of the arithmetic follows).
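The arithmetic consequence of this rule can be shown with a brief sketch (illustrative only, not taken from the review system; the function name is an assumption): NR entries are dropped before any compilation is computed, so they neither raise nor lower a consortium's assessment.

    def compile_scores(scores):
        """Average a list of rubric scores, excluding NR entries.
        Returns None if every entry is NR."""
        rated = [s for s in scores if s != "NR"]
        if not rated:
            return None
        return sum(rated) / len(rated)

For example, compile_scores([6, 7, "NR", 5]) averages only the three rated entries (6.0), and compile_scores(["NR", "NR"]) returns None rather than a penalty score.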

31
Demographics
  • Impacts Can Differ Based on State Demographics.
  • The demographics of the state may make it appear
    that the impact a consortium is having is
    insufficient based on the amount of resources
    dedicated to the area.
  • Refer to the PPR Report Foreword to review the
    described consortium landscape.
  • If a reviewer is from a state with demographics
    much different from those of the consortium being
    reviewed, the reviewer should utilize his/her
    expertise but not apply an unfair bias against the
    consortium. (This refers to the context guiding
    principle.)

32
Grant Types
  • The PPR Report Guidelines (pages 20-21) outline
    the Space Grant Types.
  • Designated
  • Program Grant
  • Capability Enhancement
  • An in-depth understanding of the grant types is
    required so that a consortium's PPR receives a
    fair review. (This refers to the context guiding
    principle.)

33
Consortium Concurrence
  • The reviewer provides no rating related to
    concurrence
  • The Executive Panel will review this requirement

34
Summary
  • The Guiding Principles create a foundation for
    the reviewers.
  • Use of the rubric standardizes scoring for the
    reviewers.
  • Scoring
  • Qualitative (Excellent, Good, Poor, Incomplete)
  • Quantitative 7-6 (Excellent), 5-3 (Good), 2-1
    (Poor), 0 (Incomplete)
  • Reviewers are the experts invited or selected to
    use their knowledge as a basis to make judgments.

35
Activity Comment Evaluation
  • The following slides contain actual reviewer
    comments from the 15th year evaluation.
  • Consider the guidelines reviewed earlier and
    judge whether each comment is appropriate or
    inappropriate.

36
Activity Comment Evaluation
  • Effective Comments
  • The translation of science.nasa.gov into Spanish
    provides on-going impact to the Hispanic
    community in STATE and around the world.
    Excellent examples of collaboration with NASA
    Center. Very impressive impact through
    pre-college efforts -- not only bringing the
    Program to STATE, but the design and oversight of
    statewide professional development. This clearly
    demonstrates alignment and coordination with the
    state systemic reform efforts.
  • While the purpose is clear, the description was
    lacking a discussion of measurable objectives
    with clearly defined metrics. The description was
    lacking a discussion of assessment and evaluation
    plan. According to the CMIS data, there has not
    been an underrepresented minority student award
    since 1998. In fact, according to CMIS, that's
    the only underrepresented minority student in
    five years. Student participation research and
    mentoring with field centers and industry is not
    as conclusive as it could be. The discussion is a
    bit too general and appears to center around
    outreach activities.

37
Activity Comment Evaluation
  • There is not much analysis of what the needs are
    and how the consortium is organizing its
    resources to best address those. I would
    recommend that the director should convene a
    planning group in his state, including the
    principals and one or two outside persons, and go
    through the planning process.

38
Activity Comment Evaluation
  • There is not much analysis of what the needs are
    and how the consortium is organizing its
    resources to best address those. (Appropriate
    comment) I would recommend that the director
    should convene a planning group in his state,
    including the principals and one or two outside
    persons, and go through the planning process.
    (Inappropriate comment: it is not the reviewer's
    role to make recommendations.)

39
Activity Comment Evaluation
  • The strategic implementation plan is clearly
    derived from the National program's strategic
    plan, promotes a variety of activities, and is
    effectively working to meet the needs of its
    citizens.
  • Strategic objectives clearly derived from
    National priorities. Evidence of analysis of
    state needs.
  • Very complete.

40
Activity Comment Evaluation
  • The strategic implementation plan is clearly
    derived from the National program's strategic
    plan, promotes a variety of activities, and is
    effectively working to meet the needs of its
    citizens. (Appropriate comment: states why the
    area is a strength)
  • Strategic objectives clearly derived from
    National priorities. (Appropriate comment)
    Evidence of analysis of state needs.
    (Inappropriate comment: does not provide a
    qualitative assessment)
  • Very complete. (Inappropriate comment: that's
    it?)

41
Activity Comment Evaluation
  • Both are appropriate and comprehensive comments
  • The translation of science.nasa.gov into Spanish
    provides on-going impact to the Hispanic
    community in STATE and around the world.
    Excellent examples of collaboration with NASA
    Center. Very impressive impact through
    pre-college efforts -- not only bringing the
    Program to STATE, but the design and oversight of
    statewide professional development. This clearly
    demonstrates alignment and coordination with the
    state systemic reform efforts.
  • While the purpose is clear, the description was
    lacking a discussion of measurable objectives
    with clearly defined metrics. The description was
    lacking a discussion of assessment and evaluation
    plan. According to the CMIS data, there has not
    been an underrepresented minority student award
    since 1998. In fact, according to CMIS, that's
    the only underrepresented minority student in
    five years. Student participation research and
    mentoring with field centers and industry is not
    as conclusive as it could be. The discussion is a
    bit too general and appears to center around
    outreach activities.

42
Activity Rubric Application
  • NASA Ties rubric from 15th Year Evaluation

NASA TIES: Relationships that have been established with and Enterprises for the purposes of implementation, coordination, communication, or dissemination
ASSOCIATED CMIS DATA: Program Summary Statistics Reports (5 Year Averages and 5 Year Cumulative). Also Fellowship and Scholarship Award Recipient Demographics, Research Participants, and Higher Education Participants
1-2 Poor: There is inconclusive evidence of existing relationships with and Enterprises. Plans to create relationships are not evident. If there is evidence of existing relationships, these relationships are disjointed or inconsistent and have no apparent goals. Evidence does not indicate a synthesis of the five-year evaluation period. Evidence is inconclusive because of contradictions between the data sources.
3-5 Good: There is evidence of existing relationships with and Enterprises. There is evidence that the relationships were established to assist the consortium in meeting program goals. Evidence indicates a synthesis of the five-year evaluation period. There is consistency between the data sources or there are minor inconsistencies.
6-7 Excellent: There is conclusive evidence of formalized, existing relationships with Enterprises. There is evidence that the relationships have developed into a partnership between the consortium and the NASA Centers and Enterprises that facilitates meeting consortium goals. There is evidence of products, processes, publications, or other accomplishments as a result of these relationships. Evidence indicates a synthesis of the five-year evaluation period that analyzes trends of the consortium. Evidence is conclusive because of the consistency between all data sources.
43
Activity Rubric Application
  • This consortium was rated as Excellent by all
    reviewers for this submission (Potential
    identifying information removed)
  • NASA Ties Strong ties exist between NASA
    Centers. The Consortium works with NASA
    through the Undergraduate Student Research
    Program (USRP). An employee from each Center,
    generally in the University Affairs Office, is
    assigned to work with staff as Center
    Coordinator for USRP. This relationship is
    strengthened throughout the program cycle as
    staff work closely with Center Coordinators on
    the application review and selection process,
    program marketing efforts, student placement and
    evaluation process formally became a.. member in
    July 2003 and serve on our Advisory Council. A
    partnership exists with In this effort, we
    also work The Consortia has funded a position
    We continue our relationship by funding one or
    two students each year. Additionally, we work
    with experiments through two universities.
    NASA supports Consortium projects and
    supported an project. We manage the Program
    for NASA and theEnterprise. Our ties to are
    strengthened through other joint educational
    projects. provides a administrative
    coordinator slot for NASA . was a supporter of
    the Experiment Program for which we sponsored
    educators. Our working network with NASA Centers
    continues to expand as our program grows.

44
Activity Rubric Application
  • Why Excellent?
  • NASA Ties Strong ties exist between NASA
    Centers. The Consortium works with NASA
    through the Undergraduate Student Research
    Program (USRP). An employee from each Center,
    generally in the University Affairs Office, is
    assigned to work with staff as Center
    Coordinator for USRP. This relationship is
    strengthened throughout the program cycle as
    staff work closely with Center Coordinators on
    the application review and selection process,
    program marketing efforts, student placement and
    evaluation process formally became a.. member in
    July 2003 and serve on our Advisory Council. A
    partnership exists with In this effort, we
    also work The Consortia has funded a position
    We continue our relationship by funding one or
    two students each year. Additionally, we work
    with experiments through two universities.
    NASA supports Consortium projects and
    supported an project. We manage the Program
    for NASA and theEnterprise. Our ties to are
    strengthened through other joint educational
    projects. provides a administrative
    coordinator slot for NASA . was a supporter of
    the Experiment Program for which we sponsored
    educators. Our working network with NASA Centers
    continues to expand as our program grows.

Award Levels and Amounts
Awardees                1998    1999    2000    2001    2002    Total    5 Yr. Average
Total Awards            65      69      65      66      53      318      64
Average Award Amount    10,312  12,195  12,764  13,156  14,375  62,802   12,560
6-7 Excellent: There is conclusive evidence of formalized, existing relationships with Enterprises. There is evidence that the relationships have developed into a partnership between the consortium and the NASA Centers and Enterprises that facilitates meeting consortium goals. There is evidence of products, processes, publications, or other accomplishments as a result of these relationships. Evidence indicates a synthesis of the five-year evaluation period that analyzes trends of the consortium. Evidence is conclusive because of the consistency between all data sources.
45
Site Review
Log into the review site at
https://secure.spacegrant.org/20th/review/
Enter your email and password here
46
Site Review
Logging in brings you to the review summary page.
This page displays the consortia you will review.
Click on a consortium name to enter your scores
and comments. Scores you have entered and saved
will be displayed; scores you still need to enter
will be grayed out.
47
Site Review
These links allow the reviewer to advance to
other rubric sections. Select Review Summary to
return to the summary page. Enter your rating by
selecting the radio button.
48
Site Review
You must press Save Page to save your data. If
you return to the review summary or select the
next or previous page without saving, your data
is not saved.
49
Site Review
The Submit All Program Performance and Results
Reviews button is at the bottom of the Review
Summary page. Select this button only when you
have completed all reviews. You close your review
process when you select this button.
50
Questions?
  • Content-related questions:
    Katherine.M.Pruzan@nasa.gov
  • Technical questions:
    Mark.Fischer@spacegrant.org