Transcript and Presenter's Notes

Title: Industrial Collaboration and Technology Transfer


1
A Model for Teaching Materials Evaluation:
Development and Testing of Interactive Computer
Simulations Modules for Undergraduate Education
Anne E. Donnelly, Adam Denny, Randy Switt,
University of Florida

2
Evaluation
  • Evaluation has varied definitions.
  • Accepted definition for NSF projects: systematic
    investigation of the worth or merit of an object
    (Joint Committee on Standards for Educational
    Evaluation)

Sunal & Thomas, UAB, astlc.ua.edu/NSF%20Project%20Eval%20and%20Assessment.ppt
3
How NSF Thinks About Evaluation!
  • A component that is an integral part of the
    research and development process
  • It is not something that comes at the end of the
    project
  • It is a continuous process that begins during
    planning
  • Evaluation is regularly and iteratively performed
    during the project and is completed at the end of
    the project
  • Different questions are appropriate at different
    phases of the project

Sunal & Thomas, UAB, astlc.ua.edu/NSF%20Project%20Eval%20and%20Assessment.ppt
4
NSF Expects
  • Include in the final report a separate section on
    the evaluation, its purpose, and what was found
  • In some cases a separate report or interim
    reports may be expected.
  • Grantee will clearly lay out an evaluation plan
    in the proposal
  • Refine the plan after the award

Sunal & Thomas, UAB, astlc.ua.edu/NSF%20Project%20Eval%20and%20Assessment.ppt
5
Aerosols Modules
6
Action Steps
Step One: Identify and include the evaluator at
the proposal writing stage.
Step Two: Identify stakeholders that should be
included in the evaluation process.
7
Stakeholders
  • Students
  • Faculty
  • Administrators
  • Funding Agency
  • Others

8
Action steps
Step One: Identify and include the evaluator at
the proposal writing stage.
Step Two: Identify stakeholders that should be
included in the evaluation process.
Step Three: Identify what question(s) the
evaluation will seek to answer.
9
Action Steps
Step One: Identify and include the evaluator at
the proposal writing stage.
Step Two: Identify stakeholders that should be
included in the evaluation process.
Step Three: Identify what question(s) the
evaluation will seek to answer.
Step Four: Identify the type of evaluation that
will be used.
10
Evaluation Timeline
[Timeline chart for the education program omitted]
11
Types of Evaluations
  • Front-end
  • What is the problem being addressed?
  • Who are the participants?
  • What do they need/want?

NSF 97-153 User-Friendly Handbook for Mixed
Method Evaluations
12
Types of Evaluations
Formative (includes implementation and process
evaluations)
  • To what extent do the activities and strategies
    match those described in the plan?
  • To what extent were the activities conducted
    according to the proposed timeline?
  • To what extent were the actual costs of project
    implementation in line with initial budget
    expectations?
  • To what extent are the participants moving toward
    the project goals?

NSF 97-153 User-Friendly Handbook for Mixed
Method Evaluations
13
Types of Evaluations
Summative
  • To what extent did the project meet its overall
    goals?
  • Was the project equally effective for all
    participants?
  • What components were the most effective?
  • What significant unintended impacts did the
    project have?

NSF 97-153 User-Friendly Handbook for Mixed
Method Evaluations
14
Action Steps
Step One: Identify and include the evaluator at
the proposal writing stage.
Step Two: Identify stakeholders that should be
included in the evaluation process.
Step Three: Identify what question(s) the
evaluation will seek to answer.
Step Four: Identify the type of evaluation that
will be used.
Step Five: Determine what type of data is
required.
15
Data
  • Quantitative
  • Questionnaires
  • Tests
  • Existing databases

16
Data
  • Qualitative
  • Observations
  • Interviews
  • Focus Groups

17
Types of Data
  • Cognitive
  • What did they learn?
  • Affective
  • How did they feel?
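To make the distinction concrete, here is a minimal Python sketch of how cognitive and affective items from a single questionnaire might be summarized; the item names, score scale, and numbers are hypothetical, not data from this project.

```python
from statistics import mean

# Hypothetical questionnaire records: a quiz score (cognitive - what did
# they learn?) and a 1-5 Likert rating (affective - how did they feel?).
responses = [
    {"quiz_score": 8, "enjoyment": 4},
    {"quiz_score": 6, "enjoyment": 5},
    {"quiz_score": 9, "enjoyment": 3},
]

avg_quiz = mean(r["quiz_score"] for r in responses)      # cognitive summary
avg_enjoyment = mean(r["enjoyment"] for r in responses)  # affective summary

print(f"Mean quiz score: {avg_quiz:.2f} / 10")
print(f"Mean enjoyment rating: {avg_enjoyment:.2f} / 5")
```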

18
Action Steps
Step One: Identify and include the evaluator at
the proposal writing stage.
Step Two: Identify stakeholders that should be
included in the evaluation process.
Step Three: Identify what question(s) the
evaluation will seek to answer.
Step Four: Identify the type of evaluation that
will be used.
Step Five: Determine what type of data is
required.
Step Six: Select the experimental design for
the evaluation.
19
Research Designs
  • Non-experimental: posttest only; pretest-posttest,
    one group
  • Quasi-experimental: time series; nonequivalent
    control; no randomization
  • Experimental: pretest-posttest with control group,
    random sample
  • Case study
Rigor!
20
Undergraduate Course Modules
  • Formative/Summative
  • Qualitative
  • Quantitative
  • Cognitive
  • Affective
  • Pretest-Posttest
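For the quantitative pretest-posttest strand of this design, learning gains are commonly checked with a paired-samples t-test. The sketch below is a minimal illustration using SciPy; the score lists are invented, and the .01 threshold simply mirrors the significance level reported later in this deck.

```python
from scipy.stats import ttest_rel  # paired-samples t-test

# Hypothetical pretest and posttest scores for the same ten students.
pretest  = [10, 12,  9, 14, 11, 13,  8, 15, 10, 12]
posttest = [14, 15, 13, 16, 15, 17, 12, 18, 13, 16]

# Paired test: did scores change significantly from pre to post?
result = ttest_rel(posttest, pretest)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
print("significant at the .01 level" if result.pvalue < 0.01
      else "not significant at the .01 level")
```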

21
Action Steps
Step One: Identify and include the evaluator at
the proposal writing stage.
Step Two: Identify stakeholders that should be
included in the evaluation process.
Step Three: Identify what question(s) the
evaluation will seek to answer.
Step Four: Identify the type of evaluation that
will be used.
Step Five: Determine what type of data is
required.
Step Six: Select the experimental design for
the evaluation.
Step Seven: Design the evaluation program
instruments.
22
Sample Goal: All students will see that
engineering is fun.
Sample Performance Objective: Students will list
three ways to measure particles.
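As a small, hypothetical illustration of how a measurable objective like the one above can be turned into a scoring rule, here is a Python sketch that counts how many accepted particle-measurement methods a student lists; the answer key and matching rule are assumptions for the example, not part of the original materials.

```python
# Hypothetical answer key for: "Students will list three ways to measure particles."
ACCEPTED_METHODS = {
    "optical particle counter",
    "cascade impactor",
    "condensation particle counter",
    "gravimetric filter",
}

def score_response(listed_methods, required=3):
    """Return (number of valid methods listed, whether the objective is met)."""
    matches = {m.strip().lower() for m in listed_methods} & ACCEPTED_METHODS
    return len(matches), len(matches) >= required

# Example: one student's free-response answer, already split into items.
count, met = score_response(["Cascade impactor", "gravimetric filter",
                             "optical particle counter"])
print(f"Valid methods listed: {count}; objective met: {met}")
```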
23
Objectives
Not measurable: Students will appreciate...
Students will understand... Students will learn...
Measurable: Students will explain. Students will
define. Students will compare. Students will
analyze.
24
Reminder!
Institutional Review Board
25
Action Steps
Step One: Identify and include the evaluator at
the proposal writing stage.
Step Two: Identify stakeholders that should be
included in the evaluation process.
Step Three: Identify what question(s) the
evaluation will seek to answer.
Step Four: Identify the type of evaluation that
will be used.
Step Five: Determine what type of data is
required.
Step Six: Select the experimental design for
the evaluation.
Step Seven: Design the evaluation program
instruments.
Step Eight: Implementation.
26
Summary
  • An evaluation expert should be included in the
    project team from the beginning
  • An adequate budget (about 10% of the total
    budget) to support the evaluation effort is
    recommended
  • The project team must work together to develop
    the goals of the materials that ultimately become
    part of the pre- and post-test instruments
  • Formative evaluation should be conducted with a
    variety of stakeholders
  • Feedback derived from the formative evaluation
    instruments must be provided to the materials
    developers to allow for appropriate modifications
  • Summative evaluation of the materials should be
    conducted with appropriate stakeholder groups as
    identified in the initial evaluation steps.

27
Test Results
Significant at the .01 level
28
Evaluation is a Process
[Diagram omitted: objectives feed into improved programs]
29
NSF Undergraduate Opportunities
  • Advanced Technological Education (ATE)
  • Arctic Research Opportunities
  • Centers of Research Excellence in Science and Technology (CREST) and
    HBCU Research Infrastructure for Science and Engineering (RISE)
  • Course, Curriculum, and Laboratory Improvement (CCLI)
  • Developing Global Scientists and Engineers (International Research
    Experiences for Students (IRES) and Doctoral Dissertation Enhancement
    Projects (DDEP))
  • Dynamics of Coupled Natural and Human Systems
  • Federal Cyber Service: Scholarship for Service
  • Graduate Research Fellowship Program
  • Historically Black Colleges and Universities Undergraduate Program
  • Integrative Graduate Education and Research Traineeship Program
  • Interdisciplinary Training for Undergraduates in Biological and
    Mathematical Sciences
  • International Research and Education Planning Visits and Workshops
  • National Science, Technology, Engineering, and Mathematics Education
    Digital Library
  • NSF Scholarships in Science, Technology, Engineering, and Mathematics
  • Partnerships for International Research and Education
  • Presidential Awards for Excellence in Science, Mathematics and
    Engineering Mentoring
  • Research Experiences for Undergraduates (REU)
  • Research in Disabilities Education
  • Research in Undergraduate Institutions
  • Robert Noyce Teacher Scholarship Program
  • Science, Technology, Engineering, and Mathematics Talent Expansion Program
  • Undergraduate Research Collaboratives
30
NSF Graduate Student Opportunities
  • Arctic Research Opportunities
  • Centers of Research Excellence in Science and Technology (CREST) and
    HBCU Research Infrastructure for Science and Engineering (RISE)
  • Collaboration in Mathematical Geosciences
  • Developing Global Scientists and Engineers (International Research
    Experiences for Students (IRES) and Doctoral Dissertation Enhancement
    Projects (DDEP))
  • Doctoral Dissertation Improvement Grants in the Directorate for
    Biological Sciences
  • Dynamics of Coupled Natural and Human Systems
  • East Asia and Pacific Summer Institutes for U.S. Graduate Students
  • Ethics Education in Science and Engineering
  • Federal Cyber Service: Scholarship for Service
  • Graduate Research Fellowship Program
  • Integrative Graduate Education and Research Traineeship Program
  • International Research and Education Planning Visits and Workshops
  • National Science, Technology, Engineering, and Mathematics Education
    Digital Library
  • NSF Astronomy and Astrophysics Postdoctoral Fellowships
  • NSF Graduate Teaching Fellows in K-12 Education
  • Pan-American Advanced Studies Institutes Program
  • Partnerships for International Research and Education
  • Postdoctoral Fellowships in Polar Regions Research
  • Presidential Awards for Excellence in Science, Mathematics and
    Engineering Mentoring
  • Undergraduate Research Collaboratives
31
Must-Have Components
  • Innovative / Transformative
  • Integrate Research and Education
  • Broader Impacts / Dissemination
  • Evaluation

32
  • Innovative / Transformative

Transformative research is ... research driven by
ideas that stand a reasonable chance of radically
changing our understanding of an important
existing scientific concept or leading to the
creation of a new paradigm or field of science.
Such research also is characterized by its
challenge to current understanding or its pathway
to new frontiers. Education is also a potent
source of transformation. NSF launched the
Integrative Graduate Education and Traineeship
Program (IGERT) some years ago in order to
catalyze a cultural change in graduate education.
The idea was to encourage innovative models for
graduate education in an environment of
collaborative, interdisciplinary research.
Dr. Arden L. Bement, Jr., Director, National
Science Foundation, January 4, 2007
33
Review Criteria No. 1
What is the intellectual merit of the proposed activity?
How important is the proposed activity to advancing knowledge and
understanding within its own field or across different fields?
How well qualified is the proposer (individual or team) to conduct the
project? (If appropriate, the reviewer will comment on the quality of the
prior work.)
To what extent does the proposed activity suggest and explore creative,
original, or potentially transformative concepts?
How well conceived and organized is the proposed activity?
Is there sufficient access to resources?
34
  • Integrate Research and Education

NSF staff will give careful consideration to the
following in making funding decisions:
Integration of Research and Education
One of the principal strategies in support of
NSF's goals is to foster integration of research
and education through the programs, projects and
activities it supports at academic and research
institutions.
These institutions provide abundant opportunities
where individuals may concurrently assume
responsibilities as researchers, educators, and
students, and where all can engage in joint
efforts that infuse education with the excitement
of discovery and enrich research through the
diversity of learning perspectives.
NSF 08-1, January 2008, Chapter III: NSF Proposal
Processing and Review
35
  • Broader Impacts / Dissemination

What are the broader impacts of the proposed activity?
How well does the activity advance discovery and understanding while
promoting teaching, training, and learning?
How well does the proposed activity broaden the participation of
underrepresented groups (e.g., gender, ethnicity, disability,
geographic, etc.)?
To what extent will it enhance the infrastructure for research and
education, such as facilities, instrumentation, networks, and partnerships?
Will the results be disseminated broadly to enhance scientific and
technological understanding?
What may be the benefits of the proposed activity to society?
http://www.nsf.gov/pubs/gpg/broaderimpacts.pdf
36
Broader Impacts Criterion Representative
Activities
  • Criterion: Broader Impacts of the Proposed
    Activity
  • Does the activity promote discovery,
    understanding, teaching, training and learning?
  • Does the proposed activity include participants
    of underrepresented groups?
  • Does it enhance the infrastructure for research
    and education?
  • Will the results be disseminated broadly to
    enhance scientific and technological
    understandings?
  • What are the benefits of the proposed activity to
    society?

http://www.nsf.gov/pubs/gpg/broaderimpacts.pdf
Sunal & Thomas, UAB, astlc.ua.edu/NSF%20Project%20Eval%20and%20Assessment.ppt
37
NSF staff will give careful consideration to the
following in making funding decisions:
Integrating Diversity into NSF Programs,
Projects, and Activities
Broadening opportunities and enabling the
participation of all citizens (women and men,
underrepresented minorities, and persons with
disabilities) are essential to the health and
vitality of science and engineering. NSF is
committed to this principle of diversity and
deems it central to the programs, projects,
and activities it considers and supports.
http://www.nsf.gov/pubs/policydocs/pappguide/nsf08_1/gpg_index.jsp
38
  • Evaluation

Not an option!
39
Example 1- ATE
  • Evaluation should demonstrate use in the
    classrooms and changes in practice of
    participating faculty and teachers.
  • Evaluation must include measures of increased
    student learning of content and processes and
    have input from employers.
  • The project's evaluation plan must describe how
    it will measure the effectiveness of efforts to
    recruit prospective K-12 teachers, transfer those
    students into four-year teacher preparation
    programs, enhance their understanding of advanced
    technologies used in the workplace, and enhance
    their ability to improve the technological
    literacy of their students.
  • Include evaluation of the center's products and
    services and their impact on student learning,
    and of the center's impact on employers and on
    the institutions that manage the center.

40
Example 2 - REU
Project Evaluation and Reporting. Describe the
plan to measure qualitatively and quantitatively
the success of the project in achieving its
goals, particularly the degree to which students
have learned and their perspectives on science,
engineering, or education research related to
these disciplines have been expanded.
Evaluation may involve periodic measures
throughout the project to ensure that it is
progressing satisfactorily according to the
project plan, and may involve pre-project and
post-project measures aimed at determining the
degree of student learning that has been
achieved. In addition, it is highly desirable to
have a structured means of tracking participating
students beyond graduation, with the aim of
gauging the degree to which the REU Site
experience has been a lasting influence in the
students' career paths. Although not required,
REU Site PIs may wish to engage specialists in
education research (from their organization or
another one) in planning and implementing the
project evaluation.
41
Most Common Strengths - CCLI
[Chart omitted; values shown in percent]
Pimmel & Sorby, 2008, http://www.nsf.gov/attachments/111743/public/Slides_ASEE_06-22-08.pdf
42
Most Common Weaknesses - CCLI
[Chart omitted; values shown in percent]
Pimmel & Sorby, 2008, http://www.nsf.gov/attachments/111743/public/Slides_ASEE_06-22-08.pdf
43
Finding an Evaluator
  • University setting - contact department chairs
    for availability of staff skilled in project
    evaluation
  • Independent contractors: department chairs,
    phone book, state departments, private
    foundations (Kellogg Foundation in Michigan), and
    other local colleges and universities will be
    cognizant of available services
  • Contact other researchers or peruse research and
    evaluation reports

Sunal & Thomas, UAB, astlc.ua.edu/NSF%20Project%20Eval%20and%20Assessment.ppt
44
Resources
  • NSF User-Friendly Handbook for Project Evaluation,
    http://www.nsf.gov/pubs/2002/nsf02057/nsf02057_1.pdf
  • Building Evaluation Capacity, Guide I and II,
    Campbell & Clewell, 2008,
    http://www.urban.org/url.cfm?ID=411651&renderforprint=1
  • Stepping Ahead: An Assessment Plan Development
    Guide, Gloria Rogers & Jean Sando, Rose-Hulman
    Institute of Technology

45
A GUIDE FOR PROPOSAL WRITING
NATIONAL SCIENCE FOUNDATION, DIRECTORATE FOR EDUCATION AND HUMAN RESOURCES,
Division of Undergraduate Education
http://www.nsf.gov/pubs/2004/nsf04016/nsf04016.pdf