Title: Developing Implementation Evaluation Models
1 Developing Implementation Evaluation Models
- To Provide Assistance to the National Science Foundation
2 Catherine Callow-Heusser
- Project Director, Co-PI
- Evaluation Capacity Building Project
- An NSF-Funded Research, Evaluation, and Technical Assistance (MSP-RETA) Project
3 Goals of NSF's MSP Program
- The Math and Science Partnership (MSP) program is a major research and development effort that supports innovative partnerships to improve K-12 student achievement in mathematics and science.
- MSP projects are expected to both raise the achievement levels of all students and significantly reduce achievement gaps in the mathematics and science performance of diverse student populations.
- Successful projects serve as models that can be widely replicated in educational practice to improve the mathematics and science achievement of all the Nation's students.
- (NSF's MSP RFP, http://www.nsf.gov/pubs/2003/nsf03605/nsf03605.htm)
4 MSP's Five Key Characteristics
- Partnership-Driven
- Higher Ed, K-12, Others
- Teacher Quality, Quantity, and Diversity
- Challenging Courses and Curricula
- Evidence-Based Design and Outcomes
- Institutional Change and Sustainability
5 Math-Science Partnership Program
[Diagram: MSP Funding/Intervention -> Student Achievement]
6 Inside the MSP Black Box
[Diagram: the "black box" between MSP goals and increased student success in math and science, containing components such as Professional Development, Community Involvement, Mentoring, Partnerships, Recruitment, Challenging Curriculum, Teacher Retention, Teacher Leaders, Universities, Tutoring, Summer Workshops, Pre-Service Redesign, K-12, Scientists/Engineers, and Business]
7 Westat (2003). http://www.mspinfo.com/Source/Chap9_Evidence_and_Evaluation.asp
8 Example from MSP Strategic Plan
- Goal
- To increase student achievement and reduce achievement gaps in science and mathematics for all preK-12 students in partner school districts.
- Strategies for achieving the goal
- Work with districts to develop and implement strategic plans for improving math and science achievement and reducing achievement gaps.
- Work with districts to develop internal leadership structures and practices, among teacher-leaders, principals, and district staff, to improve teaching of math and science.
- Provide well-designed, continuing professional development to help teachers learn new content and practices, become more attuned to students' thinking, and use new curriculum materials aligned with state and national standards.
9 Components in the Black Box
[Diagram: MSP Funding/Intervention -> Professional Development and Curriculum -> Student Achievement]
10 Simplified Theory of Action for Example
[Diagram: MSP funding/intervention, together with district resources, supports professional development, curriculum, leadership, recruitment/retention activities, and family/community involvement; these shape teacher knowledge and practice, which drives student learning and, ultimately, student achievement]
11 Implementation Evaluation
- Definition (Scriven, 1991): mere monitoring of program delivery
- Definition (Frechtling, 2002; GAO, 1998): assess whether the project is being conducted as planned, e.g., fidelity of implementation
- Ensure the program and its components are operating according to the proposed plan or description
- Monitor and evaluate well-articulated activities and processes in the black box
- A process is a series of causally linked events or changes taking place over time (Scriven)
12 Why Implementation Evaluation?
- Ensure that activities are implemented as PLANNED in a timely manner.
- Indicators are based on PLANS for project activities--PLANS that
- Explain the project's rationale
- Document the context in which a project operates
- Describe the planned activities and processes
- Identify potential side effects
13 Implementation Evaluation
- Answers questions such as (Westat, 2003):
- Were the appropriate participants selected and involved in the planned activities?
- Do the activities and strategies match those described in the plan? If not, are the changes in activities justified and described?
- Were the appropriate staff members hired and trained, and are they working in accordance with the proposed plan? Were the appropriate materials and equipment obtained?
- Were activities conducted according to the proposed timeline? By appropriate personnel?
- Was a management plan developed and followed?
(A minimal code sketch of such a monitoring checklist follows.)
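These questions lend themselves to a simple tracking structure. Below is a minimal sketch in Python, assuming a hypothetical Indicator record and fidelity_report helper (the names are illustrative, not from Westat): each question is tracked against the plan, and any deviation must carry a documented justification.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One implementation-fidelity question, tracked against the plan."""
    question: str
    on_plan: bool            # implemented as planned?
    justification: str = ""  # explanation required when off plan

def fidelity_report(indicators: list) -> None:
    """Print each indicator's status; flag undocumented deviations."""
    for ind in indicators:
        status = "as planned" if ind.on_plan else "DEVIATION"
        print(f"[{status}] {ind.question}")
        if not ind.on_plan and not ind.justification:
            print("  -> deviation is undocumented; follow up needed")

# Illustrative entries based on the Westat (2003) questions above.
checklist = [
    Indicator("Were the appropriate participants selected?", True),
    Indicator("Do activities match those described in the plan?", False,
              justification="Summer workshop shortened; content preserved"),
    Indicator("Were activities conducted on the proposed timeline?", False),
]
fidelity_report(checklist)
```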
14 Models for Describing and Monitoring
- Program Logic Modeling
- Picture of how a program works, including the theory and assumptions underlying the program
- Logic Model Development Guide
- W. K. Kellogg Foundation, http://www.wkkf.org
- Key Evaluation Checklist
- Checklist for evaluating/reporting on programs and evaluations of them
- M. Scriven, http://www.wmich.edu/evalctr/checklists/kec.htm
- Others
15 Program Logic Modeling
- What?
- Systematic and visual method for presenting relationships among program resources, activities, and anticipated changes or results.
- Why?
- Provides a "road map" describing the sequence of related events/processes that connect the need for the program with the desired results.
(A minimal data-structure sketch of a logic model follows.)
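To make that structure concrete, here is a minimal sketch, assuming a plain-Python representation (the LogicModel class and its road_map method are illustrative assumptions, not from the Kellogg guide): resources feed activities, which yield outputs and, over time, outcomes.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal logic model: resources -> activities -> outputs -> outcomes."""
    resources: list = field(default_factory=list)   # inputs
    activities: list = field(default_factory=list)  # processes
    outputs: list = field(default_factory=list)     # direct products
    outcomes: list = field(default_factory=list)    # short/long-term results

    def road_map(self) -> str:
        """Render the model as the 'road map' sequence described above."""
        stages = [("Resources", self.resources),
                  ("Activities", self.activities),
                  ("Outputs", self.outputs),
                  ("Outcomes", self.outcomes)]
        return "\n".join(f"{name}: {', '.join(items) or '(none)'}"
                         for name, items in stages)

# Illustrative MSP-style content drawn from these slides.
model = LogicModel(
    resources=["NSF funding", "Higher Ed faculty", "K-12 districts"],
    activities=["Professional development", "Curriculum redesign"],
    outputs=["Teachers trained", "New courses offered"],
    outcomes=["Improved teacher practice", "Higher student achievement"],
)
print(model.road_map())
```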
16 The Importance of Logic Modeling
- Why programs often run into trouble
- Lack of a well-articulated, research-based, experience-based theory or road map.
- Failure to follow the road map during the trip!
- If program planners don't have any hypotheses guiding them, their potential for success is limited, as is their potential for learning; the program is probably in trouble! (1)
- Why evaluations often run into trouble
- Lack of a well-articulated, research-based, experience-based theory or road map.
- The bane of evaluation is a poorly designed program! (1)
- (1) Kellogg (2001); McLaughlin (2003)
17 University of Wisconsin-Extension, Program Development and Evaluation, http://www.uwex.edu/ces/pdande/progdev/index.html
18 University of Wisconsin-Extension, Program Development and Evaluation, http://www.uwex.edu/ces/pdande/progdev/index.html
19 Westat (2003). http://www.mspinfo.com/Source/Chap9_Evidence_and_Evaluation.asp
20 MSP Project Logic Models
- Show relationships and links between
- Resources (inputs) from NSF, Higher Education, K-12, Partners
- Activities and processes that will address MSP's five key characteristics
- Outcomes: short, intermediate, and long term
- Complex!
- Nested, multiple levels or depths
- Require thoughtful, thorough, rigorous, systematic planning and development
21 Key Evaluation Checklist
- What?
- Checklist of necessary items to be addressed (iteratively) in a program evaluation.
- Why?
- Avoid invalidity in a program evaluation.
- Align proposal/plan and evaluation.
22 Key Evaluation Checklist Components
- Description
- Background, context
- Consumers
- Resources
- Values
- Processes
- Outcomes
- (Used for Implementation Evaluation)
- Costs
- Comparisons with alternative options
- Generalizability
- Significance
- Recommendations
- Report
- Meta-evaluation
(A code sketch of the checklist components follows.)
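As a minimal sketch, assuming a Python rendering (the component names come from the slide above; the review_pass helper and its iterative use are illustrative assumptions, not from Scriven):

```python
# Key Evaluation Checklist components, in the order listed above.
KEC_COMPONENTS = [
    "Description", "Background & context", "Consumers", "Resources",
    "Values", "Processes", "Outcomes", "Costs",
    "Comparisons with alternative options", "Generalizability",
    "Significance", "Recommendations", "Report", "Meta-evaluation",
]

def review_pass(notes: dict) -> list:
    """Return components not yet addressed; because the KEC is applied
    iteratively, repeat passes until this list is empty."""
    return [c for c in KEC_COMPONENTS if not notes.get(c)]

# Illustrative notes for a partially planned evaluation.
notes = {
    "Description": "MSP partnership professional-development program",
    "Processes": "PD delivery and curriculum use (implementation evaluation)",
}
print("Still to address:", review_pass(notes))
```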
23 Key Evaluation Checklist
- Background and Context
- Historical, contemporary, projected settings
- Stakeholders
- Relevant legislation, funders' policy changes
- Underlying rationale (e.g., program theory, political logic)
- Review of previous research and evaluations
24 Key Evaluation Checklist
- Description and Definitions
- Definitions of technical terms
- Official description of program and components
- Detailed description for replication
- Goals, mileposts, benchmarks
25 Key Evaluation Checklist
- Processes
- Assessment of the quality of everything significant that happens or applies before true outcomes emerge
- Causally relevant context and support
- Goals, design, degree of implementation, management, quality of work, activities, procedures
- Quality of inputs (i.e., logic model resources)
- Intermediate results (i.e., logic model outputs)
26 Key Evaluation Checklist Applied to MSP Projects
- Goes from
- "What's So?"
- Step I: Fact-finding phase
- To "So What?"
- Step II: Combining facts with values that bear on those facts
- Complex!
- Iterative, multi-step
- Requires thoughtful, thorough, rigorous, systematic planning and development
27 Complexity of Implementation Evaluation Models
- Implementation evaluation requires
- Accurate description of project contexts, activities, processes, and the relationships between them
- Realistic benchmarks, measurable indicators
- Regular monitoring of project plans, activities, processes, timelines
- Complex!
- Nested designs with multiple levels or depths
- Iterative, multi-step methods for planning and documentation
- Requires thoughtful, thorough, rigorous, systematic planning, development, and evaluation
28 USU's MSP-RETA Project
- Provide evaluation technical assistance to MSP projects
- Collect evaluation needs assessment information
- Build upon existing evaluation models or processes to develop evaluation processes that
- Address the complexity of MSP projects
- Help identify and measure causal effects
- Incorporate relevant contextual factors
- Involve stakeholders
29 Culture of Evidence
- In particular, we are working to help MSP projects build a "Culture of Evidence" to meet NSF's goal of identifying successful projects that will serve as models that can be widely replicated in educational practice to improve the mathematics and science achievement of all the Nation's students.
30 References
- Frechtling, J. (2002). The 2002 user-friendly handbook for project evaluation. Washington, DC: NSF. Document Number 02-057.
- GAO. (1998). Performance measurement and evaluation: Definitions and relationships. Washington, DC: U.S. GAO. http://www.gao.gov/special.pubs/gg98026.pdf
- McLaughlin, J. A. (October, 2003). Logic modeling: A tool for describing and aligning your program to your monitoring and evaluation. A presentation at USU's MSP Building Evaluation Capacity of STEM/MSP Projects Workshop, Baltimore, MD.
- Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park, CA: Sage.
- Scriven, M. (2002). Key evaluation checklist. Kalamazoo, MI: Western Michigan University, The Evaluation Center. http://www.wmich.edu/evalctr/checklists/kec.htm
- University of Wisconsin-Extension, Program Development and Evaluation. (2002). Enhancing program performance with logic models. Madison, WI: Author. http://www.uwex.edu/ces/pdande/ and http://www1.uwex.edu/ces/lmcourse/
- W. K. Kellogg Foundation. (2001). Logic model development guide. Battle Creek, MI: Author.
- Westat, Inc. (2003). Developing math and science partnerships: Toolkit. Rockville, MD: Author. http://www.mspinfo.com/Source/toolkit.asp
31 Contact Information
- USU's MSP-RETA Evaluation Capacity Building Project
- PI, Project Director: Catherine Callow-Heusser (cheusser@cc.usu.edu)
- Co-PI: Jim Dorward (jimd@cc.usu.edu)
- Co-PI: Steve Lehman (s.lehman@usu.edu)
- PI (retired): Blaine Worthen
- Consortium for Building Evaluation Capacity
- http://www.usu.edu/cbec/
- 2810 Old Main Hill, Utah State University, Logan, UT 84322-2810
- Phone: 435-797-1111; FAX: 435-797-1448; cbec@cc.usu.edu