Title: How to Develop a Project Evaluation Plan
1. How to Develop a Project Evaluation Plan
- Pat Gonzalez
- Office of Special Education Programs
- Patricia.Gonzalez@ed.gov
- 202-245-7355
2. Initial Steps in Developing the Outcome Evaluation Plan
- Identify the program's or project's mission and/or goals and objectives
- Identify all relevant and important outcomes that should be evaluated
- Select outcome indicators
- Identify data sources and data collection procedures
3. Step 1: Identify mission and/or goals and objectives
- Clarify the expectations and priorities of key stakeholders/collaborators
- Get a reasonable level of agreement on goals, strategies or activities, and outcomes
- Develop goals: broad statements generally describing desired outcomes
- Develop objectives: measurable statements about outcomes (target performance) expected to be accomplished in a given time frame
4. More on Objectives
- Objectives require detail and must include a target group (who), what is to be done (activities), a time frame (when), and a target performance (how much).
- Example: 80% (how much) of the 300 participating teachers (who) will indicate that the Transition Toolkit is useful, relevant, and of high quality (what) on the second-year follow-up survey (when).
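A lightweight way to keep these four components together is to record each objective as structured data. The sketch below is purely illustrative (the class and field names are not part of any SPDG requirement); it simply restates the slide's example objective in Python:

```python
from dataclasses import dataclass

@dataclass
class Objective:
    """One measurable objective: who, what, when, and how much."""
    target_group: str        # who
    activity: str            # what is to be done
    time_frame: str          # when performance is measured
    target_performance: str  # how much (the numeric target)

# The slide's example objective, restated as data
toolkit_objective = Objective(
    target_group="300 participating teachers",
    activity="rate the Transition Toolkit as useful, relevant, and high quality",
    time_frame="second-year follow-up survey",
    target_performance="80% of participants",
)

print(toolkit_objective)
```

Writing objectives this way makes it obvious when one of the four required parts is missing.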
5. Step 2: Identify relevant and important project outcomes
- Short-term outcomes typically involve learning: awareness, knowledge, attitudes, skills
- Intermediate outcomes typically involve action: behavior, practice, policies
- Long-term outcomes typically involve conditions: social, economic, civic, environmental
6. REMEMBER!
- Focus on short- and intermediate-term outcomes that can be achieved within the grant period
7. Outcomes are not Outputs!
- Outputs are the direct products of program activities, usually measured by volume, such as the number of classes taught or the number of participants served
8. Sources of Information on Program Outcomes
- Legislation and regulations
- Purpose statements contained in the RFP
- Strategic plans, SPPs or APRs
- State data systems
- Program descriptions or annual reports
- Discussions with stakeholders or collaborators
- Complaint information
- Performance measures from government agencies or
other programs
9. Step 3: Select outcome indicators
- Each identified outcome needs to be translated into one or more outcome indicators that state specifically what is to be measured (e.g., the % of teachers passing a test); a worked example follows
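Once the data are in hand, an indicator phrased this way reduces to a single calculation. A minimal sketch, assuming hypothetical scores and a hypothetical pass mark of 70 (the real cutoff would come from the assessment itself):

```python
# Hypothetical test scores for participating teachers (0-100 scale)
scores = [88, 72, 95, 61, 79, 84, 90, 58, 77, 81]
PASS_MARK = 70  # assumed cutoff for illustration

passed = sum(1 for s in scores if s >= PASS_MARK)
indicator = 100 * passed / len(scores)  # percent of teachers passing

print(f"{passed} of {len(scores)} teachers passed ({indicator:.0f}%)")
```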
10. Checklist for Outcome Indicators
- Does each indicator measure some important aspect of the outcome?
- Does each indicator start with a numerical designation, such as "incidence," "percentage," "rate," or "proportion of"?
- Does your list of indicators cover all your outcomes?
11. Checklist for Outcome Indicators (continued)
- Does your list of indicators cover quality characteristics, such as timeliness of services?
- Is the wording of your indicator sufficiently specific? Avoid terms like "appropriate."
- What is the feasibility and cost of collecting the indicator? Note that sometimes costly indicators are the most important and should be retained.
12. Step 4: Identify data sources and data collection procedures
- Determine whether a research design can be used to evaluate effectiveness. There are several quasi-experimental designs that can be readily applied to program evaluation; one is sketched below.
- Identify data sources, such as extant agency/program records, performance assessments, surveys, observer ratings, and interview data.
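As one quasi-experimental example, a pretest-posttest design with a nonequivalent comparison group estimates the program effect as the difference between the two groups' gains. The sketch below uses invented scores and illustrates the analysis logic only; it is not the presentation's prescribed method:

```python
# Hypothetical pre/post survey scores (1-5 scale) for a
# pretest-posttest design with a nonequivalent comparison group.
program_pre  = [2.1, 2.4, 1.9, 2.6, 2.3]
program_post = [3.4, 3.8, 3.1, 3.9, 3.5]
comparison_pre  = [2.2, 2.0, 2.5, 2.3, 2.4]
comparison_post = [2.4, 2.3, 2.6, 2.5, 2.4]

def mean(xs):
    return sum(xs) / len(xs)

program_gain = mean(program_post) - mean(program_pre)
comparison_gain = mean(comparison_post) - mean(comparison_pre)

# The effect estimate is the difference in gains between the groups
print(f"Program gain:     {program_gain:.2f}")
print(f"Comparison gain:  {comparison_gain:.2f}")
print(f"Estimated effect: {program_gain - comparison_gain:.2f}")
```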
13. Considerations in Determining Data Collection Procedures
- When will data be collected? Consider your design:
  - When entering the program
  - When completing the program
  - At a fixed interval after entering
  - At a fixed interval after completing
  - A combination of the above
14. Considerations in Determining Data Collection Procedures (continued)
- Who is considered a participant?
- Include all participants or a sample? (A sampling sketch follows this list.)
- Who will collect the data?
- How will the data be analyzed?
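If surveying every participant is infeasible, a simple random sample is one answer to the "all participants or a sample?" question. A minimal sketch, where the roster echoes the 300 teachers from the earlier example and the sample size of 60 is an assumption, not an SPDG requirement:

```python
import random

# Hypothetical roster of participating teachers
participants = [f"teacher_{i:03d}" for i in range(1, 301)]

# Draw a simple random sample; the fixed seed makes the draw reproducible
random.seed(42)
sample = random.sample(participants, k=60)

print(f"Sampled {len(sample)} of {len(participants)} participants")
```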
15. The Evaluation Plan: Implementation Questions
- Using the information concerning goals, objectives, strategies/activities, and outcomes, develop evaluation questions on implementation
- Were the activities completed as intended and on time, and did they result in the planned outputs? These questions provide a feedback loop for ongoing project monitoring.
16. The Evaluation Plan: Outcome Questions
- Using the information concerning goals, objectives, strategies/activities, and outcomes, develop evaluation questions on impact/effectiveness
- How well did the activities address the objectives as measured by the indicators? What changed for the target group, either over time or in comparison to another group?
17. The Evaluation Plan: Methods
- For each evaluation question:
  - Is a research design feasible? If so, which ones?
  - What are the data sources?
  - What methods will be used to collect the data?
  - How might the data be analyzed and reported?
18. The Evaluation Plan: Timelines
- For each evaluation question:
  - When will data be collected?
  - When will data be reported or used?
19. The Evaluation Plan: Personnel Responsible
- Who is responsible for data collection, analysis, and reporting at each point in the timeline? (A planning-matrix sketch follows.)
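Taken together, slides 17 through 19 describe a planning matrix: one row per evaluation question recording the design, data sources, schedule, and responsible person. A minimal sketch of that structure, with illustrative field names and a single invented row:

```python
from dataclasses import dataclass

@dataclass
class PlanRow:
    """One row of the evaluation plan matrix (fields are illustrative)."""
    question: str
    design: str
    data_sources: str
    collection_schedule: str
    reporting_schedule: str
    responsible: str

plan = [
    PlanRow(
        question="Did participating teachers find the Transition Toolkit useful?",
        design="Post-only follow-up survey",
        data_sources="Year-2 follow-up survey",
        collection_schedule="Spring, year 2",
        reporting_schedule="Annual performance report, year 2",
        responsible="Project evaluator",
    ),
]

for row in plan:
    print(f"{row.question} -> {row.responsible}, reports {row.reporting_schedule}")
```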
20. SPDG Evaluation Criteria
- The extent to which the methods of evaluation are thorough, feasible, and appropriate to the goals, objectives, and outcomes of the proposed project.
- The extent to which the methods of evaluation provide for examining the effectiveness of project implementation strategies.
21. SPDG Evaluation Criteria (continued)
- The extent to which the methods of evaluation include the use of objective performance measures that are clearly related to the intended outcomes of the project and will produce quantitative and qualitative data to the extent possible.
22. SPDG Evaluation Criteria (continued)
- The extent to which the methods of evaluation will provide performance feedback and permit assessment of progress toward achieving intended outcomes.
23. Performance Measures
- The percent of personnel receiving professional development through the SPDG Program based on scientific- or evidence-based instructional practices.
- The percent of SPDG projects that have implemented personnel development/training activities that are aligned with improvement strategies identified in their State Performance Plan.
24. Performance Measures (continued)
- The percent of professional development/training activities provided through the SPDG Program based on scientific- or evidence-based instructional/behavioral practices.
- The percent of professional development/training activities based on scientific- or evidence-based instructional/behavioral practices, provided through the SPDG Program, that are sustained through ongoing and comprehensive practices (e.g., mentoring, coaching, structured guidance, modeling, continuous inquiry).
25. Performance Measures (continued)
- In States with SPDG projects that have special
education teacher retention as a goal, the
Statewide percent of highly qualified special
education teachers in State-identified
professional disciplines (e.g., teachers of
children with emotional disturbance, deafness,
etc.), consistent with sections 602(a)(10) and
612(a)(14) of IDEA, who remain teaching after the
first three years of employment.
26. Performance Measures (continued)
- The percent of SPDG projects that successfully replicate the use of scientific- or evidence-based instructional/behavioral practices in schools.
- The percent of SPDG projects whose cost per personnel receiving professional development on scientific- or evidence-based practices is within a specified range. (A unit-cost sketch follows.)
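The final measure implies a straightforward unit-cost calculation. The figures and range below are invented for illustration; actual spending comes from project budgets and attendance records, and the acceptable range is set by the program office:

```python
# Hypothetical figures; the real values come from project records,
# and the acceptable range is defined by the program, not shown here.
pd_spending = 250_000.00   # dollars spent on evidence-based PD
personnel_trained = 400    # personnel who received that PD

cost_per_person = pd_spending / personnel_trained
ACCEPTABLE_RANGE = (300.0, 800.0)  # assumed range for illustration

within = ACCEPTABLE_RANGE[0] <= cost_per_person <= ACCEPTABLE_RANGE[1]
print(f"Cost per person trained: ${cost_per_person:,.2f} "
      f"({'within' if within else 'outside'} the specified range)")
```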