Title: Summative Program Evaluations
Summative Program Evaluations
Used with the permission of John R. Slate
Definitions
- The use of data to determine the effectiveness of a unit, course, or program AFTER it has been completed.
- An evaluation that provides information about the overall effectiveness, impact, and/or outcomes of a program.
The goal of summative evaluation . . .
- to collect and present information needed for
summary statements and judgments about the
program and its value.
Used in . . .
- making terminal (end-of-experience) judgments of
- worth/value
- appropriateness of the experience
- goodness
- assessing the end results of an experience
Examples
- Termination of employment
- Final grade in a course
- Final report for a program that is ending
- Board report
The role of the evaluator
- to provide findings about the program which can be generalized to other contexts beyond the program being evaluated.
- to focus the evaluation on the primary features and outcomes of the program and on the policy questions which may underlie the program.
- to educate the audience about what constitutes good and poor evidence of program success.
- to admonish the audience about the foolishness of basing important decisions on a single study.
- to be a program advocate when merited.
- to convey to your audience as complete a depiction of the program's crucial characteristics as possible.
- to express opinions about the quality of the program.
- to be able to defend your conclusions.
Steps in Summative Evaluation
Phase A Set the Boundaries of the Evaluation
- Research the program
- Encourage trust, cooperation, and ownership
- Identify the audience/stakeholders
- Identify programmatic goals
Step 1 Identify the Sponsors/Audiences
Questions of sponsors/audiences . . .
- Is the program worth continuing?
- How effective was it?
- What does the program look like and accomplish?
- What conclusions could you draw about program
effectiveness?
Step 2 Find out as much as you can about the program.
- Collect and scrutinize written documents that describe the program.
- Talk to people.
Questions on the mind of the evaluator . . .
- What are the goals and objectives of the program?
- Does the program lead to goal achievement?
- How effective is the program?
- Are there more effective alternative programs
available?
- What are the most important characteristics, activities, services, staffing, and administrative arrangements of the program?
- Did the planned program occur?
Questions to be asked of the stakeholders . . .
- What are the most important outcomes of the program, including planned, serendipitous, and unanticipated ones?
- Which aspects of the program do you think wield the greatest influence in producing program outcomes?
- What are the most important organizational and administrative aspects of the program?
- Which parts of the program do you consider its most distinctive characteristics, those that make it unique among programs of its kind?
- With what types of students/clients, participants, or staff do you think the program is most/least effective?
- What is the theory of action behind the program?
- What are the policy alternatives if the program is found effective?
- How much expansion is possible?
- How might expansion sites be selected?
- What are the possible markets, communities, or sites for future expansion?
- What are the policy alternatives if the program is found ineffective?
- Would the program be cut back, eliminated, and/or refined?
Step 3 Develop a written description of the program as you understand it.
Step 4 Focus the evaluation
- Judge the adequacy of your written documents for describing the program
- Visualize what you might do as the evaluator
- Assess your own strengths and preferences
Step 5 Negotiate your role
- Agree generally about the basic outline of the evaluation
- Verify with the evaluation sponsor your general agreement about services and responsibilities
Phase B Select Appropriate Evaluation Methods
- Establish a common understanding with your study
sponsor(s) and program staff about the purposes
of the evaluation and about the nature of the
activities.
Step 1 Data Collection
- Determine appropriate sources for data collection
- Select data collection instruments
- Develop instruments where necessary
Step 2 Consolidate your concerns
- Time
- Money
- Availability of data collection sources
- Availability of staff and/or students/clients
Step 3 Plan the construction and purchase of instruments
- Schedule, schedule, schedule
- Field-testing
Step 4 Plan the data analysis you will perform
- Mostly quantitative
- SPSS (a comparable analysis is sketched in Python below)
- Mostly qualitative
- Themes
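The following is a minimal sketch of what the quantitative path above might look like, written in Python rather than SPSS, together with a simple tally of qualitative themes. The scores, group labels, and theme codes are hypothetical placeholders, not data from any actual evaluation.

```python
# Minimal sketch of a planned analysis (Python stand-in for SPSS).
# All scores and theme codes below are hypothetical placeholders.
from collections import Counter
from scipy import stats

# Quantitative: compare posttest scores of program vs. comparison group.
program_scores = [78, 85, 90, 72, 88, 81]      # hypothetical
comparison_scores = [70, 75, 83, 68, 77, 74]   # hypothetical
result = stats.ttest_ind(program_scores, comparison_scores)
print(f"Independent-samples t-test: t = {result.statistic:.2f}, p = {result.pvalue:.3f}")

# Qualitative: tally the themes coded from open-ended responses.
coded_themes = ["engagement", "staffing", "engagement",
                "resources", "engagement", "staffing"]  # hypothetical codes
print(Counter(coded_themes).most_common())
```

The same comparison could of course be run directly in SPSS; the sketch only illustrates the shape of the analysis being planned.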
Step 5 Choose evaluation design
- True Control Group (sketched in code after the design list below)
- Identify all participants
- Pretest all participants
- Randomly divide participants into one of two groups (Control or Experimental)
- Avoid confounding and contaminating variables
- Posttest both groups simultaneously.
- True Control Group with Posttest only
- Same as True Control Group, BUT no pretest is given.
- Hope that randomization ensures equivalence of groups.
- Non-equivalent Control Group
- Find a group similar to your experimental group to serve as the control
- Pretest both groups
- Investigate differences
- Posttest both groups
- Single Group Time Series
- Collection of scores from same group
- Several occasions prior to experiment
- Several occasions during experiment
- Several occasions after experiment
- Time Series with Non-Equivalent Control Groups
- Not randomly assigned
- Same procedure as above with both groups
- Before and After Design
- Informal comparisons
- Compare experimental group with national sample norms
- Examine school records
- Examine against predetermined standards
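To make the first design above (the True Control Group) concrete, here is a minimal sketch in Python of random assignment to control and experimental groups followed by a pretest/posttest gain comparison. The participant roster, scores, and assumed treatment effect are hypothetical placeholders.

```python
# Sketch of the True Control Group design: randomize, pretest, posttest.
# Roster, scores, and the assumed treatment effect are hypothetical.
import random
from statistics import mean

participants = [f"P{i:02d}" for i in range(1, 21)]  # hypothetical roster of 20

random.seed(42)                      # fixed seed so the assignment is reproducible
random.shuffle(participants)
half = len(participants) // 2
experimental, control = participants[:half], participants[half:]

# Pretest/posttest scores would come from the instruments selected earlier;
# here they are stubbed with simulated values keyed by participant ID.
pretest = {p: random.gauss(70, 5) for p in participants}
posttest = {p: pretest[p] + (8 if p in experimental else 2) + random.gauss(0, 3)
            for p in participants}

for label, group in (("Experimental", experimental), ("Control", control)):
    gain = mean(posttest[p] - pretest[p] for p in group)
    print(f"{label}: mean pre-to-post gain = {gain:.1f}")
```

For the posttest-only variant, the pretest step is simply dropped and the analysis compares posttest means alone, relying on randomization to equate the groups.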
- Step 6 Choose sampling strategy for conducting data collection (a sketch of Steps 6 and 7 follows below)
- Step 7 Estimate the cost of the evaluation
- Step 8 Come to final agreement about services and responsibilities
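A minimal sketch of Steps 6 and 7, assuming a simple random sample drawn from a client list and a flat per-respondent data-collection cost; the frame size, sample size, and dollar figures are invented for illustration only.

```python
# Sketch for Step 6 (sampling strategy) and Step 7 (cost estimate).
# Sampling frame, sample size, and costs are hypothetical placeholders.
import random

sampling_frame = [f"client_{i}" for i in range(1, 501)]  # hypothetical 500 clients
sample_size = 60

random.seed(7)
sample = random.sample(sampling_frame, sample_size)      # simple random sample

cost_per_respondent = 15.00   # hypothetical: administration and scoring per person
fixed_costs = 1200.00         # hypothetical: instruments, evaluator time, reporting
estimated_cost = fixed_costs + cost_per_respondent * sample_size

print(f"Sampled {len(sample)} of {len(sampling_frame)} clients")
print(f"Estimated data-collection cost: ${estimated_cost:,.2f}")
```

A stratified or purposive strategy would simply replace the `random.sample` call; the cost estimate then feeds the final agreement in Step 8.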
Phase C Collect and Analyze Information
- Step 1 Set deadlines
- Step 2 Set up the evaluation design
- Step 3 Administer instruments, score, and record data (see the sketch below)
- Step 4 Conduct the data analysis
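As an illustration of Step 3, here is a minimal sketch, in Python, of scoring a short survey (summing 1-5 item responses, reverse-scoring one item) and recording the totals to a CSV file. The item keys, the reverse-scored item, and the responses are hypothetical.

```python
# Sketch for Step 3: score instrument responses and record them to a file.
# Items, the reverse-scored item, and the responses are hypothetical.
import csv

REVERSE_SCORED = {"q3"}           # hypothetical reverse-keyed item (1-5 scale)

def score(response: dict[str, int]) -> int:
    """Sum the 1-5 item responses, flipping any reverse-scored items."""
    return sum(6 - v if item in REVERSE_SCORED else v
               for item, v in response.items())

responses = {                     # hypothetical raw data, keyed by participant
    "P01": {"q1": 4, "q2": 5, "q3": 2, "q4": 4},
    "P02": {"q1": 3, "q2": 2, "q3": 4, "q4": 3},
}

with open("scores.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["participant", "total_score"])
    for pid, resp in responses.items():
        writer.writerow([pid, score(resp)])
```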
Phase D Reporting the Findings
- Step 1 Plan the report
- Step 2 Choose a method of presentation
Be careful to . . .
- apply standards and criteria appropriately.
- use valid and reliable instruments (a reliability check is sketched below).
- be objective, not subjective (as in formative evaluation).
- make sure that program implementation has been
completed.
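One common way to back up the "reliable instruments" point above is an internal-consistency estimate such as Cronbach's alpha. The following is a minimal sketch; the five respondents and four 1-5 items are hypothetical.

```python
# Sketch: internal-consistency (Cronbach's alpha) check for an instrument.
# Item responses below are hypothetical (5 respondents x 4 items, 1-5 scale).
from statistics import variance

responses = [
    [4, 3, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 4, 4],
    [3, 3, 2, 3],
    [4, 4, 5, 4],
]  # each row = one respondent's answers to the 4 items

k = len(responses[0])
items = list(zip(*responses))                        # columns = items
item_vars = sum(variance(item) for item in items)    # sum of item variances
total_var = variance([sum(row) for row in responses])
alpha = (k / (k - 1)) * (1 - item_vars / total_var)  # Cronbach's alpha formula
print(f"Cronbach's alpha = {alpha:.2f}")
```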
Realize that . . .
- the program documentation that you generate may
be used for accountability, creating a lasting
description/impression of the program, and/or
creating a list of the possible causes of program
effects.
The critical characteristic is . . .
- to provide the best possible information that could have been collected under the circumstances, and to ensure that this information meets the credibility requirements of the audience.
In closing, remember . . .
- The summative evaluation is most often conducted at the conclusion of the program to provide potential consumers with judgments about the program's worth or merit.
- The more skeptical your audience, the greater the necessity for providing formal backup data.