Title: Strategies for Course Redesign Evaluation
1. Strategies for Course Redesign Evaluation
- Laura M. Stapleton
- Human Development and Quantitative Methodology
- University of Maryland, College Park
- Lstaplet_at_umd.edu
2. Presentation Outline
- Goal of evaluation
- A proposed framework for evaluation
- Experimental design considerations
- Examples of what (and what not) to do
- Summary and recommendations
3. Goal of Evaluation
- "To provide useful information for judging decision alternatives, assisting an audience to judge and improve the worth of some educational program, and assisting the improvement of policies and programs" (Stufflebeam, 1983)
4. Frameworks
- Summative-judgment orientation (Scriven, 1983)
  OR
- Improvement orientation (Stufflebeam, 1983)
- "The most important purpose of program evaluation is not to prove but to improve."
5. Proposed Framework
- Stufflebeam's CIPP framework for program evaluation:
  - Context
  - Inputs
  - Process
  - Product
- Evaluation can encompass any or every one of these aspects
6. Context Evaluation
- What needs are addressed, how pervasive and important are they, and to what extent are the project's objectives reflective of assessed needs?
- What course is undergoing redesign?
- Why is it targeted for redesign?
- Are these reasons sufficient for the resource/time expenditure that redesign would require?
- Are there components of the traditional course that are good/important to keep?
- Does redesign offer potential benefits?
7. Input Evaluation
- What procedural plan was adopted to address the needs, and to what extent was it a reasonable, potentially successful, and cost-effective response to the assessed needs?
- Why were the specific redesign components selected?
- What else might have worked just as well?
- What are the costs of the chosen approach versus the costs of others (to all stakeholders)?
8. Process Evaluation
- To what extent was the project plan implemented, and how and for what reasons did it have to be modified?
- Was each part of the plan in place?
- Did the components operate as expected?
- Did the expected behavioral change occur?
- How can implementation efforts be improved?
9. Product Evaluation
- What results were observed, how did the various stakeholders judge the worth and merit of the outcomes, and to what extent were the needs of the target population met?
- How did outcomes compare to past/traditional delivery?
- What were stakeholders' opinions of the change?
- Were the cost/benefit advantages realized?
- Were there unintended consequences?
10. Proposed Framework
- Stufflebeam's CIPP framework for program evaluation:
  - Context
  - Inputs
  - Process
  - Product
11. Process Evaluation Strategies
- Observations
- Review of extant process data
- Focus groups
- Informal or formal feedback
12. Product Evaluation Strategies
- Qualitative review of judgments of stakeholders
- Quantitative comparison of measured outcomes
- Causal conclusions regarding quantitative outcomes depend on the design (Campbell & Stanley, 1963)
13. Study Design
Pre-test post-test control group design: a single sample is randomly assigned to two groups.
- Group A: pre-test → re-designed instruction → post-test
- Group B: pre-test → traditional instruction → post-test
With this design, you have strong support for a causal statement.
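A minimal analysis sketch for this design is shown below. It assumes a pandas data frame with hypothetical columns `group`, `pretest`, and `posttest` (one row per student) and uses simulated placeholder data; an ANCOVA-style regression of the post-test on group membership, adjusting for the pre-test, is one common way to estimate the redesign effect from a randomized pre-test/post-test design.

```python
# Sketch: estimating the redesign effect from a randomized
# pre-test/post-test control group design (assumed column names,
# simulated placeholder data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated stand-in for real course data: students randomly
# assigned to the redesigned or traditional section.
n = 200
pretest = rng.normal(70, 10, n)
group = rng.choice(["redesign", "traditional"], n)
effect = np.where(group == "redesign", 3.0, 0.0)  # hypothetical 3-point gain
posttest = 10 + 0.85 * pretest + effect + rng.normal(0, 5, n)
df = pd.DataFrame({"group": group, "pretest": pretest, "posttest": posttest})

# ANCOVA-style model: post-test on group, adjusting for the pre-test.
# With random assignment, the group coefficient estimates the redesign effect.
model = smf.ols("posttest ~ pretest + C(group, Treatment('traditional'))",
                data=df).fit()
print(model.summary())
```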
14. Study Design
Post-test only control group design: a single sample is randomly assigned to two groups.
- Group A: re-designed instruction → post-test
- Group B: traditional instruction → post-test
With this design, you have support for a causal statement, assuming students do not drop out differentially.
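With no pre-test, the comparison can be as simple as a two-sample test of post-test means across the randomly formed groups. The sketch below uses hypothetical, simulated score arrays as placeholders for real section data.

```python
# Sketch: post-test-only control group design analyzed with a
# two-sample comparison (simulated placeholder scores).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
scores_redesign = rng.normal(78, 8, 120)     # placeholder post-test scores
scores_traditional = rng.normal(75, 8, 115)

# Welch's t-test: does not assume equal variances across sections.
t_stat, p_value = stats.ttest_ind(scores_redesign, scores_traditional,
                                  equal_var=False)
mean_diff = scores_redesign.mean() - scores_traditional.mean()
print(f"Mean difference (redesign - traditional): {mean_diff:.2f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```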
15. Study Design
Non-equivalent control group design: two intact (non-randomized) groups are compared.
- Group A: pre-test → re-designed instruction → post-test
- Group B: pre-test → traditional instruction → post-test
With this design, initial differences in the groups may explain differences (or lack of differences) in the outcomes.
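To make that caveat concrete, the sketch below simulates a case in which the redesigned section happens to enroll stronger students at baseline (all names and numbers are illustrative assumptions). The unadjusted post-test comparison mixes the redesign effect with the baseline gap; adjusting for the pre-test removes the part of that gap the pre-test captures, but cannot rule out selection on characteristics it misses.

```python
# Sketch: non-equivalent control group design, where the sections
# differ at baseline (simulated data, assumed column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 150

# Group A (redesign) happens to enroll stronger students at baseline.
pre_a = rng.normal(74, 8, n)
pre_b = rng.normal(68, 8, n)
true_effect = 2.0                               # hypothetical redesign gain
post_a = 12 + 0.8 * pre_a + true_effect + rng.normal(0, 5, n)
post_b = 12 + 0.8 * pre_b + rng.normal(0, 5, n)

df = pd.DataFrame({
    "group": ["redesign"] * n + ["traditional"] * n,
    "pretest": np.concatenate([pre_a, pre_b]),
    "posttest": np.concatenate([post_a, post_b]),
})

# Unadjusted comparison: the redesign effect is confounded with the baseline gap.
naive = df.groupby("group")["posttest"].mean()
print("Naive difference:", naive["redesign"] - naive["traditional"])

# Pre-test adjustment removes baseline differences the pre-test captures,
# but cannot rule out selection on characteristics the pre-test misses.
adj = smf.ols("posttest ~ pretest + C(group, Treatment('traditional'))",
              data=df).fit()
print(adj.params)
```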
16. Study Design
Static group comparison (post-test only with non-equivalent groups).
- Group A: re-designed instruction → post-test
- Group B: traditional instruction → post-test
With this design, initial differences in the groups may explain differences (or lack of differences) in the outcomes.
17. Examples of what to do (and not do)
- UMBC Psychology Course Redesign
- Context: Low pass rates in the PSYC100 course; student course evaluations were brutal
- Input: Delivery of PSYC100 course content was altered
  - Material on the web: self-paced labs and quizzes
  - Dyads within the lecture hall with peer facilitators
  - Lectures were more discussion-based, including video and clicker questions
  - Less time in lecture, more on self-paced online work
18. Examples of what to do (and not do)
- Process Evaluation (of redesign pilot)
  - Lab utilization statistics
    - Time lab completed (relative to exam and speed)
    - Number of times quiz attempted
  - Qualitative reaction from redesign section instructors
  - Focus group comments from students
  - Reactions from small groups in lectures
    - What was working
    - What was not working
    - What changes would be helpful
19. Examples of what to do (and not do)
- Product Evaluation (of redesign pilot)
- Post-test Only Static Group Comparison
  - Grade Distribution
    - Redesign
    - Traditional (same semester)
    - Traditional (historical)
  - Common Exam
    - Redesign
    - Traditional (same semester)
  - Student Course Evaluations
    - Redesign
    - Traditional (same semester)?
20. Summary Suggestions
- Identify a fairly independent evaluator now
- Determine what type of evaluation you need to undertake (which CIPP stage(s)?)
- Identify components of each (remember unintended consequences)
- Make it happen when it needs to happen!
- Be creative in considering sources of data
- Be flexible enough to change your evaluation plan mid-stream
- Think long-term as well as short-term
21. References
- Campbell, D. T., & Stanley, J. C. (1963). Experimental and Quasi-Experimental Designs for Research. Chicago, IL: Rand McNally & Company.
- Scriven, M. (1983). Evaluation ideologies. In Madaus, G. F., Scriven, M., & Stufflebeam, D. L. (Eds.), Evaluation Models (pp. 229-260). Hingham, MA: Kluwer Academic Publishers.
- Stufflebeam, D. L. (1983). The CIPP model for program evaluation. In Madaus, G. F., Scriven, M., & Stufflebeam, D. L. (Eds.), Evaluation Models (pp. 117-142). Hingham, MA: Kluwer Academic Publishers.
22. Thank you! Contact for info: Lstaplet_at_umd.edu