Strategies for Course Redesign Evaluation

Transcript and Presenter's Notes

1
Strategies for Course Redesign Evaluation
  • Laura M. Stapleton
  • Human Development and Quantitative Methodology
  • University of Maryland, College Park
  • Lstaplet@umd.edu

2
Presentation Outline
  • Goal of evaluation
  • A proposed framework for evaluation
  • Experimental design considerations
  • Examples of what (and what not) to do
  • Summary and recommendations

3
Goal of Evaluation
  • to provide useful information for judging
    decision alternatives, assisting an audience to
    judge and improve the worth of some educational
    program, and assisting the improvement of
    policies and programs
  • (Stufflebeam, 1983)

4
Frameworks
  • Summative-judgment orientation (Scriven, 1983)
  • OR
  • Improvement orientation (Stufflebeam, 1983)

The most important purpose of program evaluation
is not to prove but to improve
5
Proposed Framework
  • Stufflebeam's CIPP framework for program
    evaluation
  • Context
  • Inputs
  • Process
  • Product
  • Evaluation can encompass any or every one of
    these aspects

6
Context Evaluation
  • What needs are addressed, how pervasive and
    important are they, and to what extent are the
    project's objectives reflective of assessed
    needs?
  • What course is undergoing redesign?
  • Why is it targeted for redesign?
  • Are these reasons sufficient for the
    resource/time expenditure that redesign would
    require?
  • Are there components of the traditional course
    that are good/important to keep?
  • Does the redesign offer potential benefits?

7
Input Evaluation
  • What procedural plan was adopted to address the
    needs and to what extent was it a reasonable,
    potentially successful, and cost-effective
    response to the assessed needs?
  • Why were the specific redesign components
    selected?
  • What else might have worked just as well?
  • What are the costs of the chosen approach versus
    the costs of others (to all stakeholders)?

8
Process Evaluation
  • To what extent was the project plan implemented,
    and how, and for what reasons, did it have to be
    modified?
  • Was each part of the plan in place?
  • Did the components operate as expected?
  • Did the expected behavioral change occur?
  • How can implementation efforts be improved?

9
Product Evaluation
  • What results were observed, how did the various
    stakeholders judge the worth and merit of the
    outcomes, and to what extent were the needs of
    the target population met?
  • How did outcomes compare to past/traditional
    delivery?
  • What were stakeholders' opinions of the change?
  • Were the cost/benefit advantages realized?
  • Were there unintended consequences?

10
Proposed Framework
  • Stufflebeam's CIPP framework for program
    evaluation
  • Context
  • Inputs
  • Process
  • Product

11
Process Evaluation Strategies
  • Observations
  • Review of extant process data
  • Focus groups
  • Informal or formal feedback

12
Product Evaluation Strategies
  • Qualitative review of judgments of stakeholders
  • Quantitative comparison of measured outcomes
  • Causal conclusions regarding quantitative
    outcomes depend on design (Campbell & Stanley,
    1963)

13
Study Design
Pre-test/post-test control group design (groups
randomly assigned from the sample)
  • Group A: pre-test → re-designed instruction → post-test
  • Group B: pre-test → traditional instruction → post-test
  • With this design, you have strong support for a
    causal statement (see the analysis sketch below)
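As an illustration, a minimal Python sketch of one way to analyze this design: regress the post-test on the pre-test plus a group indicator (ANCOVA). The data, effect size, and column names below are simulated and hypothetical, not from the PSYC100 study.

  import numpy as np
  import pandas as pd
  import statsmodels.formula.api as smf

  rng = np.random.default_rng(0)
  n = 100
  pre = rng.normal(70, 10, size=2 * n)          # simulated pre-test scores
  redesign = np.repeat([1, 0], n)               # 1 = re-designed section (random assignment)
  post = 10 + 0.8 * pre + 3.0 * redesign + rng.normal(0, 5, size=2 * n)  # assumed +3 point effect

  df = pd.DataFrame({"pre": pre, "post": post, "redesign": redesign})
  model = smf.ols("post ~ pre + redesign", data=df).fit()
  print(model.params["redesign"], model.pvalues["redesign"])  # estimated effect, adjusted for pre-test

Because assignment is random, the coefficient on the group indicator can be read as the effect of the redesign.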
14
Study Design
Post-test only control group design (groups
randomly assigned from the sample)
  • Group A: re-designed instruction → post-test
  • Group B: traditional instruction → post-test
  • With this design, you have support for a causal
    statement, assuming students do not drop out
    differentially (a simple group comparison is
    sketched below)
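An illustrative Python sketch of the corresponding analysis: with random assignment and no pre-test, a simple two-group comparison of post-test scores is enough. The scores below are simulated.

  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(1)
  redesign = rng.normal(78, 8, size=90)         # simulated post-test scores, re-designed section
  traditional = rng.normal(75, 8, size=95)      # simulated post-test scores, traditional section

  t, p = stats.ttest_ind(redesign, traditional, equal_var=False)  # Welch's t-test
  print(redesign.mean() - traditional.mean(), t, p)

Differential drop-out would undo the equivalence that random assignment created, which is why the caveat above matters.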
15
Study Design
Non-equivalent control group design (no random
assignment)
  • Group A: pre-test → re-designed instruction → post-test
  • Group B: pre-test → traditional instruction → post-test
  • With this design, initial differences in the
    groups may explain differences (or lack of
    differences) in the outcomes (a pre-test
    adjustment sketch follows)
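An illustrative Python sketch (simulated data, hypothetical variable names) of the usual partial fix: adjust for the pre-test. This controls only for measured differences; selection on unmeasured variables can still bias the comparison.

  import numpy as np
  import pandas as pd
  import statsmodels.formula.api as smf

  rng = np.random.default_rng(2)
  ability = rng.normal(0, 1, size=200)                               # unmeasured student ability
  redesign = (ability + rng.normal(0, 1, size=200) > 0).astype(int)  # self-selection into the redesign
  pre = 70 + 5 * ability + rng.normal(0, 3, size=200)
  post = 72 + 5 * ability + rng.normal(0, 3, size=200)               # no true redesign effect built in

  df = pd.DataFrame({"pre": pre, "post": post, "redesign": redesign})
  unadjusted = smf.ols("post ~ redesign", data=df).fit()
  adjusted = smf.ols("post ~ pre + redesign", data=df).fit()
  print(unadjusted.params["redesign"], adjusted.params["redesign"])  # spurious gap shrinks after adjustment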
16
Study Design
Static group comparison (post-test only with
non-equivalent groups)
  • Group A: re-designed instruction → post-test
  • Group B: traditional instruction → post-test
  • With this design, initial differences in the
    groups may explain differences (or lack of
    differences) in the outcomes
17
Examples of what to do (and not do)
  • UMBC Psychology Course Redesign
  • Context: low pass rates in the PSYC100 course;
    student course evaluations were brutal
  • Input: delivery of PSYC100 course content was
    altered
  • Material on the web: self-paced labs and quizzes
  • Dyads within the lecture hall with peer
    facilitators
  • Lectures were more discussion-based, including
    video and clicker questions
  • Less time in lecture, more on self-paced online
    work

18
Examples of what to do (and not do)
  • Process Evaluation (of redesign pilot)
  • Lab utilization statistics (see the summary
    sketch below)
  • Time each lab was completed (relative to the
    exam) and speed of completion
  • Number of times each quiz was attempted
  • Qualitative reaction from redesign section
    instructors
  • Focus group comments from students
  • Reactions from small groups in lectures
  • What was working
  • What was not working
  • What changes would be helpful
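An illustrative Python sketch of summarizing extant process data such as lab utilization; the log format and column names are assumptions, not the actual PSYC100 system export.

  import pandas as pd

  # hypothetical export from the on-line lab/quiz system
  logs = pd.DataFrame({
      "student_id": [1, 1, 2, 2, 3],
      "lab": ["lab1", "lab2", "lab1", "lab2", "lab1"],
      "quiz_attempts": [1, 2, 1, 3, 1],
      "days_before_exam_completed": [5, 2, 1, 0, 7],
  })

  summary = (logs.groupby("lab")
                 .agg(students=("student_id", "nunique"),
                      mean_quiz_attempts=("quiz_attempts", "mean"),
                      median_days_before_exam=("days_before_exam_completed", "median")))
  print(summary)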

19
Examples of what to do (and not do)
  • Product Evaluation (of redesign pilot)
  • Post-test only / static group comparison
  • Grade distribution: redesign vs. traditional
    (same semester) vs. traditional (historical)
    (a comparison sketch follows this slide)
  • Common exam: redesign vs. traditional (same
    semester)
  • Student course evaluations: redesign vs.
    traditional (same semester)

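An illustrative Python sketch of the grade-distribution comparison: a chi-square test of homogeneity across sections. The counts are made up; with non-equivalent groups a difference does not by itself establish a causal effect.

  import numpy as np
  from scipy.stats import chi2_contingency

  # columns: counts of A, B, C, D, F grades
  counts = np.array([
      [45, 60, 40, 15, 10],   # redesign section (hypothetical counts)
      [35, 55, 50, 25, 20],   # traditional section, same semester (hypothetical counts)
  ])

  chi2, p, dof, expected = chi2_contingency(counts)
  print(chi2, dof, p)   # small p suggests the grade distributions differ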
20
Summary and Suggestions
  • Identify a fairly independent evaluator now
  • Determine what type of evaluation you need to
    undertake: which CIPP stage(s)?
  • Identify components of each (remember unintended
    consequences)
  • Make it happen when it needs to happen!
  • Be creative in considering sources of data
  • Be flexible about changing your evaluation plan
    mid-stream
  • Think long-term as well as short-term

21
References
  • Campbell, D. T., & Stanley, J. C. (1963).
    Experimental and Quasi-Experimental Designs for
    Research. Chicago, IL: Rand McNally & Company.
  • Scriven, M. (1983). Evaluation ideologies. In
    Madaus, G. F., Scriven, M., & Stufflebeam, D. L.
    (Eds.), Evaluation Models (pp. 229-260). Hingham,
    MA: Kluwer Academic Publishers.
  • Stufflebeam, D. L. (1983). The CIPP model for
    program evaluation. In Madaus, G. F., Scriven, M.,
    & Stufflebeam, D. L. (Eds.), Evaluation Models
    (pp. 117-142). Hingham, MA: Kluwer Academic
    Publishers.

22
Thank you! Contact for info: Lstaplet@umd.edu