1
Integrating Student Learning into Program Review
  • Barbara Wright
  • Associate Director, WASC
  • bwright@wascsenior.org

2
Assessment & Program Review: related but different
  • Program review typically emphasizes
  • Inputs, e.g.
  • Mission statement, program goals
  • Faculty, their qualifications
  • Students, enrollment levels, qualifications
  • Library, labs, technology, other resources
  • Financial support

3
Assessment & Program Review: related but different, cont.
  • Program review typically emphasizes
  • Processes, e.g.
  • Faculty governance
  • Curriculum review
  • Planning
  • Follow-up on graduates
  • Budgeting
  • And yes, assessment may be one of these

4
Assessment & Program Review: related but different, cont.
  • Program review typically emphasizes
  • Indirect indicators of student learning and
    academic quality, e.g.
  • Descriptive data
  • Surveys of various constituencies
  • Existence of relationships, e.g. with area
    businesses, professional community
  • Program review has traditionally neglected actual
    student learning outcomes

5
Assessment & Program Review: related but different, cont.
  • PR is typically conceived as
  • Data-gathering
  • Looking at the past 5-8 years
  • Reporting after the fact where the program has
    been
  • Using PR to garner resources or at least
    protect what the program has
  • Projecting needs into the future
  • Expressing quality improvement in terms of
    a case for additional inputs

6
Capacity vs. Educational Effectiveness for Programs
  • Capacity questions: What does the program have
    in the way of inputs, processes, and evidence of
    outputs or outcomes? What does it need, and how
    will it get what it needs?
  • EE questions: How effectively do the inputs and
    processes contribute to desired outcomes? How
    good are the outputs? The student learning?

7
Assessment & Program Review: related but different
  • Assessment is all about
  • Student learning improvement at individual,
    program, and institutional levels
  • Articulation of specific learning goals (as
    opposed to program goals, e.g. We will place 90%
    of graduates in their field.)
  • Gathering of direct, authentic evidence of
    learning (as opposed to indirect evidence,
    descriptive data)

8
Assessment & Program Review: related but different, cont.
  • Assessment is all about
  • Interpretation and use of findings to improve
    learning and thus strengthen programs (as opposed
    to reporting of data to improve inputs)
  • A future orientation: Here's where we are and
    here's where we want to go in student learning
    over the next 3-5 years
  • Understanding the learning problem before
    reaching for a solution

9
Assessment & Program Review: related but different, cont.
  • Assessment of student learning and program review
    are not the same thing. However, there is a place
    for assessment as a necessary and significant
    input in program review. We should look for
  • A well-functioning process
  • Key learning goals
  • Standards for student performance
  • A critical mass of faculty (and students)
    involved
  • Verifiable results, and
  • Institutional support

10
The Assessment Loop
1. Goals, questions
2. Gathering evidence
3. Interpretation
4. Use

11
The Assessment Loop: Capacity Questions
1. Does the program have student learning goals,
questions?
2. Do they have methods, processes for gathering
evidence? Do they have evidence?
3. Do they have a process for systematic,
collective analysis and interpretation of
evidence?
4. Is there a process for use of findings for
improvement? Is there admin. support, planning,
budgeting? Rewards for faculty?

12
The Assessment Loop: Effectiveness Questions
1. How well do they achieve their student
learning goals, answer questions?
2. How aligned are the methods? How effective are
the processes? How complete is the evidence?
3. How well do processes for systematic,
collective analysis and interpretation of
evidence work? What have they found?
4. What is the quality of follow-through on
findings for improvement? Is there improvement?
How adequate, effective are admin. support,
planning, budgeting? Rewards for faculty?

13
Don't confuse program-level assessment and
program review
  • Program-level assessment means we look at
    learning on the program level (not just
    individual student or course level) and ask what
    all the learning experiences of a program add up
    to, at what standard of performance (results).
  • Program review looks for program-level assessment
    of student learning but goes beyond it, examining
    other components of the program (mission,
    faculty, facilities, demand, etc.)

14
What does WASC want? Both!
  • Systematic, periodic program review, including a
    focus on student learning results as well as
    other areas (inputs, processes, products,
    relationships)
  • An improvement-oriented student learning
    assessment process as a routine part of the
    program's functioning

15
Institutionalizing Assessment: 2 aspects
  • The PLAN for assessment (i.e. shared definition
    of the process, purpose, values, vocabulary,
    communication, use of findings)
  • The STRUCTURES and RESOURCES that make the plan
    doable

16
How to institutionalize --
  • Make assessment a freestanding function
  • Attach to an existing function, e.g.
  • Accreditation
  • Academic program review
  • Annual reporting process
  • Center for Teaching Excellence
  • Institutional Research

17
Make assessment freestanding -- Positives and Negatives
Positives
  • Maximum flexibility
  • Minimum threat, upset
  • A way to start
Negatives
  • Little impact
  • Little sustainability
  • Requires formalization eventually, e.g. Office of
    Assessment

18
Attach to Office of Institutional Research -- Positives and Negatives
Positives
  • Strong data gathering and analysis capabilities
  • Responds to external expectations
  • Clear responsibility
  • IR has resources
  • Faculty not burdened
Negatives
  • Perception that assessment = data gathering
  • Faculty see little or no responsibility
  • Faculty uninterested in reports
  • Little or no use of findings

19
Attach to Center for Teaching Excellence -- Positives and Negatives
Positives
  • Strong impact possible
  • Ongoing, supported
  • Direct connection to faculty, classroom, learning
  • Chance for maximum responsiveness to use phase
Negatives
  • Impact depends on how broadly assessment is done
  • No enforcement
  • Little/no reporting, communicating
  • Rewards, recognition vary, may be lip service

20
Attach to annual report -- Positives and Negatives
Positives
  • Some impact (depending on stakes)
  • Ongoing
  • Some compliance
  • Habit, expectation
  • Closer connection to classroom, learning
  • Cause/effect possible
  • Allows flexibility
Negatives
  • Impact depends on how seriously, how well AR is
    done
  • No resources
  • Reporting, not improving, unless specified
  • Chair writes; faculty involvement varies

21
Attach to accreditation -- Positives and Negatives
Positives
  • Maximum motivation
  • Likely compliance
  • Resources available
  • Staff, faculty assigned
  • Clear cause/effect
Negatives
  • Resentment of external pressure
  • Us/them dynamic
  • Episodic, not ongoing
  • Reporting, gaming, not improving
  • Little faculty involvement
  • Little connection to the classroom, learning
  • Main focus: inputs, process

22
Attach to program review -- Positives and Negatives
Positives
  • Some impact (depending on stakes)
  • Some compliance
  • Some resources available
  • Staff, faculty assigned
  • Cause/effect varies
Negatives
  • Impact depends on how seriously, how well PR is
    done
  • Episodic, not ongoing
  • Inputs, not outcomes
  • Reporting, not improving
  • Generally low faculty involvement
  • Anxiety, risk-aversion
  • Weak connection to the classroom, learning

23
How can we deal with the disadvantages?
  • Strong message from administration: PR is
    serious, has consequences (bad and good)
  • Provide attentive, supportive oversight
  • Redesign PR to be continuous
  • Increase weighting of assessment in overall PR
    process
  • Involve more faculty, stay close to classroom,
    program
  • Focus on outcomes, reflection, USE
  • Focus on improvement (not just good news) and
    REWARD IT

24
How can we increase weighting of learning
assessment in PR? E.g.,
From
  • Optional part
  • One small part of total PR process
  • Assessment vague, left to program
  • Various PR elements of equal value (or no value
    indicated)
  • Little faculty involvement
To
  • Required
  • Core of the process (so defined in instructions)
  • Assessment expectations defined
  • Points assigned to PR elements; student learning
    gets 50% or more
  • Broad involvement

25
Assessment serves improvement and accountability
  • A well-functioning assessment effort
    systematically improves curriculum, pedagogy, and
    student learning; this effect is documented.
  • At the same time,
  • The presence of an assessment effort is an
    important input indicator of quality,
  • The report on beneficial effects of assessment
    serves accountability, and
  • Assessment findings support requests

26
New approaches to PR/assessment
  • Create a program portfolio
  • Keep program data continuously updated
  • Do assessment on annual cycle
  • Enter assessment findings, uses, by semester or
    annually
  • For periodic PR, review portfolio and write
    reflective essay on student AND faculty learning