1
Collaborative Strategies and Practices That
Ground Robust Assessment of Student Learning and
Development
  • Presented by
  • Peggy Maki, Ph.D.
  • at
  • Oregon State University
  • November, 2004

2
Positions of Inquiry into Our Students' Learning
  • Pedagogy
  • Curricular design
  • Instructional design
  • Educational tools

3
  • Educational experiences
  • Students' learning histories/styles
  • Methods to capture learning--assessment

4
Collective Articulation of Learning Outcome
Statements
  • List the kinds of knowledge, abilities, habits
    of mind, ways of knowing, and dispositions that
    you desire your students to demonstrate by the
    time they graduate from OSU

5
What Is a Learning Outcome Statement?
  • Describes learning desired within a context
  • Relies on active verbs (create, compose,
    calculate)
  • Emerges from our collective intentions
    over time

6
  • Can be mapped to curricular and co-curricular
    practices (ample, multiple and varied
    opportunities to learn over time)
  • Can be assessed quantitatively or qualitatively
    during students' undergraduate and graduate
    careers

7
  • Is written for a course, program, or institution

8
Levels of Learning Outcome Statements
9
Distinguishing between Objectives and Outcomes
  • Objectives state overarching expectations such
    as
  • Students will develop effective oral
  • communication skills.
  • OR
  • Students will understand different
  • economic principles.

10
Example from ABET
  • Design and conduct experiments; analyze and
    interpret data

11
Compare
  • Students will write
  • effectively.

to
  • Students will compose a range of
    professional documents designed to solve
    problems for different audiences and purposes.

12
Example from ACRL
  • The information literate student evaluates
    information and its sources critically and
    incorporates selected information into his or
    her knowledge and value system.
  • ONE OUTCOME: The student examines and compares
    information from various sources in order to
    evaluate reliability, validity, accuracy,
    timeliness, and point of view or bias.

13
Ways to Articulate Outcomes
  • Adapt from professional organizations
  • Derive from mission of
    institution/program/department/service
  • Derive from students' work

14
  • Derive from ethnographic process (geography
    example)
  • Derive from exercise focused on listing one or
    two outcomes you attend to

15
Characteristics of a Good Outcome Statement
  • Describes learning desired within a context
  • Relies on active verbs (analyze, create, compose,
    calculate, construct)
  • Emerges from our collective intentions
    over time

16
  • Can be mapped to curricular and co-curricular
    practices (ample, multiple and varied
    opportunities to learn over time)
  • Can be assessed quantitatively or qualitatively
    during students' undergraduate and graduate
    careers

17
Development of Maps and Inventories
  • Reveal how we translate outcomes statements into
    educational practices offering students multiple
    and diverse opportunities to learn
  • Help us to identify appropriate times to assess
    those outcomes
  • Identify gaps in learning or opportunities to
    practice

18
Question: How Will You Use Maps and Inventories?
  • How will you go about the process of developing
    a curricular or curricular-co-curricular map,
    and how might you label people's entries?
  • How might you use inventories?

19
Question: How might students benefit from these
maps?
Upon matriculation? Along the chronology of
their learning? In the advising process? To
foster learning (such as through
self-reflection)?
20
  • Help students understand our expectations of them
  • Place ownership of learning on students
  • Enable them to develop their own maps or learning
    chronologies

21
Development of Assessment Methods
22
  • Every assessment is ... based on a set of
    beliefs about the kinds of tasks or situations
    that will prompt students to say, do, or create
    something that demonstrates important knowledge
    and skills. The tasks to which students are asked
    to respond on an assessment are not arbitrary.
  • National Research Council. Knowing What
    Students Know: The Science and Design of
    Educational Assessment. Washington, D.C.:
    National Academy Press, 2001, p. 47.

23
Assumptions Underlying Teaching → Actual Practices
Assumptions Underlying Assessment Tasks → Actual
Tasks
24
Approaches to Learning
  • Surface Learning
  • Deep Learning

25
When Do You Seek Evidence?
  • Formative: along the way? For example, to
    ascertain progress or development
  • Summative: at the end? For example, to
    ascertain mastery level of achievement

26
Direct Methods
  • Focus on how students represent or demonstrate
    their learning (meaning making)
  • Align with students' learning and assessment
    experiences
  • Align with curricular and co-curricular design
    (verified through mapping)
  • Provide evidence of how students make meaning

27
  • Invite collaboration in design (faculty,
    students, TAs, tutors)

28
Some Options
  • ePortfolios
  • Capstone projects (mid-point and end-point)
  • Performances, productions, creations
  • Visual representations (mind mapping, charting,
    graphing)

29
  • Disciplinary or professional practices
  • Agreed upon embedded assignments
  • Writing to speaking to visual presentation

30
  • Team-based or collaborative projects
  • Internships and service projects
  • Oral examinations
  • Critical incidents

31
  • Externally or internally juried review of student
    projects
  • Externally reviewed internship
  • Performance on a case study/problem
  • Performance on a case study accompanied by the
    student's analysis

32
  • Performance on national licensure examinations
  • Locally developed tests
  • Standardized tests
  • Pre- and post-tests

33
  • Mapping (mind maps or concept maps)
  • Learning Logs or Journals
  • Self-reflection

34
Development of Standards and Criteria of Judgment
  • A set of criteria that identifies the expected
    characteristics of a text and the levels of
    achievement along those characteristics. Scoring
    rubrics are criterion-referenced, providing a
    means to assess the multiple dimensions of
    student learning.
  • Are collaboratively designed based on how and
    what students learn (based on
    curricular-co-curricular coherence)

35
  • Are aligned with ways in which students have
    received feedback (students' learning histories)
  • Students use them to develop work and to
    understand how their work meets standards (can
    provide a running record of achievement).

36
  • Raters use them to derive patterns of student
    achievement to identify strengths and weaknesses

37
Interpretation through Scoring Rubrics
  • Criteria descriptors (ways of thinking, knowing
    or behaving represented in work)
  • Creativity
  • Self-reflection
  • Originality
  • Integration
  • Analysis
  • Disciplinary logic

38
  • Criteria descriptors (traits of the performance,
    work, text)
  • Coherence
  • Accuracy or precision
  • Clarity
  • Structure

39
  • Performance descriptors (describe how well
    students execute each criterion or trait along
    a continuum of score levels)
  • Exemplary / Commendable / Satisfactory /
    Unsatisfactory
  • Excellent / Good / Needs Improvement /
    Unacceptable
  • Expert / Practitioner / Apprentice / Novice
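The criterion and performance descriptors above can be combined into a machine-readable rubric. A minimal sketch in Python, assuming a 1-4 scale matching the Exemplary-to-Unsatisfactory continuum; the criteria and descriptor wording below are illustrative inventions, not drawn from any actual rubric:

```python
# A scoring rubric as data: each criterion carries a descriptor
# at each performance level (wording is illustrative only).
RUBRIC = {
    "coherence": {
        4: "Exemplary: ideas ordered and linked throughout",
        3: "Commendable: mostly ordered; occasional weak links",
        2: "Satisfactory: order evident but links often missing",
        1: "Unsatisfactory: no discernible organization",
    },
    "accuracy": {
        4: "Exemplary: all claims and calculations correct",
        3: "Commendable: minor errors that do not mislead",
        2: "Satisfactory: some errors that weaken the work",
        1: "Unsatisfactory: frequent or serious errors",
    },
}

def score_report(scores):
    """Map a student's per-criterion scores to their descriptors."""
    return {c: RUBRIC[c][s] for c, s in scores.items()}

print(score_report({"coherence": 3, "accuracy": 4}))
```

Holding the rubric as data, rather than prose alone, makes the later steps (pilot testing, scoring, pattern analysis) easier to automate.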

40
Pilot-testing the Scoring Rubric
  • Apply to student work to assure you have
    identified all the dimensions with no overlap
  • Schedule inter-rater reliability times
    - independent scoring
    - comparison of scoring
    - reconciliation of responses
    - repeat cycle
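The independent-scoring and comparison steps lend themselves to a quick agreement check. A sketch in Python, assuming two raters have scored the same set of papers on a 1-4 scale (the scores below are invented for illustration); Cohen's kappa corrects raw agreement for chance:

```python
from collections import Counter

def percent_agreement(a, b):
    """Share of items on which two raters gave the same score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for chance (Cohen's kappa)."""
    n = len(a)
    po = percent_agreement(a, b)           # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)  # chance
    return (po - pe) / (1 - pe)

# Illustrative scores on a 1-4 rubric (1 = Unsatisfactory ... 4 = Exemplary)
rater1 = [4, 3, 3, 2, 4, 1, 2, 3]
rater2 = [4, 3, 2, 2, 4, 1, 3, 3]

print(percent_agreement(rater1, rater2))       # 0.75
print(round(cohens_kappa(rater1, rater2), 3))  # 0.652
```

Low kappa on a pilot pass signals that criteria overlap or performance descriptors are ambiguous; reconcile, revise the rubric, and repeat the cycle.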

41
Collaborative Interpretation
  • Disciplinary work groups
  • Cross-disciplinary work groups
  • Formal opportunities to share program-level
    findings at the institution level; opportunities
    to share institution-level findings at the
    program level

42
  • Seek patterns against criteria and cohorts
  • Build in institution-level and program-level
    discourse
  • Tell the story that explains the results;
    triangulate with other data
  • Be able to aggregate and disaggregate data
    to guide focused interpretation
  • Determine what you wish to change or revise,
    or how you want to innovate
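Aggregating and disaggregating rubric results can be sketched with plain Python, assuming each record carries a cohort label, a criterion, and a score; the cohorts, criteria, and scores below are hypothetical:

```python
from statistics import mean
from collections import defaultdict

# Hypothetical rubric scores: (cohort, criterion, score on a 1-4 scale)
records = [
    ("transfer", "analysis", 3), ("transfer", "coherence", 2),
    ("native",   "analysis", 4), ("native",   "coherence", 3),
    ("transfer", "analysis", 2), ("native",   "analysis", 3),
]

def aggregate(records, *keys):
    """Mean score grouped by the named fields (cohort, criterion, or both)."""
    fields = {"cohort": 0, "criterion": 1}
    groups = defaultdict(list)
    for rec in records:
        groups[tuple(rec[fields[k]] for k in keys)].append(rec[2])
    return {k: mean(v) for k, v in groups.items()}

# Aggregate: overall pattern per criterion (analysis 3, coherence 2.5)
print(aggregate(records, "criterion"))
# Disaggregate: the same criteria split by cohort
print(aggregate(records, "cohort", "criterion"))
```

Disaggregation is what surfaces cohort-specific gaps (here, the invented transfer cohort lags on analysis) that an institution-wide mean would hide.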

43
Examples of Changes
  • Increased attention to weaving experiences across
    the institution, a program, or a department to
    improve student achievement
  • Changes in advising based on assessment results
  • Closer monitoring of student achievement--tracking

44
  • Faculty and staff development to learn how to
    integrate experiences that contribute to improved
    student learning
  • Changes in pedagogy and curricular and
    co-curricular design
  • Development of modules to assist learning:
    use of technology, self-paced learning,
    supplemental learning

45
The assessment cycle: Mission/Purposes →
Learning Outcomes → Gather Evidence →
Interpret Evidence ("How well do we achieve our
outcomes?") → Enhance teaching/learning and
institutional planning/budgeting