Cultures of Assessment: Promoting Teaching and Learning

Transcript and Presenter's Notes
1
Cultures of Assessment: Promoting Teaching and Learning
  • Washtenaw Community College
  • August 2004
  • Susan Hatfield
  • Winona State University
  • SHatfield@winona.edu

2
Outline
  • Climates and Cultures of Assessment
  • Using Assessment to Improve Teaching and Learning

3
  • Climates and Cultures of Assessment

4
Climate
  • Meteorological metaphor
  • Seasonal
  • Changing
  • Uneven
  • Perpetuated by individuals
  • Something an institution HAS

5
Culture
  • Anthropological Metaphor
  • Deep rooted
  • Defended
  • Perpetuated by structures, policies, procedures,
    behaviors
  • What an organization IS

6
Culture
  • Cultures cannot be consciously created. They can be promoted, but an organization's culture arises from the interaction of a number of variables.

7
Language of Assessment
  • A. General skill or knowledge category: GOAL
  • B. Specific accomplishments to be achieved: STUDENT LEARNING OUTCOME
  • C. Activities and assignments to help students learn: LEARNING EVENTS
  • D. The key elements related to the accomplishment: COMPONENTS

8
Language of Assessment
  • E. The objects of analysis: OBJECTS
  • F. Data indicating degree of achievement: CHARACTERISTICS
  • G. Combination of data indicating relative degree of achievement of the outcome: INDICATORS

9
Elements of Campus Culture
  • Language / Vocabulary
  • Metaphors
  • Legends, stories, mythologies, folklore
  • Symbols
  • Rites and Rituals

10
Elements of Campus Culture
  • Written Materials
  • Formal and Informal Policies and Procedures
  • Organizational Structure
  • Social Knowledge
  • Reward Structure

11
Maturing Assessment (Climate → Culture)
  • BEGINNING: There is minimal evidence that the assessment program is stable and will be sustainable
  • PROGRESS: Assessment findings are beginning to be incorporated into program reviews and the self-study of institutional effectiveness
  • MATURING: Student learning has become central to the institution, and student learning, performance, and achievement are celebrated
12
Maturing Assessment (Climate → Culture)
  • Stages: Beginning → Progress → Maturing
  • Activities, Events, Traditions
  • Systems, Processes
13
Maturing Assessment (Climate → Culture)
  • Activities, Events, Traditions: Tolerated → Anticipated → Celebrated; Isolated → Connected → Integrated
  • Systems, Processes: Periodic → Episodic → Characteristic
14
Distinguishing between Climates and Cultures
15
Climate or Culture?
  • Has the institution established an assessment
    infrastructure that is separate from the HLC
    self-study process?

16
Climate or Culture?
  • Are most non-accredited programs engaged in some
    level of assessment?

17
Climate or Culture?
  • Are there signs that both the institution and academic programs are establishing patterns of evidence?
  • Trends instead of factoids

18
Patterns of Evidence
  • Diagram: evidence accumulating across multiple data points (DP1, DP2, DP3)
19
Climate or Culture?
  • Are there signs that assessment activities are
    continuing to evolve both on the institutional
    and departmental level?

20
Climate or Culture?
  • Is there a plan in place for the continued
    development of the assessment plan?

21
Climate or Culture?
  • Does the institution have an objective understanding of its level of implementation?

22
Evolutionary Trajectories of Assessment Initiatives
23
Evolutionary Trajectories
  • Chart: assessment initiatives plotted by year from 1997 through 2004 against the stages Beginning, Making Progress, and Maturing
24
Maturing Assessment (Beginning → Progress → Maturing)
  • Institutional responsibility → Department responsibility
25
Maturing Assessment (Beginning → Progress → Maturing)
  • Indirect measures → Direct measures
26
Maturing Assessment (Beginning → Progress → Maturing)
  • Process measures → Outcome measures
27
Maturing Assessment (Beginning → Progress → Maturing)
  • Institutional effectiveness → Student learning
28
Maturing Assessment (Beginning → Progress → Maturing)
  • Classroom assessment → Program assessment
29
Maturing Assessment (Climate → Culture)
  • Isolated → Pervasive
  • Occasional → Usual
  • Individual → Structural
  • Surface → Embedded
  • External → Internal
  • Accreditation → Improvement
30
Maturing Assessment (Climate → Culture)
  • LEADERSHIP: Isolated → Pervasive
  • KNOWLEDGE: Occasional → Usual
  • RESOURCES: Individual → Structural
  • Surface → Embedded
  • EXIGENCE: External → Internal
  • COMMITMENT: Accreditation → Improvement
31
Reflection
  • Does WCC have more characteristics of a climate
    or culture of assessment?

32
  • Levels of Assessment

33
Levels of Assessment
Classroom
Program / Discipline
College
34
Levels of Assessment
  • Classroom: Method of Analysis: CATs. Subject of Assessment: Learning Processes. Informs: Teaching Strategy, Short-Term Learning. Audience: Course Instructor.
  • Program / Discipline
  • College
35
Classroom Assessment Techniques
  • One-minute paper
  • Muddiest Point
  • 5 point quiz
  • One sentence summary
  • Invented Dialogues
  • Directed Paraphrase
  • What's the Principle?

36
Classroom Assessment
  • Formative Assessment - valuable measure of
    progress toward learning
  • Short term learning
  • Measures daily or weekly learning goals
  • Allows for analysis of effective and ineffective
    teaching strategies
  • Benefits both teachers and students

37
Classroom Assessment
  • Positively impact student learning outcomes
  • Do not allow for the demonstration of the depth,
    scope, integration, or internalization of
    learning across courses.

38
Levels of Assessment
  • Classroom: Method of Analysis: CATs. Subject of Assessment: Learning Processes. Informs: Teaching Strategy, Short-Term Learning. Audience: Course Instructor.
  • Program / Discipline: Method of Analysis: Program Outcome Measures. Subject of Assessment: Achievement of Program / Discipline Learning Outcomes. Informs: Curriculum, Long-Term Learning. Audience: Faculty, Administration, Accreditation.
  • College
39
Program Assessment
  • Based on stated program level student learning
    outcomes (gen ed, program)
  • Summative - based on learning outcomes
  • Program specific, not faculty specific

40
Program Assessment
  • Requires common criteria for assessing across faculty, courses, and program curricula.

41
Levels of Assessment
  • Classroom: Method of Analysis: CATs. Subject of Assessment: Learning Processes. Informs: Teaching Strategy, Short-Term Learning. Audience: Course Instructor.
  • Program / Discipline: Method of Analysis: Program Outcome Measures. Subject of Assessment: Achievement of Program / Discipline Learning Outcomes. Informs: Curriculum, Long-Term Learning. Audience: Faculty, Administration, Accreditation.
  • College: Method of Analysis: Student Records Data, Surveys. Subject of Assessment: Institutional Effectiveness. Informs: Policy, Procedure, External Perception. Audience: External Constituencies (Legislators, Parents, Accreditors).
42
Using Assessment to Improve Teaching and Learning
43
What Does It Mean If Everyone in the Class Flunked the Exam?
44
Possible Interpretations
  • 1. The students didn't study
  • Motivation
  • Intention and Exertion

45
Motivation
  • Motivation and Resources must exceed the Difficulty of the Task

46
Motivation
  • Motivation
  • Intention
  • What was the student trying to accomplish?
  • Passing the test?
  • Learning the material?

47
Motivation
  • Motivation
  • Exertion
  • What effort was put forth to accomplish the task?
  • Isolated burst of studying?
  • Ongoing systematic review of the material?

48
Exertion without Intention
49
Intention without Exertion
50
Intention and Exertion
51
Possible Interpretations
  • 1. The students didn't study
  • Motivation
  • Intention and Exertion
  • 2. The students didn't have "what it takes" to pass the exam
  • Abilities
  • Knowledge
  • Resources

52
Possible Interpretations
  • 1. The students didn't study
  • Motivation
  • Intention and Exertion
  • 2. The students didn't have "what it takes" to pass the exam
  • Abilities
  • Knowledge
  • Resources
  • 3. The test was too difficult

53
The Difficulty of the Test
  • Was it possible to pass the test?
  • Was there enough time available to students to
    complete the exam?
  • Were the instructions clear?
  • Did the exam cover the material the students
    expected it to cover?
  • Was the cognitive level of the exam appropriate?

54
What Does It Mean If Everyone in the Class Flunked the Exam?
In all three sections of the course?
55
Possible Interpretations
  • 1. The test is too difficult

56
The Difficulty of the Test
  • Was it possible to pass the test?
  • Was there enough time available to students to
    complete the exam?
  • Were the instructions clear?
  • Did the exam cover the material the students
    expected it to cover?
  • Was the cognitive level of the exam appropriate?

57
Possible Interpretations
  • 1. The test is too difficult
  • 2. The teaching methods didn't connect with students

58
Seven Principles for Good Practice
  • Student - Faculty Interaction
  • Active Learning
  • Cooperative Learning
  • Time on Task
  • Prompt Feedback
  • High Expectations
  • Respect for Diverse Talents and Ways of Learning

59
Possible Interpretations
  • 1. The test is too difficult
  • 2. The teaching methods didn't connect with students
  • 3. Students didn't have enough opportunities to practice knowledge / skills

60
Learning Events
  • Assignments
  • Feedback on practice
  • Self evaluation
  • Peer evaluation
  • Role Play
  • Pre Tests
  • Simulation

61
Possible Interpretations
  • 1. The test is too difficult
  • 2. The teaching methods didn't connect with students
  • 3. Students didn't have enough opportunities to practice knowledge / skills
  • 4. The expected learning was not supported by
    the curriculum

62
Course and Program Level Outcomes
  • Program-level learning outcomes need to be
    anchored in individual courses in the curriculum
  • Course syllabi should outline the specific course learning outcomes as well as the program-level learning outcomes supported by the course

63
Student Learning Outcomes
  • Curriculum map: student learning outcomes listed as rows against Course 1 through Course 5 as columns, with an x marking each course in which an outcome is addressed
64
Possible Interpretations
  • 5. The format of the assessment wasn't a good fit with the outcome being assessed
  • Direct Measures

65
Learning Objects
  • Multiple choice test, abstract, advertisement,
    annotated bibliography, biography, briefing,
    brochure, budget, care plan, case analysis,
    chart, cognitive map, court brief, debate,
    definition, description, diagram, dialogue,
    diary, essay, executive summary, exam, flow
    chart, group discussion, instruction manual,
    inventory, lab notes, letter to the editor,
    matching test, mathematical problem, memo, micro
    theme, narrative, news story, notes, oral report,
    outline, performance review, plan, precis,
    presentation, process analysis, proposal,
    regulation, research proposal, review of
    literature, taxonomy, technical report, term
    paper, thesis, word problem, work of art.
    (Walvoord & Anderson, 1998).

66
Assessment may help shift the campus focus from teaching-centered to learning-centered
67
Focus: Teacher-Centered vs. Learner-Centered
  • Knowledge is communicated from professor to student vs. students construct knowledge through gathering and synthesizing information
  • Students passively receive information vs. students are actively involved
  • Emphasis on acquisition of knowledge outside the context in which it will be used vs. emphasis on using and communicating knowledge to address enduring issues
  • Professor's role is to be primary information giver and evaluator vs. professor's role is to coach and facilitate
  • Teaching and assessing are separate vs. teaching and assessing are intertwined
68
Focus: Teacher-Centered vs. Learner-Centered
  • Assessment is used to monitor learning vs. assessment is used to promote and diagnose learning
  • Emphasis is on the right answers vs. emphasis is on generating better questions and learning from errors
  • Desired learning is assessed indirectly vs. desired learning is assessed directly
  • Focus is on a single discipline vs. approach is compatible with interdisciplinary learning
  • Culture is competitive and individualistic vs. culture is collaborative, cooperative, and supportive
69
Reflection
  • What elements of the teacher-centered focus are
    embedded in the culture of the WCC campus?
  • What elements of the learner-centered focus are
    embedded in the culture of the WCC campus?

70
ASSESSMENT RESULTS INFORM
  • Teaching strategy
  • Course offerings
  • Curriculum structure
  • Course syllabi
  • Co-curriculum
  • Support services
71
Assessment Results
  • Consensus
  • Mean results for a population (cohort)
  • Consistency
  • Practice or evidence over time (same student or
    cohort)
  • Distinctiveness
  • Individual student or cohort practice in
    different situations

72
Consensus
  • Comparison to or among groups of students
  • Variation across disciplines, gender, and other demographic variables
  • Key questions
  • Do other people act or perform this way?
  • What is the general feeling, outcome, attitude,
    behavior?

73
Consensus
How are our students performing on the gen ed outcome exam?
  • Chart: performance, from low to high, compared across programs P1 through P6
74
Consensus
How are our students performing on the gen ed outcome exam?
  • Chart: a second example comparing performance, from low to high, across programs P1 through P6
75
Consistency
  • Examines the same practice of an individual or group over time
  • Key question
  • Has this person or group acted or performed this
    way in similar situations previously?

76
Consistency
How are our students performing on the program outcome assessment?
  • Chart: performance, from low to high, tracked by year from 1998 through 2004
77
Consistency
How are our students performing on the program outcome assessment?
  • Chart: a second example tracking performance, from low to high, by year from 1998 through 2004
78
Distinctiveness
  • Examines individual or cohort behavior in
    different situations
  • lower division / upper division
  • small sections / large sections
  • Key Question
  • Does a person or group act distinctively based
    upon the situation?

79
Distinctiveness
How are our students performing on the program outcome measures?
  • Chart: performance, from low to high, compared across outcome areas: tech skills, ethics, literature, theories, methods, writing
80
Distinctiveness
How are our students performing on the program outcome measures?
  • Chart: a second example comparing performance, from low to high, across the same outcome areas
81
Big Mistakes in Assessment
82
Big Mistakes in Assessment
  • Assuming that it will go away
  • Poorly written program level student learning
    outcomes

83
Learning Outcome Rules
  • Use one cognitive level
  • Focus on outcomes, not processes (what, not how)
  • List single accomplishments
  • Do not indicate level of quality ("effective")
  • Allow for multiple learning objects

84
Example 1
  • Gather factual information and apply it to a
    given problem in a manner that is relevant,
    clear, comprehensive, and conscious of possible
    bias in the information selected
  • BETTER: Students will be able to apply factual information to a problem
  • COMPONENTS:
  • Relevance
  • Clarity
  • Comprehensiveness
  • Awareness of bias

85
Example 2
  • Imagine and seek out a variety of possible goals,
    assumptions, interpretations, or perspectives
    which can give alternative meanings or solutions
    to given situations or problems
  • BETTER: Students will be able to provide alternative solutions to situations or problems
  • COMPONENTS:
  • Variety of assumptions, perspectives,
    interpretations
  • Analysis of comparative advantage

86
Example 3
  • Formulate and test hypotheses by performing
    laboratory, simulation, or field experiments in
    at least two of the natural science disciplines
    (one of these experimental components should
    develop, in greater depth, students' laboratory
    experience in the collection of data, its
    statistical and graphical analysis, and an
    appreciation of its sources of error and
    uncertainty)
  • BETTER: Students will be able to test hypotheses
  • COMPONENTS:
  • Data collection
  • Statistical Analysis
  • Graphical Analysis
  • Identification of sources of error

87
Big Mistakes in Assessment
  • Demanding statistical research standards
  • Doing it for accreditation instead of improvement
  • Not sharing common definitions of program level
    student learning outcomes

88
Diagram: individual teachers (teacher1 through teacher5) each linked to different elements of Speaking: volume, eye contact, gestures, sources, transitions, poise, style, rate, examples, verbal variety, conclusion, appearance, evidence, organization, attention getter
89
Can our students deliver an effective public speech?
  • Elements assessed: volume, eye contact, gestures, sources, transitions, poise, style, rate, examples, verbal variety, conclusion, appearance, evidence, organization, attention getter
90
Big Mistakes in Assessment
  • Confusing institutional effectiveness with
    student learning
  • Making assessment the responsibility of one
    individual
  • Assuming collecting data is "doing assessment"

91
Big Mistakes in Assessment
  • Expecting to get it right the first time

92
Cultures of Assessment: Promoting Teaching and Learning
  • Washtenaw Community College
  • August 2004
  • Susan Hatfield
  • Winona State University
  • SHatfield@winona.edu