Assessment 201--Beyond the Basics: Assessing Program Level Assessment Plans

Transcript and Presenter's Notes

1
Assessment 201--Beyond the Basics
Assessing Program Level Assessment Plans
  • University of North Carolina
  • Wilmington
  • Susan Hatfield
  • Winona State University
  • Shatfield@winona.edu

2
Outline
  • Language of Assessment
  • Assessing Assessment Plans
  • Mistakes to Avoid

3
Language of Assessment
4
Language of Assessment
  • A. General skill or knowledge category
  • GOAL
  • B. Specific accomplishments to be achieved
    OUTCOME
  • C. Activities and Assignments to help students
    learn LEARNING EVENTS
  • D. The key elements related to the accomplishment
    of the outcome COMPONENTS

5
Language of Assessment
  • E. The objects of analysis OBJECTS
  • F. Data indicating degree of achievement
    CHARACTERISTICS
  • G. Combination of data indicating relative
    degree of achievement of the learning outcome
    INDICATORS

6
Goals
7
Goals
  • Organizing Principle
  • Category or Topic Area
  • Subjects

8
Learning Outcomes
9
Student Learning Outcomes
[Diagram: one goal branching into five outcomes.]
10
Learning Events
11
Learning Events
  • Assignments (in class and out of class)
  • Feedback on practice
  • Self evaluation
  • Peer evaluation
  • Role Play
  • Pre Tests
  • Simulation

12
Learning Objects
13
Student Learning Outcomes
[Diagram: a goal branching into outcomes; learning events generate the object to be assessed.]
14
Components
15
Student Learning Outcomes
[Diagram: a goal branching into outcomes; the object is evaluated on several components, the outcome's evaluative elements.]
16
Performance Characteristics
17
Student Learning Outcomes
[Diagram: a goal branching into outcomes; each component of the object is rated against performance characteristics, the evaluative criteria.]
18
Indicators
19
Student Learning Outcomes
[Diagram: a goal branching into outcomes; component ratings combine into an indicator of the degree to which the outcome is achieved.]
20
Assessing Program Level Assessment Plans
21
Overall Questions
22
Overall Questions
  • Why are we doing assessment?
  • Who is responsible for assessment in your
    program?

23
Assessing Program Level Assessment Plans
  • Student Learning Outcomes
  • Assessment Methods
  • Implementation Strategy
  • Data Interpretation

24
Student Learning Outcomes
25
Assessing StudentLearning Outcomes
  • Has the program identified program level student
    learning outcomes?

26
Assessing Student Learning Outcomes
  • Reasonable number?
  • Tied to university mission / goals?
  • Appropriate format?
  • Do faculty and students understand what the
    outcomes mean? (components)
  • Is there a common language for describing student
    performance? (characteristics)

27
Student Learning Outcomes
  • Supported by core courses in the curriculum?
  • Are the student level learning outcomes developed
    throughout the curriculum?

28
Assessing Student Learning Outcomes
  • Reasonable number?
  • Tied to university mission / goals?
  • Appropriate format?
  • Do faculty and students understand what the
    outcomes mean? (components)
  • Is there a common language for describing student
    performance? (characteristics)

29
[Diagram: sample goals (Professional Skills, Problem Solving, Communication, Research Methods) with sample outcomes (speak, write, cooperate, relate).]
30
Assessing Student Learning Outcomes
  • Reasonable number?
  • Tied to university mission / goals?
  • Appropriate format?
  • Do faculty and students understand what the
    outcomes mean? (components)
  • Is there a common language for describing student
    performance? (characteristics)

31
UNC-W Mission
  • Intellectual curiosity
  • Imagination
  • Critical thinking
  • Thoughtful expression
  • Diversity
  • International perspectives
  • Service

32
Assessing Student Learning Outcomes
  • Reasonable number?
  • Tied to university mission / goals?
  • Appropriate format?
  • Do faculty and students understand what the
    outcomes mean? (components)
  • Is there a common language for describing student
    performance? (characteristics)

33
Student Learning Outcomes
  • Students should be able to
  • <<action verb>> <<something>>

34
Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate

Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Identify, Inspect, Inventory, Question, Separate, Summarize, Test

Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write

Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate

Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline

Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write
35
Student Learning Outcomes
  • Learner Centered
  • Specific
  • Action oriented
  • Cognitively appropriate for the program level

36
Student Learning Outcomes
  • Students should be able to critically comprehend,
    interpret, and evaluate written, visual, and
    aural material.

37
Student Learning Outcomes
  • Students will recognize, analyze, and interpret
    human experience in terms of personal,
    intellectual, and social contexts.

38
Student Learning Outcomes
  • Students will demonstrate analytical, creative,
    and evaluative thinking in the analysis of
    theoretical or practical issues.

39
Student Learning Outcomes
  • Learner Centered
  • Specific
  • Action oriented
  • Cognitively appropriate for the program level

40
KNOWLEDGE
Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline
COMPREHENSION
Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate
APPLICATION
Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write
ANALYSIS
Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Identify, Inspect, Inventory, Question, Separate, Summarize, Test
SYNTHESIS
Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write
EVALUATION
Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate
Lower division course outcomes
41
KNOWLEDGE
Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline
COMPREHENSION
Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate
APPLICATION
Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write
ANALYSIS
Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Identify, Inspect, Inventory, Question, Separate, Summarize, Test
SYNTHESIS
Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write
EVALUATION
Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate
Upper division Course / Program outcomes
42
Assessing Student Learning Outcomes
  • Reasonable number?
  • Tied to university mission / goals?
  • Appropriate format?
  • Do faculty and students understand what the
    outcomes mean? (components)
  • Is there a common language for describing student
    performance? (characteristics)

43
Components
  • Define student learning outcomes
  • Provide a common language for describing student
    learning
  • Must be outcome specific
  • Must be shared across faculty
  • Number of components will vary by outcome

44
Communication
[Diagram: the Communication goal branches into outcomes (Relate, Speak, Listen, Participate, Write); each outcome has its own set of components.]
45
Components
[Diagram: a goal branching into outcomes; one outcome is broken into components, its evaluative elements.]
46
Components
[Diagram: the Communication goal's outcomes (Speak, Relate, Listen, Participate, Write); a lab report serves as the object, evaluated on the components mechanics, style, and organization.]
47
The Reality of Assessing Student Learning Outcomes
  • Why you need common components

48
[Diagram: five courses all assess Speaking, but each instructor uses different criteria: volume, eye contact, gestures, sources, transitions, poise, style, rate, examples, verbal variety, organization, appearance, evidence, attention getter.]
49
Can our students deliver an effective public speech?
[Diagram: the same scattered criteria: volume, eye contact, gestures, sources, transitions, poise, style, rate, examples, verbal variety, conclusion, appearance, evidence, organization, attention getter.]
50
a little quiz
51
Example 1
  • Gather factual information and apply it to a
    given problem in a manner that is relevant,
    clear, comprehensive, and conscious of possible
    bias in the information selected
  • BETTER: Students will be able to apply factual
    information to a problem
  • COMPONENTS:
  • Relevance
  • Clarity
  • Comprehensiveness
  • Awareness of bias

52
Example 2
  • Imagine and seek out a variety of possible goals,
    assumptions, interpretations, or perspectives
    which can give alternative meanings or solutions
    to given situations or problems
  • BETTER: Students will be able to provide
    alternative solutions to situations or problems
  • COMPONENTS:
  • Variety of assumptions, perspectives,
    interpretations
  • Analysis of comparative advantage

53
Example 3
  • Formulate and test hypotheses by performing
    laboratory, simulation, or field experiments in
    at least two of the natural science disciplines
    (one of these experimental components should
    develop, in greater depth, students' laboratory
    experience in the collection of data, its
    statistical and graphical analysis, and an
    appreciation of its sources of error and
    uncertainty)
  • BETTER: Students will be able to test
    hypotheses.
  • COMPONENTS:
  • Data collection
  • Statistical analysis
  • Graphical analysis
  • Identification of sources of error

54
Assessing Student Learning Outcomes
  • Reasonable number?
  • Tied to university mission / goals?
  • Appropriate format?
  • Do faculty and students understand what the
    outcomes mean? (components)
  • Is there a common language for describing student
    performance? (characteristics)

55
Performance Characteristics
  • Scale or description for assessing each of the
    components
  • Two to Five-point scales for each component
  • Anchored with descriptions and supported by
    examples

56
Performance Rubric
[Diagram: a rubric grid with components as rows and performance characteristics as columns: Does not meet Expectations, Meets Expectations, Exceeds Expectations.]
57
Communication
Outcome: Speak in public situations
Components: Verbal Delivery, Nonverbal Delivery, Structure, Evidence
58
Performance Characteristics
  • Level or degree
  • Accurate, Correct
  • Depth, Detail
  • Coherence, Flow
  • Complete, Thorough
  • Integration
  • Creative, Inventive
  • Evidence based, supported
  • Engaging, enhancing

59
Performance Characteristics: Description Anchors
  • Missing - Included
  • Inappropriate - Appropriate
  • Incomplete - Complete
  • Incorrect - Partially Correct - Correct
  • Vague - Emergent - Clear
  • Marginal - Acceptable - Exemplary
  • Distracting - Neutral - Enhancing
  • Usual - Unexpected - Imaginative
  • Ordinary - Interesting - Challenging

60
Performance Characteristics: Description Anchors
  • Simple - More fully developed - Complex
  • Reports - Interprets - Analyzes
  • Basic - Expected - Advanced
  • Few - Some - Several - Many
  • Isolated - Related - Connected - Integrated
  • Less than satisfactory - satisfactory - more than
    satisfactory - outstanding
  • Never - Infrequently - Usually - Always

61
Communication
Outcome: Speak in public situations
  • Verbal Delivery: Several - Some - Few fluency problems
  • Nonverbal Delivery: Distracting - Enhancing
  • Structure: Disconnected - Connected - Integrated
  • Evidence: Doesn't support - Sometimes supports - Always supports
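The speaking rubric above maps naturally onto a plain data structure; a minimal sketch, where the `score` helper and the 0-based scale positions are illustrative assumptions, not part of the deck:

```python
# Sketch: an outcome's rubric as data. Each component lists its anchor
# descriptions in order from lowest to highest performance.
rubric = {
    "Verbal Delivery":    ["Several fluency problems", "Some fluency problems",
                           "Few fluency problems"],
    "Nonverbal Delivery": ["Distracting", "Enhancing"],   # two-point scale
    "Structure":          ["Disconnected", "Connected", "Integrated"],
    "Evidence":           ["Doesn't support", "Sometimes supports", "Always supports"],
}

def score(ratings):
    """Map each component's chosen anchor to its position on that component's scale."""
    return {comp: rubric[comp].index(anchor) for comp, anchor in ratings.items()}

print(score({"Verbal Delivery": "Few fluency problems",
             "Evidence": "Sometimes supports"}))
# {'Verbal Delivery': 2, 'Evidence': 1}
```

Keeping the anchors in one shared structure is what gives faculty the common language the deck calls for: every rater scores against the same ordered descriptions.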
62
Rubric Resource
  • www.winona.edu/air/rubrics.htm

63
Assessing Student Learning Outcomes
  • Outcomes supported by core courses in the
    curriculum?
  • Are the student level learning outcomes developed
    throughout the curriculum?

64
Student Learning Outcomes
[Curriculum map: outcomes as rows, Courses 1-5 as columns; an X marks each course that supports an outcome.]
65
Assessing Student Learning Outcomes
  • Supported by core courses in the curriculum?
  • Orphan outcomes?
  • Empty requirements?

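The two gap checks above (orphan outcomes, empty requirements) reduce to simple set operations over a curriculum map; a minimal sketch with hypothetical course and outcome names:

```python
# Sketch: flag orphan outcomes (supported by no course) and empty
# requirements (courses supporting no outcome) in a curriculum map.
curriculum_map = {
    "Course 1": {"Speak", "Write"},
    "Course 2": {"Speak", "Listen"},
    "Course 3": set(),                 # supports no program outcome
    "Course 4": {"Write", "Listen"},
}
all_outcomes = {"Speak", "Write", "Listen", "Participate"}

covered = set().union(*curriculum_map.values())
orphan_outcomes = all_outcomes - covered
empty_requirements = [c for c, supported in curriculum_map.items() if not supported]

print(sorted(orphan_outcomes))     # ['Participate']
print(empty_requirements)          # ['Course 3']
```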
66
Student Learning Outcomes
[Curriculum map with a gap, illustrating an orphan outcome: an outcome row with no X in any course.]
67
Student Learning Outcomes
[Curriculum map with a gap, illustrating an empty requirement: a course column with no X for any outcome.]
68
Student Learning Outcomes
  • Supported by core courses in the curriculum?
  • Orphan outcomes?
  • Empty requirements?
  • Developed through the curriculum?
  • Knowledge / Comprehension
  • Application / Analysis
  • Synthesis / Evaluation

69
Student Learning Outcomes
[Curriculum map: Courses 1-4 and Cluster 1 as columns; cells marked K (knowledge/comprehension), A (application/analysis), or S (synthesis/evaluation) show where each outcome is developed through the curriculum.]
70
Assessment Methodology
71
Assessing Assessment Methodology
  • Does the assessment plan measure student
    learning, or program effectiveness?
  • Does the plan rely on direct measures of student
    learning?
  • Do the learning objects match the outcomes?

72
Assessing Assessment Methodology
  • Is there a systematic approach to implementing
    the plan?

73
Assessing Assessment Methodology
  • Does the assessment plan measure student
    learning, or program effectiveness?
  • Does the plan rely on direct measures of student
    learning?
  • Do the learning objects match the outcomes?

74
Assessment of Program Effectiveness
  • What the program will do or achieve
  • Curriculum
  • Retention
  • Graduation
  • Placement
  • Satisfaction (graduate and employer)

75
Assessment of Student Learning Outcomes
  • What students will do or achieve
  • Examines actual student work

76
Assessing Assessment Methodology
  • Does the assessment plan measure student
    learning, or program effectiveness?
  • Does the plan rely on direct measures of student
    learning?
  • Do the learning objects match the outcomes?

77
Direct Measures of Student Learning
  • Capstone experiences
  • Standardized tests
  • Performance on national licensure certification
    or professional exams
  • Locally developed tests
  • Essay questions blind scored by faculty
  • Juried review of senior projects
  • Externally reviewed exhibitions performances
  • Evaluation of internships based upon program
    learning outcomes

78
Indirect Measures of Learning
  • Alumni, employer, and student surveys (including
    satisfaction surveys)
  • Exit interviews of graduates and focus groups
    graduate follow up studies
  • Retention and transfer studies
  • Length of time to degree
  • ACT scores
  • Graduation and transfer rates
  • Job placement rates

79
Non-Measures of Student Learning
  • Curriculum review reports
  • Program review reports from external evaluators
  • Faculty publications and recognition
  • Course enrollments and course profiles
  • Faculty / student ratios, percentage of students
    who study abroad
  • Enrollment trends
  • 5 year graduation rates
  • Diversity of the student body

80
Assessing Assessment Methodology
  • Does the assessment plan measure student
    learning, or program effectiveness?
  • Does the plan rely on direct measures of student
    learning?
  • Do the learning objects match the outcomes?

81
Learning Objects
  • Standardized Exam, abstract, advertisement,
    annotated bibliography, biography, briefing,
    brochure, budget, care plan, case analysis,
    chart, cognitive map, court brief, debate,
    definition, description, diagram, dialogue,
    diary, essay, executive summary, exam, flow
    chart, group discussion, instruction manual,
    inventory, lab notes, letter to the editor,
    matching test, mathematical problem, memo, micro
    theme, multiple choice test, narrative, news
    story, notes, oral report, outline, performance
    review, plan, precis, presentation, process
    analysis, proposal, regulation, research
    proposal, review of literature, taxonomy,
    technical report, term paper, thesis, word
    problem, work of art (Walvoord & Anderson, 1998).

82
Matching Outcomes to Objects
  • Logical decision
  • Demonstrate components
  • Individual vs. group assignment?
  • Amount of feedback provided in preparation of the assignment?
  • Timing / program / semester?

83
Assessing Assessment Methodology
  • Is there a systematic approach to implementing
    the plan?
  • Not every outcome in every course by every
    faculty member every semester
  • Identified Assessment Points?

84
Student Learning Outcomes
[Curriculum map: Courses 1-5 with K/A/S cells, used to identify assessment points.]
85
Implementation Strategy
86
Assessing Implementation Strategy
  • Is the assessment plan being phased in?
  • Is there a systematic method for assessing
    student learning objects?
  • Is there a method for collecting and organizing
    data?

87
Assessing Implementation Strategy
  • Is the assessment plan being phased in?
  • Is there a systematic method for assessing
    student learning objects?
  • Is there a method for collecting and organizing
    data?

88
Phase 4
Student Learning Outcomes
[Curriculum map: Courses 1-5 with K/A/S cells, illustrating Phase 4 of phasing in the assessment plan.]
89
Assessment Methodology
  • Is there a systematic method for assessing
    student learning objects?

90
Scoring Option 1
  • Course instructor
  • Collects, assesses, and provides aggregate data
  • Assessment committee (program)
  • Consolidates and reports data to internal and
    external constituencies
  • Community
  • Discusses data, implications, priorities and
  • (if necessary) plans for improvement

91
Scoring Option 2
  • Course instructor
  • Collects assignment
  • Assessment committee (program)
  • Samples, evaluates, analyzes and reports data to
    internal and external constituencies
  • Community
  • Discusses data, implications, priorities and
  • (if necessary) plans for improvement

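Under either scoring option, someone must consolidate the per-artifact component ratings into a report; a minimal sketch of that aggregation step, with hypothetical component names and ratings:

```python
# Sketch: consolidating component ratings into a per-component report
# of how many artifacts fell at each performance level.
from collections import Counter

LEVELS = ["Does not meet", "Meets", "Exceeds"]

ratings = [                      # (component, assigned level) per artifact
    ("Verbal Delivery", "Meets"), ("Verbal Delivery", "Exceeds"),
    ("Verbal Delivery", "Meets"), ("Organization", "Does not meet"),
    ("Organization", "Meets"),
]

report = {}
for component, level in ratings:
    report.setdefault(component, Counter())[level] += 1

for component, counts in report.items():
    total = sum(counts.values())
    pct = {lvl: round(100 * counts[lvl] / total) for lvl in LEVELS}
    print(component, pct)
# Verbal Delivery {'Does not meet': 0, 'Meets': 67, 'Exceeds': 33}
# Organization {'Does not meet': 50, 'Meets': 50, 'Exceeds': 0}
```

The same structure works whether the instructor aggregates (Option 1) or the assessment committee scores a sample (Option 2); only who supplies `ratings` changes.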
92
Assessing Implementation Strategy
  • Is there a method for collecting and organizing
    data?

93
Reporting Methods
  • Tear off page
  • Carbonless copy paper
  • Reporting templates
  • Paper
  • WWW

94
Communicate Effectively: Demonstrate Oral Communication Skills
Course 5
[Report table: component rows (Verbal Delivery, Nonverbal Delivery, Organization, Evidence, Transitions) by performance columns (Does not meet / Meets / Exceeds), showing the number of students at each level.]
95
Communicate Effectively: Demonstrate Oral Communication Skills
PROGRAM WIDE COMPETENCY REPORT
[Report table: the same component rows and performance columns, aggregated across the program.]
96
Data Interpretation
97
Interpreting Data
  • The first data collection is not about the data
    per se, but about testing the methods and
    tools.

98
Interpreting Data
  • Did the assessment method distinguish between
    levels of student performance?
  • Sampling issues?
  • Inter-rater reliability?
  • Multiple measures?
  • Were patterns in the data examined?

99
Interpreting Data
  • Who was involved in the discussion of the data?
  • What plans were made to address issues of
    concern?
  • When will the results of the changes be measurable?
  • How is the information communicated to
    constituent groups?

100
Interpreting Data
  • Did the assessment method distinguish between
    levels of student performance?
  • Sampling issues?
  • Inter-rater reliability?
  • Multiple measures?
  • Were patterns in the data examined?

101
Communicate Effectively: Demonstrate Oral Communication Skills
Course 5
[Report table: nearly every student rated Exceeds on every component (95-98%), so the method fails to distinguish levels of performance.]
102
Interpreting Data
  • Did the assessment method distinguish between
    levels of student performance?
  • Sampling issues?
  • Inter-rater reliability?
  • Multiple measures?
  • Were patterns in the data examined?

103
Interpreting Data
  • Did the assessment method distinguish between
    levels of student performance?
  • Sampling issues?
  • Inter-rater reliability?
  • Training
  • Calibration
  • Decision rules
  • Support materials

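A quick way to put a number on inter-rater reliability during calibration is percent agreement between two raters scoring the same artifacts; a minimal sketch with hypothetical ratings (more robust statistics, such as Cohen's kappa, correct for chance agreement):

```python
# Sketch: percent agreement between two raters on the same five artifacts.
rater_a = ["Meets", "Exceeds", "Meets", "Does not meet", "Meets"]
rater_b = ["Meets", "Meets",   "Meets", "Does not meet", "Exceeds"]

# Fraction of artifacts on which the raters assigned the same level.
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Agreement: {agreement:.0%}")   # Agreement: 60%
```

Disagreements found this way feed directly into the training, calibration, and decision-rule work listed above.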
104
Interpreting Data
  • Did the assessment method distinguish between
    levels of student performance?
  • Sampling issues?
  • Inter-rater reliability?
  • Multiple measures?
  • Were patterns in the data examined?

105
Multiple Measures
[Diagram: the same outcome measured at three data points: DP1, DP2, DP3.]
106
Interpreting Data
  • Did the assessment method distinguish between
    levels of student performance?
  • Sampling issues?
  • Inter-rater reliability?
  • Multiple measures?
  • Were patterns in the data examined?

107
Data Patterns
  • Consistency
  • Consensus
  • Distinctiveness

108
Consistency
  • Examines the same practice of an individual or
    group over time
  • Key question
  • Has this person or group acted, felt, or
    performed this way in the past / over time?

109
Consistency
How well are students performing on the Outcome?
[Line chart: outcome performance (low to high) plotted by year, '00 through '05.]
110
Consensus
  • Comparison to or among groups of students
  • Variation between disciplines, gender, other
    demographic variables
  • Key questions
  • What is the general feeling, outcome, attitude,
    behavior?
  • Do other groups of people act, perform or feel
    this way?

111
Consensus
How well are students performing on the Outcome?
[Bar chart: outcome performance (low to high) compared across groups: females, males, transfer students, OTA.]
112
Distinctiveness
  • Examines individual or cohort perspectives across
    different situations, categories
  • Key Question
  • Does a person or group respond differently based
    upon the situation, item, issue?

113
Distinctiveness
How well are students performing on the Outcome?
[Bar chart: performance on the same outcome (low to high) across categories: content, organization, mechanics.]
114
Interpreting Data
  • Who was involved in the discussion of the data?
  • What plans were made to address issues of
    concern?
  • When will the results of the changes be measurable?
  • How is the information communicated to
    constituent groups?

115
Discussion Questions
  • Based upon your experience, does the data
    surprise you?

116
Discussion Questions
  • Does our students' performance on the outcome
    meet our expectations?

117
Discussion Questions
  • How can this data be validated? What else can we
    do to find out if this is accurate?

118
Interpreting Data
  • Who was involved in the discussion of the data?
  • What plans were made to address issues of
    concern?
  • When will the results of the changes be measurable?
  • How is the information communicated to
    constituent groups?

119
Addressing Issues
  • Learning Opportunities
  • New
  • Revised
  • Feedback
  • Curriculum
  • Faculty Development
  • Student Development

120
Interpreting Data
  • Who was involved in the discussion of the data?
  • What plans were made to address issues of
    concern?
  • When will the results of the changes be measurable?
  • How is the information communicated to
    constituent groups?

121
How Assessment Works
[Diagram: in Year 1, learning events (L Event 1-3) feed the outcome's components, establishing a BASELINE; in Years 2 and 3, new or revised events feed the same components, and results are compared against benchmarks, standards, and past performance.]
122
Interpreting Data
  • Who was involved in the discussion of the data?
  • What plans were made to address issues of
    concern?
  • When will the results of the changes be measurable?
  • How is the information communicated to
    constituent groups?

123
Communicating Assessment
  • Poster Sessions
  • Brown Bag Sessions
  • WWW sites
  • Meetings
  • Newsletters

124
Really Big Mistakes
125
Big Mistakes in Assessment
  • Assuming that it will go away
  • Allowing assessment planning to become gaseous
  • Assuming you got it right -- or expecting to get
    it right -- the first time
  • Not considering implementation issues when
    creating plans

126
Big Mistakes in Assessment
  • Borrowing plans and methods without acculturation
  • Setting the bar too low
  • Assuming you're done and everything's OK, or
    rushing to "Close the Loop"
  • Doing it for accreditation instead of improvement

127
Big Mistakes in Assessment
  • Confusing program effectiveness with student
    learning
  • Making assessment the responsibility of one
    individual
  • Assuming collecting data is "Doing Assessment"

128
Assessment 201--Beyond the Basics
Assessing Program Level Assessment Plans
  • University of North Carolina
  • Wilmington
  • Susan Hatfield
  • Winona State University
  • Shatfield@winona.edu