Transcript and Presenter's Notes

Title: Planning For, Interpreting & Using Assessment Data


1
Planning For, Interpreting & Using Assessment Data
  • Gary Williams, Ed.D.
  • Instructional Assessment Specialist, Crafton
    Hills College
  • gwilliams@craftonhills.edu
  • Fred Trapp, Ph.D.
  • Administrative Dean, Institutional
    Research/Academic Services, Long Beach City
    College
  • ftrapp@lbcc.edu
  • October 2007

2
Goals of the Presentation
  • De-mystify the assessment process
Provide practical approaches and examples for
    assessing student learning
  • Answer questions posed by attendees that pertain
    to their assessment challenges

3
What's It All About?
  • An ongoing process aimed at understanding and
    improving student learning.
  • Faculty making learning expectations explicit and
    public.
  • Faculty setting appropriate standards for
    learning quality.

4
What's It All About?
  • Systematically gathering, analyzing, and
    interpreting evidence to determine how well
    student performance matches agreed-upon faculty
    expectations and standards.
  • Using results to document, explain, and improve
    teaching and learning performance.
  • Tom Angelo,
    AAHE Bulletin, November 1995

5
Roles of Assessment
  • We assess to assist, assess to advance, assess
    to adjust:
  • Assist: provide formative feedback to guide
    student performance
  • Advance: summative assessment of student
    readiness for what's next
  • Adjust: continuous improvement of curriculum
    and pedagogy
  • Ruth Stiehl,
    The Assessment Primer: Creating a Flow of
    Learning Evidence (2007)

6
Formulating Questions for Assessment
  • Curriculum is designed backwards; students
    journey forward
  • What do students need to DO "out there" that
    we're responsible for "in here"? (Stiehl)
  • Subsequent roles in life (work or future study,
    etc.)
  • How do students demonstrate the intended learning
    now?
  • What kinds of evidence must we collect and how do
    we collect it?

7
Assessment Questions & Strategies: Factors to
Consider
  • Meeting Standards
  • Does the program meet or exceed certain
    standards?
  • Criterion-referenced, commonly against state or
    national standards
  • Comparing to Others
  • How does the student or program compare to
    others?
  • Norm-referenced, against other students,
    programs, or institutions

8
Assessment Questions & Strategies: Factors to
Consider
  • Measuring Goal Attainment
  • Does the student or program do a good job at what
    it sets out to accomplish?
  • Internal reference: goals and educational
    objectives compared to actual performance.
  • Formative and student-centered.
  • Professional judgment about evidence is common.

9
Assessment Questions & Strategies: Factors to
Consider
  • Developing Talent and Improving Programs
  • Has the student or program improved?
  • How can the student's program and learning
    experience be improved even further?
  • Formative and developmental.
  • Variety of assessment tools and sources of
    evidence.

10
Choosing Assessment Tools
  • Depends upon the unit of analysis
  • Course
  • Program
  • Degree/general education
  • Co-curricular
  • Also depends upon overall learning expectations

11
Formulating Assessment Strategies
  (Slides 11-13 shared this title; their content
    is not captured in this transcript.)
14
Direct vs. Indirect Evidence
  • Direct
  • What students can actually do or demonstrate
    they know
  • Can be witnessed with one's own eyes
  • Setting is structured/contained
  • Indirect
  • What students say they can do
  • Focus on the learning process or environment
  • Things from which learning is inferred
  • Setting is not easily structured/contained

15
Qualitative vs. Quantitative
  • Qualitative
  • Words
  • Categorization of performance into groups
  • Broad emergent themes
  • Holistic judgments
  • Quantitative
  • Numbers
  • Individual components and scores
  • Easier calculations, comparisons, and
    presentation to a public audience (see the
    sketch below)
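
  A minimal Python sketch of the kind of summary
  that quantitative evidence supports; the rubric
  scores and the benchmark score of 3 ("meets
  expectations") are hypothetical:

    from statistics import mean, median

    scores = [4, 3, 2, 4, 3, 3, 1, 4, 2, 3]  # one rubric score (1-4) per student

    print(f"Mean: {mean(scores):.2f}, Median: {median(scores)}")
    share = sum(s >= 3 for s in scores) / len(scores)  # at or above benchmark
    print(f"Meeting the benchmark: {share:.0%}")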

16
Formative vs. Summative
  • Formative: assessment for learning
  • In progress
  • Provides corrective feedback
  • Establishes foundational learning for the next
    step
  • Summative: assessment for evaluative purposes
  • After the fact
  • Determines progress/achievement/proficiency
  • Readiness for the next step/role/learning
    experience

17
Means of Assessment (Quantitative Judgments)
  • Cognitive
  • Standardized exams
  • Locally developed exams
  • Attitudes/beliefs
  • Opinion surveys of students, graduates, and
    employers (tallied in the sketch below)
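
  A minimal Python sketch of tallying Likert-type
  survey responses; the options and the responses
  are hypothetical:

    from collections import Counter

    responses = ["Agree", "Strongly agree", "Neutral", "Agree",
                 "Disagree", "Agree", "Strongly agree", "Neutral"]
    counts = Counter(responses)
    for option in ["Strongly agree", "Agree", "Neutral",
                   "Disagree", "Strongly disagree"]:
        n = counts[option]  # a Counter returns 0 for unseen options
        print(f"{option:<18} {n:>2}  ({n / len(responses):.0%})")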

18
Means of Assessment (Qualitative Judgments)
  • Cognitive
  • Embedded classroom assignments
  • Behavior/performances (skills & applications)
  • Portfolios
  • Public performances
  • Juried competitions
  • Internships
  • Simulations
  • Practical demonstrations
  • Attitudes/beliefs
  • Focus groups

19
Interpreting Results: How Good Is Good Enough?
  • Norm Referencing
  • Comparing student achievement against other
    students doing the same task
  • Criterion Referencing
  • Criteria and standards of judgment developed
    within the institution (both approaches are
    contrasted in the sketch below)
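
  A minimal Python sketch contrasting the two ways
  of interpreting one exam score; the student
  score, the peer scores, and the 70-point cut
  score are hypothetical:

    from statistics import mean, stdev

    peers = [55, 62, 68, 70, 71, 74, 78, 80, 85, 91]  # same task, other students
    student = 74

    # Norm referencing: standing relative to other students
    z = (student - mean(peers)) / stdev(peers)
    outscored = sum(p < student for p in peers) / len(peers)
    print(f"z = {z:+.2f}; outscored {outscored:.0%} of peers")

    # Criterion referencing: standing relative to a locally set standard
    cut = 70
    print("Meets the standard" if student >= cut else "Below the standard")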

20
Are Results Valid and Reliable?
  • Validity: does the measure capture the intended
    learning?
  • Reliability: are results consistent across
    items, raters, and occasions? (see the sketch
    below)
  • Authentic assessment
  • Important questions or merely easy questions?
  • Do the results inform teaching and learning?
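
  Reliability is often checked with an
  internal-consistency statistic such as Cronbach's
  alpha; a minimal Python sketch over a
  hypothetical item-score matrix:

    from statistics import variance

    # Rows = students, columns = scores on four exam items (hypothetical).
    items = [
        [3, 4, 3, 4],
        [2, 2, 3, 2],
        [4, 4, 4, 3],
        [1, 2, 2, 2],
        [3, 3, 4, 4],
    ]

    k = len(items[0])                                   # number of items
    item_vars = [variance([row[i] for row in items]) for i in range(k)]
    total_var = variance([sum(row) for row in items])   # variance of total scores
    alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
    print(f"Cronbach's alpha: {alpha:.2f}")  # rough rule of thumb: >= 0.7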

21
How Does Assessment Data Inform Decision-Making?
  • Goal: making sound curricular and pedagogical
    decisions based on evidence
  • Assessment questions are tied to instructional
    goals.
  • Assessment methods yield data that is valid
    and reliable.
  • A variety of measures are considered.
  • Assessment is an ongoing cycle.

22
Assessment Process
  (Slide content not captured in this transcript.)
23
Collaboration Among Faculty, Administration &
Researchers
  • Assessment, the auto, and a road trip: an
    analogy
  • Who should drive the car?
  • Who provides the car, gas, insurance and
    maintenance?
  • Who brings the maps, directions, repair manual,
    tool kit, first aid kit, and stimulates the
    conversation along the journey?

24
Why Faculty Are the Drivers
  • Faculty have the primary responsibility for
    facilitating learning (delivery of instruction)
  • Faculty are already heavily involved in
    assessment (classroom, matriculation)
  • Faculty are the content experts
  • Who knows better what students should learn than
    faculty?

25
Who Provides the Car and Keeps Gas in It?
Administrators!
26
Role of Administrators
  • Establish that an assessment program is important
    at the institution
  • Ensure the college's mission and goals reflect
    a focus on student learning
  • Institutionalize the practice of data-driven
    decision making (curriculum change, pedagogy,
    planning, budget, program review)
  • Create a neutral, safe environment for dialogue

27
Where Does IR Sit in the Car?
28
Roles of Researchers
  • Serve as a resource on assessment methods
  • Assist in the selection/design and validation of
    assessment instruments
  • Provide expertise on data collection, analysis,
    interpretation, reporting, and use of results
  • Facilitate dialogue - train and explain
  • Help faculty improve their assessment efforts

29
Faculty DON'Ts...
  • Avoid the SLO process or rely on others to do it
    for you.
  • Rely on outdated evaluation/grading models to
    tell you how your students are learning.
  • Use only one measure to assess learning.
  • Criticize or inhibit the assessment efforts of
    others.

30
Faculty DOs...
  • Participate in the SLO assessment cycle.
  • Make your learning expectations explicit.
  • Use assessment opportunities to teach as well as
    to evaluate.
  • Dialogue with colleagues about assessment methods
    and data.
  • Focus on assessment as a continuous improvement
    cycle.

31
Questions From the Field