1
Assessing Information Literacy: How University
Libraries Can Contribute to the Measurement of
Student Skills
  • 2004 Assessment Institute
  • November 2, 2004

2
Overview
  • Past Approaches to Information Literacy (IL)
    Assessment
  • Self-Report Measures
  • Standardized Tests
  • A Different Approach to IL Assessment
  • Transforming Professional Standards
  • Using Rubrics
  • Applying Rubric Assessment to an IL Tutorial
  • Results and Future Plans

3
Past Approaches to Assessment of Information
Literacy
  • Self-Report Measures
  • Standardized Assessment of Information Literacy
    Skills (SAILS)

4
Self-Report Measures
  • The pace of the instruction was just right for
    me.
  • __ Strongly agree
  • __ Agree
  • __ Neutral
  • __ Disagree
  • __ Strongly disagree

http://www.library.ubc.ca/home/forms/studentevalform.html
What can you discover about student learning from
the answer to this question?
5
Self-Report Measures
  • The library instructor was knowledgeable and
    helpful.
  • __ Strongly agree
  • __ Agree
  • __ Neutral
  • __ Disagree
  • __ Strongly disagree

http://www.library.ubc.ca/home/forms/studentevalform.html
What can you discover about student learning from
the answer to this question?
6
SAILS
  • Standardized Assessment of Information Literacy
    Skills
  • Multiple-choice test.
  • 30 questions, delivered on the Web.
  • Purpose
  • Program evaluation
  • Cross-institutional comparison.

http://sails.lms.kent.edu/publications/aahe_files/frame.htm
7
Why this doesn't work for us
  • Does not adapt unwieldy IL Standards to a
    manageable instructional context.
  • Validity and reliability not yet demonstrated.
  • Multiple-choice, not performance focused.
  • Seems to focus on lower-end thinking skills.
  • Not testing what is taught.
  • Too far removed from instruction to be useful
    in closing the loop.

8
Why Focus on Evaluating Direct Forms of Student
Learning?
  • Indirect measures of student learning don't
    always help you understand where you can make
    improvements in your program.
  • They don't always tell you how your program
    contributes to student development and learning.
  • Multiple methods give you more evidence on which
    to base a more informed decision.
  • Direct methods help you improve programs while
    you are still delivering them (i.e., formative
    assessment).

9
Why Focus on Evaluating Student Learning?
  • "The concepts of learning, personal development,
    and student development are inextricably
    intertwined and inseparable." - The Student
    Learning Imperative
  • "Good assessment is based fundamentally on
    collaboration among colleagues. And since
    student learning takes place both inside and
    outside the classroom, some of the most
    interesting and intellectually exciting work in
    assessment involves collaboration among faculty
    and student affairs professionals." - Banta et al.

10
Why Focus on Evaluating Student Learning?, Cont.
  • "As resources decline and the competition for
    resources within institutions increases, every
    program and service must demonstrate its
    importance and worth." - Upcraft and Schuh
  • "...advances in the study of thinking and learning
    (cognitive science) and in the field of
    measurement have stimulated people to think in
    new ways about how students learn and what they
    know, what is therefore worth assessing, and how
    to obtain useful information about student
    competencies." - National Research Council

11
Why Focus on Evaluating Student Learning?, Cont.
  • "To assure that students have sufficient and
    various kinds of educational opportunities to
    learn or develop desired outcomes, faculty and
    staff often engage in curricular and
    co-curricular mapping." - Peggy L. Maki
  • Regional and some Professional Accreditation
    Agencies
  • AAC&U Greater Expectations
  • NASPA Learning Reconsidered

12
A Different Approach to Information Literacy
Assessment
  • Needs
  • To assess IL outcomes in a meaningful,
    contextualized way.
  • To integrate assessment into course curriculum.
  • To fold the assessment process into regular
    workflow.
  • To end up with data we could use to close the
    loop (and impress our friends!).

13
Our Vehicle
  • Embed assessment in the curriculum via an online
    tutorial, LOBO.
  • What is LOBO?
  • Benefits to this approach
  • Student motivation
  • Curricular fidelity
  • Potential relationship to general education
    assessment
  • Cross-campus collaboration and commitment
  • Campus visibility
  • Needed an assessment plan!

14
The Iterative, Systematic Assessment Cycle
(Adapted from Peggy Maki, Ph.D., by Marilee J. Bresciani, Ph.D.)
[Cycle diagram: Mission/Purposes, Goals, Outcomes →
Implement Methods to Deliver Outcomes and Methods to
Gather Evidence → Gather Evidence → Interpret
Evidence → Make decisions to improve programs,
enhance student learning and development, and inform
institutional decision-making, planning, budgeting,
policy, and public accountability → back to Outcomes]
15
(No Transcript)
16
Transforming Professional Standards
  • Work from official standards language.
  • Eliminate redundancy, reword for clarity.
  • Rewrite contextualized outcomes, but keep codes
    that allow you to trace their origins.

17
(No Transcript)
18
The Iterative, Systematic Assessment Cycle
(Adapted from Peggy Maki, Ph.D., by Marilee J. Bresciani, Ph.D.)
[Cycle diagram: Mission/Purposes, Goals, Outcomes →
Implement Methods to Deliver Outcomes and Methods to
Gather Evidence → Gather Evidence → Interpret
Evidence → Make decisions to improve programs,
enhance student learning and development, and inform
institutional decision-making, planning, budgeting,
policy, and public accountability → back to Outcomes]
19
Questions prompt students to demonstrate their
achievement of outcomes.
20
Answers to questions are transferred to a
printable worksheet and saved to a database.
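The slides don't show how LOBO stores these answers. As a rough, hypothetical sketch only (the table and field names are invented, not LOBO's actual schema), the capture step might look like this in Python with SQLite:

```python
import sqlite3

# Hypothetical schema: one row per student answer, keyed to the
# LOBO outcome code the question addresses (e.g., "3.2.2").
conn = sqlite3.connect("lobo_responses.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS responses (
        student_id   TEXT,
        outcome_code TEXT,  -- traces back to the ACRL/AAHE-derived outcome
        question     TEXT,
        answer       TEXT,
        submitted_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

def save_answer(student_id, outcome_code, question, answer):
    """Persist one tutorial answer so it can appear on the printable
    worksheet and later be pulled for rubric scoring."""
    conn.execute(
        "INSERT INTO responses (student_id, outcome_code, question, answer)"
        " VALUES (?, ?, ?, ?)",
        (student_id, outcome_code, question, answer),
    )
    conn.commit()

save_answer("s001", "3.2.2",
            "Would you use this web site for your paper? Why or why not?",
            "Yes; the author is a professor and the site was updated recently.")
```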
21
The Iterative, Systematic Assessment Cycle
(Adapted from Peggy Maki, Ph.D., by Marilee J. Bresciani, Ph.D.)
[Cycle diagram: Mission/Purposes, Goals, Outcomes →
Implement Methods to Deliver Outcomes and Methods to
Gather Evidence → Gather Evidence → Interpret
Evidence → Make decisions to improve programs,
enhance student learning and development, and inform
institutional decision-making, planning, budgeting,
policy, and public accountability → back to Outcomes]
22
What is a Rubric?
  • A rubric is "a set of criteria and a scoring
    scale that is used to assess and evaluate
    students' work. Often rubrics identify levels or
    ranks with criteria indicated for each level."
  • - Campbell, Melenyzer, Nettles, and Wyman, 2000
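To make this definition concrete, here is a minimal, hypothetical Python sketch of a rubric as a data structure: criteria crossed with ranked levels, loosely echoing the web site evaluation rubric described later. All names and level descriptions are illustrative, not the presenters' actual rubric.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str     # e.g., "Citing Criteria Indicators"
    levels: dict  # level name -> description of work at that level

# Illustrative rubric using the presentation's three-level scale.
website_rubric = [
    Criterion("Using Criteria Terminology", {
        "Exemplary":  "Names the criterion (e.g., 'authority', 'bias').",
        "Developing": "Describes the idea without naming it.",
        "Beginning":  "Does not address the criterion.",
    }),
    Criterion("Citing Criteria Indicators", {
        "Exemplary":  "Lists specific clues to look for on a site.",
        "Developing": "Mentions clues only vaguely.",
        "Beginning":  "Cites no indicators.",
    }),
]

for criterion in website_rubric:
    print(criterion.name, "->", list(criterion.levels))
```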

23
Why Use a Rubric?
  • Provides evaluators and those whose work is being
    evaluated with rich, detailed descriptions of
    what is being learned and what is not
  • Combats accusations that the evaluator does not
    know what he/she is looking for in learning and
    development
  • Can be used as a teaching tool: students and
    staff begin to understand what they are or are
    not learning, and what they are or are not able
    to demonstrate

24
For example
  • You can use a rubric to
  • Make meaning out of national standards and
    indicators
  • Norm faculty and staff expectations
  • Inform students of what you are looking for
  • Give students an opportunity to see how they have
    improved
  • Make rankings, ratings, and grades more
    meaningful
  • Help students identify their own learning and
    development or absence thereof
  • Assess a student, course, or a program
  • Quantify student learning

25
Using Rubrics to Evaluate Large Groups of
Students
  • You don't have to evaluate every artifact of
    learning
  • Random sample
  • Random stratified sample
  • Purposeful sample
  • Best case and worst case sample
  • Remember: the point is to get an idea of how
    well students learned in order to know what to
    improve in the delivery of that learning (see
    the sampling sketch below)
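A minimal Python sketch of the two simplest strategies above, simple random and stratified random sampling of learning artifacts. The section names and sizes are invented for illustration:

```python
import random

def simple_random_sample(artifacts, n, seed=0):
    """Draw n artifacts at random: enough to estimate how well
    students learned without scoring every submission."""
    return random.Random(seed).sample(artifacts, n)

def stratified_sample(artifacts_by_group, n_per_group, seed=0):
    """Draw n artifacts from each stratum (e.g., each ENG 101
    section) so every group is represented in the scored set."""
    rng = random.Random(seed)
    return {group: rng.sample(items, min(n_per_group, len(items)))
            for group, items in artifacts_by_group.items()}

sections = {"ENG101-01": [f"worksheet-{i}" for i in range(30)],
            "ENG101-02": [f"worksheet-{i}" for i in range(25)]}
print(stratified_sample(sections, n_per_group=5))
```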

26
A Pilot Test: Web Site Evaluation
27
(No Transcript)
28
Rubric Criteria
  • Based on LOBO Outcomes (derived from ACRL/AAHE IL
    Standards).
  • Focused on four areas of web site evaluation
  • Using Criteria Terminology
  • Citing Criteria Indicators
  • Citing Examples of Indicators from the Site
  • Judging Whether or Not to Use the Site

29
Levels of Performance
  • Exemplary → Meets outcome completely.
    (What a good answer looks like.)
  • Developing → Shows progress toward meeting the
    outcome, but does not meet it completely.
    (What a medium answer looks like.)
  • Beginning → Does not meet outcome.
    (What a poor answer looks like.)
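One hedged way to operationalize these levels in code: count how many of the four rubric areas from the previous slide a response satisfies, and map that count to a level. The thresholds below are an assumption for illustration, not the presenters' actual scoring rule:

```python
def performance_level(areas_met):
    """Map the number of rubric areas a response satisfies (terminology,
    indicators, site-specific examples, use/don't-use judgment) to a
    level. Hypothetical thresholds, for illustration only."""
    if areas_met == 4:
        return "Exemplary"    # meets the outcome completely
    if areas_met >= 2:
        return "Developing"   # progress toward the outcome
    return "Beginning"        # does not meet the outcome

print(performance_level(4))  # Exemplary
print(performance_level(2))  # Developing
print(performance_level(0))  # Beginning
```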

30
(No Transcript)
31
(No Transcript)
32
Authority
  • 68% seemed to be aware of authority issues but
    did NOT use terms like "authority,"
    "sponsorship," or "authorship" to describe them.
  • 70% could cite specific clues to web site
    authority, but only 32% gave examples from the
    site they were evaluating.
  • 44% indicated whether or not they'd use the site
    based on authority issues and said why.

33
(No Transcript)
34
Currency
  • 60% seemed to be aware of currency issues and
    used terms like "currency" or "timeliness" to
    describe them.
  • 60% could cite specific clues to web site
    currency, and 60% gave examples from the site
    they were evaluating.
  • 44% indicated whether or not they'd use the site
    based on currency issues and said why.

35
(No Transcript)
36
Bias
  • 68% seemed to be aware of bias issues and used
    terms like "bias," "perspective," or "point of
    view" to describe them.
  • 46% could cite specific clues to web site bias,
    but only 32% gave examples from the site they
    were evaluating.
  • 18% indicated whether or not they'd use the site
    based on bias issues and said why.

37
Examples of Reporting for Stakeholders
  • While 44% of students can determine whether or
    not a web site is appropriate for their purpose
    and provide a rationale for that decision based
    on authority or currency, only 18% demonstrate
    this ability based on bias.
  • (Language from LOBO Outcome 3.2.2)
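A small sketch of how percentages like those above could be tallied from scored responses. The records here are invented, not the study's data:

```python
# Hypothetical scores: criterion -> one flag per sampled student, True if
# the student judged whether to use the site and said why.
judgments = {
    "authority": [True, False, True, False, True, False, True, False, True, False],
    "currency":  [True, True, False, False, True, False, True, False, False, True],
    "bias":      [False, False, True, False, False, False, False, True, False, False],
}

for criterion, flags in judgments.items():
    pct = 100 * sum(flags) / len(flags)
    print(f"{criterion}: {pct:.0f}% judged site use and said why")
```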

38
The Iterative, Systematic Assessment Cycle
(Adapted from Peggy Maki, Ph.D., by Marilee J. Bresciani, Ph.D.)
[Cycle diagram: Mission/Purposes, Objectives/Goals,
Outcomes → Implement Methods to Deliver Outcomes and
Methods to Gather Evidence → Gather Evidence →
Interpret Evidence → Make decisions to improve
programs, enhance student learning and development,
and inform institutional decision-making, planning,
budgeting, policy, and public accountability → back
to Outcomes]
39
Before
40
After
More content!
41
After
More direction!
Better responses?
42
The Next Step
  • Applying for a grant
  • To test the consistency of the rubric approach
    to IL assessment.
  • Rater groups
  • Students
  • ENG 101 Instructors
  • Librarians
  • External Experts
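The slides do not specify how consistency across rater groups would be measured; percent agreement is one standard starting point (Cohen's kappa would additionally correct for chance agreement). A minimal sketch with invented ratings:

```python
def percent_agreement(ratings_a, ratings_b):
    """Share of artifacts that two raters placed at the same rubric level."""
    assert len(ratings_a) == len(ratings_b), "raters must score the same set"
    matches = sum(a == b for a, b in zip(ratings_a, ratings_b))
    return matches / len(ratings_a)

# Invented example: a librarian and an ENG 101 instructor score five
# student worksheets against the same rubric.
librarian  = ["Exemplary", "Developing", "Beginning", "Developing", "Exemplary"]
instructor = ["Exemplary", "Developing", "Developing", "Developing", "Exemplary"]
print(f"Agreement: {percent_agreement(librarian, instructor):.0%}")  # 80%
```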

43
Things to Remember when Working with Multiple
Reviewers
  • Agree on an outcome
  • Agree on a method of data collection
  • Agree on the meaning and definition of the
    outcome; in other words, agree on how you know
    the outcome is met and what it will look like
    when you see it met
  • Agree on the systematic implementation of the
    assignments and the rubric

44
Things to Remember when Working with Multiple
Reviewers, Cont.
  • Norm the reviewers
  • Select 3-5 artifacts of learning representing
    various levels on the rubric
  • Review them together and discuss differences of
    opinion
  • Tweak the rubric or adjust expectations
  • Or ask the reviewers to create artifacts that
    would be evaluated at the various levels of the
    rubric (then repeat the aforementioned process)

45
Things to Remember when Working with Multiple
Reviewers, Cont.
  • If reviewers help create the rubric, quite a bit
    of norming occurs naturally in the discussion
    that goes into creating it

46
Questions?
Megan Oakleaf - megan_oakleaf@ncsu.edu
www.lib.ncsu.edu/lobo2
Marilee Bresciani - mbresciani@tamu.edu