Assessment for Information Literacy: theoretical and practical issues

1
Assessment for Information Literacy: theoretical
and practical issues
  • Sheila Webber, Department of Information Studies,
    University of Sheffield, UK
  • January 2004
  • Danmarks Forskningsbiblioteksforening

2
Outline
  • What do I mean by assessment?
  • Why is assessment important?
  • Four factors in assessment
  • Issues with current practice of assessment
  • Example from an information literacy class
  • Project - UK academics' conceptions of, and
    pedagogy for, information literacy (examples from
    Chemistry and English lecturers)

3
Assessment?
  • Assessing outcomes of student learning
  • Formative assessment - giving feedback and
    guidance which enables student to develop
  • Summative assessment - judging how student has
    performed, what standard achieved
  • "integral part of of facilitating learning and
    understanding" (Webber and Johnston)
  • Assessment of programmes or teachers

4
"It has long been recognised that probably the
biggest influence on a student's approach to
their studies is the assessment regime of the
course." Rust, C. (2001, p. 11)
5
"the crucial thing, I think, is that you do have
to tie the literacy exercises to application to
the discipline which is assessed in some way,
frankly, because if not, the ones who need it
most will do it less" - Civil Engineering
lecturer, interviewed for our project
6
No simplistic model for IL assessment
  • Assessment in context of teaching, learning and
    course design
  • Complex assessment as befits the definition of IL

7
Designing assessment in practice
  • Common factors
  • Modes of assessment (self-, peer or expert
    assessment), expressed by
  • Tasks, activities and products of assessment
    (individual and group)

Bill Johnston & Sheila Webber, 2002
8
4 Common factors
  • 1. Assessment should address a blend of purposes
  • Diagnosis
  • Formative feedback for improvement
  • Summative feedback for judgement
  • Course evaluation, quality audit
  • (but you may need to address different purposes
    through different exercises - be clear which you
    are addressing)

Bill Johnston & Sheila Webber, 2002
9
  • 2. Assessment regime should display certain
    conditions e.g.
  • relevance, consistency, authenticity,
    practicality
  • N.B. it might be "practical" but meaningless!

Bill Johnston & Sheila Webber, 2002
10
  • 3. Recording of assessment should take variety of
    forms e.g.
  • transcripts of test results, portfolios, learning
    diaries
  • 4. Assessment should address the learner's
    concept of, and approach to, learning e.g.
  • Quantitative/qualitative; Surface/deep

Bill Johnston & Sheila Webber, 2002
11
Complication: Potential conflict in librarians'
role?
12
Conflict?
  • Educator
  • Consultant
  • Mentor
  • Facilitator
  • Change agent
  • Service role

vs.
  • Customer always right
  • Demystifying, downplaying expertise
  • Need to justify & benchmark what you do
  • Expert judgement
  • Negative and positive feedback
  • Body of knowledge, commanding respect
13
Plus
  • Short time that librarians are "allowed" with
    students in many institutions
  • Small number of librarians serving large number
    of students
  • Academics/ managers overestimating the
    information literacy of students (i.e. not
    recognising how much students need to learn)

14
Result?
  • Concentration on
  • Purpose: Justification of librarians' work,
    course/teacher evaluation, diagnosis. Feedback to
    student may be less than you would want
  • Conditions: Consistency and practicality (rather
    than authenticity and relevance)
  • Recording: quizzes and tests (rather than
    reflective accounts, portfolios etc)
  • Approaches to learning: May not engage student
    deeply (following on from points above)

15
  • "Be suspicious of the objectivity and accuracy of
    all measures of student ability and conscious
    that human judgment is the most important element
    in every indicator of human achievement"
  • Ramsden, quoted in Biggs, J. (1999, p. 159).

16
  • The Bay Area Community Colleges Information
    Competency Assessment Project
  • "The Project's purpose to develop a
    challenge-out or credit-by-exam instrument that
    can be used and/or modified at community colleges
    that have an information competency requirement"
  • developed and field-tested an information
    competency assessment instrument.
  • Part A: 47 multiple choice, matching, and short
    answer items
  • Part B: 12 performance-based exercises
  • http://www.topsy.org/ICAP/ICAProject.html

17
  • Standardized Assessment of Information Literacy
    Skills (SAILS)
  • "to develop an instrument for programmatic level
    assessment of information literacy skills that is
    valid and thus credible to university
    administrators and other academic personnel. We
    envisioned a tool to measure information literacy
    that:
  • Is standardized
  • Contains items not specific to a particular
    institution or library
  • Is easily administered
  • Has been proven valid and reliable
  • Assesses at institutional level
  • Provides for both external and internal
    benchmarking"
  • Multiple choice questions based on ACRL standards
  • http://sails.lms.kent.edu/index.php

18
Quebec study
  • 3,003 students
  • 20 questions covering: Concept identification;
    Search strategy; Document types; Search tools;
    Use of results
  • Results under 11 for 12 of these
  • Report includes detailed explanation of Qs

Mittermeyer, D. and Quirion, D. (2003)
19
CAUL Information Literacy Assessment Instrument
  • Development led by Ralph Catts
  • Self reporting questionnaire
  • Based on the Australian IL Standards
  • Example: 'Stores and manages information'
  • E.g. 'When I research a topic I use tools such as
    EndNote to organise the information'
  • http://vefir.unak.is/CKIII/speakers.htm

20
Comments
  • Some interesting work, but also a note of caution
    e.g.
  • These exercises have taken years to develop and
    are time-intensive to maintain
  • Will need to be updated e.g.
  • Students become "questionnaire savvy"
  • Reflect actual utility of web etc.
  • Emphasis on multiple choice etc. may not sit
    comfortably with approaches to assessment of
    student learning in other countries
  • Best at addressing lower-order (simpler) aspects
    of information literacy

21
Examples of different approaches
  • From stand-alone Information Literacy module
  • From academics interviewed for our project
  • n.b. I do know there are interesting examples of
    assessment of information literacy elsewhere!

22
Example of mixing factors & modes
  • Module taken by students on BSc Information
    Management - 25 this year
  • Level 1 semester 1
  • 20 credits (i.e. a third of what they do in this
    semester)
  • 3 hours most weeks: 1 hr lecture followed by 2
    hours in computer lab
  • WebCT to support class

23
Information Literacy: our definition
Information literacy is the adoption of
appropriate information behaviour to identify,
through whatever channel or medium, information
well fitted to information needs, leading to wise
and ethical use of information in
society. (Johnston & Webber)
24
SCONUL 7 pillars of information literacy
  • Recognise information need
  • Distinguish ways of addressing gap
  • Construct strategies for locating
  • Locate and access
  • Compare and evaluate
  • Organise, apply and communicate
  • Synthesise and create
(The pillars run from Basic Library Skills & IT
Skills towards Information Literacy)
http://www.sconul.ac.uk/
25
  • (10%) Review of a website, article or book
  • (50%) Reflection on achievement in each of SCONUL
    7 pillars (1,500-1,750 words) plus portfolio of
    evidence including:
  • Before/after mindmaps
  • Bibliography produced for student client
  • Presentations
  • Feedback from student client
  • Anything else (e.g. other classes)
  • (40%) Examination

26
Reflection/ portfolio
  • Aims
  • To reflect on your understanding of information
    literacy
  • To improve your information searching skills by
    carrying out and evaluating a search for a
    real-life client
  • To provide the client with relevant information
  • To familiarise yourself with specific information
    sources
  • Standard coursework feedback sheet & individual
    comments

27
  • What they don't get marks for includes:
  • Feedback on ppt presentation on infolit
    strengths/weaknesses in week 2 (from teaching
    staff & peers)
  • Feedback on ppt presentation of group search task
    in week 6 (from teaching staff)
  • Feedback from peer and lecturer on reference
    interview in week 5/6
  • Feedback on "bibliography" from student client in
    week 10
  • It can be used as evidence in their portfolio

28
1. Short talk about evaluating information;
   exercises identifying & evaluating websites in
   pairs; ppts of evaluations posted to discussion
   list, some presented
2. Further material on evaluating, including
   "Internet Detective"
3. Examining how other people evaluate or review
4. Short review of a website, article or book on
   information literacy (marked); etc. etc.
Group exercise searching & evaluating information
on MMR vaccine
29
1. I and a colleague play 2 scenes in which a
   librarian and information scientist do poor
   interviews; students asked for feedback on what
   went well/badly; short lecture on interviewing
   techniques; further reading on interviewing given
2. Tutorial: interview a fellow student "client" to
   find out what information the client wants;
   after each interview, interviewer, interviewee &
   tutor write down comments, then give verbal
   feedback; written comments copied to interviewer
3. Student reflects on interview in portfolio
   (marked)
30
The Research project
31
UK Higher Education context
  • Big expansion in number of students, without big
    expansion of number of lecturers
  • Polytechnics become universities too
  • Students having to pay for their education
    (part-time jobs, less time for study)
  • Research Assessment Exercise
  • Teaching quality assessment

You get marks! You get money!
32
Good & bad
  • Good includes
  • Departments cannot ignore teaching!
  • "Official" view of good teaching learning
    assessment includes moving away from
    "transmission", passive "rote" learning, exams
  • Lecturers encouraged to develop pedagogy (e.g.
    Institute for Learning Teaching in Higher
    Education)
  • But research still more important in many
    universities!

33
Project description
  • Three-year, £130,000 Arts & Humanities Research
    Board-funded project (November 2002 - October
    2005)
  • To explore UK academics' conceptions of, and
    pedagogy for, information literacy

34
Key aims
  • Investigate academics' educational practice as
    regards information literacy
  • Identify whether there are differences in
    conception and practice in different disciplines

35
Approach
  • Phenomenographic study: interviews and analysis
    (months 1-21)
  • 20 interviews x 4 disciplines (73 so far)
  • Hard Pure: Chemistry
  • Hard Applied: Civil Engineering
  • Soft Pure: English Literature
  • Soft Applied: Marketing
  • Survey of wider practice: questionnaires and
    analysis (months 22-36)

36
Interviews
  • Approx. 1 hour each
  • 3 basic questions
  • What is your conception of IL?
  • How do you engage your students in IL?
  • Do you assess IL directly or indirectly?
  • What is your conception of the Information
    Literate University?

37
Chemistry 1
  • Interviewee: "…we've wedged it in and it's
    there, and it's a ten credit course, a half time
    credit course"
  • Interviewer: "And it's assessed directly?"
  • Interviewee: "It is assessed directly. The tasks
    that the students perform, they submit
    assessments on that, as far as I understand it. I
    don't run the course. And they are assessed on
    outcomes of those tasks"

38
Chemistry 2
  • "They produce a portfolio. In fact, we are
    beginning to use the Royal Society of Chemistry's
    sort of idea of a portfolio for our students and
    the work that they do, in some of these tasks,
    goes into the portfolio. If they are in a
    foundation degree this is assessed as a part of a
    module in fact, but if they are in a BSc degree
    or MChem degree it's just something that they
    they have to do, and not actually marks. It will
    be assessed more strictly in the future, I
    think."
  • Awareness of need for skills for career

39
Chemistry 3
  • "Well, information, I want students to be able to
    access it, work with it, to use it to push them
    forward." "There is still the bottom line that
    we've got to do Chemistry, so we tend to teach
    the transferable skills in a chemistry
    environment rather than chemistry over here and
    transferable skills over there…" "It's assessed
    and they'll get to know the marks"
  • Marks for presentation: oral, poster, webpage etc.

40
Chemistry 4
  • "That would be indirectly through assignments and
    things. I mean, I think sometimes with the first
    years, they may have got something where they are
    told to find some references, I think there may
    be that"

41
English 1
  • "that is most directly assessed in our third term
    where we actually ask students to do a research
    project and we assess what they've found, how
    well they've used it, and how they you know, we
    try to monitor, for example, they get a reading
    log where they have to fill in the references
    they are looking at and that translates into a
    bibliography and we actually try to get them to
    reflect back on what they have learned from doing
    this research project"

42
English 2
  • Example of assessed group exercise concerning
    18th Century material - researching a subject and
    presenting findings

43
English 3
  • "Indirectly, I think, things like referencing
    and whether or not they've managed to find books
    and use arguments" "I engage with information
    literacy but I don't assess it directly"
  • Self-assessment sheet ("they know they've got to
    hand it in") in which student lists problems
    encountered and strengths of sources, plus mark
    they think they'll get

44
English 4
  • "Yeah well, everything you ask students to do
    should be linked somehow to assessment otherwise
    they are not going to take it seriously. That's
    how I sell it, but then they usually all come
    back and say how much fun it was and that's like
    extra bonus, but for example, in the xxxx
    course the students had to write something every
    week on the bulletin board there and that was
    only linked to assessment in the non-assessed
    part of the course, so when they've done 10 of
    these things they'll get 15% of the course"

45
English 5
  • "Um [pause] I suppose indirectly assessed in
    that students are required to write essays which
    should reflect that criticism and those critical
    skills, and they are expected to do
    bibliographies for each of their essays as well,
    um…"
English 6
  • "It's a means to an end, it's not the end"

46
English 7
  • "It's not, no, I don't do assignments in that,
    because I'm not teaching a computer degree. I'm
    teaching. These are resources. I wouldn't, you
    know, because it makes the students feel small, I
    make sure they know the pathways, but I wouldn't
    do assessment"

English 8
  • Interviewer: "What about their information skills,
    would those be assessed in any way?"
  • Interviewee: "No"

47
Some thoughts
  • Academics valuing evaluation, use & presentation
    of information (and also good citation practice)
  • In Chemistry (& engineering) emphasis on
    apprenticeship for real world seems to strengthen
    interest
  • Relevance & authenticity important
  • Range of assessment methods
  • Variation by discipline, but also within it

48
Contacts
  • Sheila Webber s.webber@sheffield.ac.uk
  • http://ciquest.shef.ac.uk/infolit/ - weblog
  • http://dis.shef.ac.uk/literacy/

49
References
  • Biggs, J. (1999) Teaching for quality learning at
    university. Buckingham: OUP.
  • Catts, R. (2003) Information Skills Survey for
    Assessment of Information Literacy in Higher
    Education: Administration Manual. Canberra:
    Council of Australian University Librarians.
    ISBN 0 86803 999 3. 100 Australian dollars +
    GST. Payment to Council of Australian University
    Librarians, LPO Box 8169 (Licensed Post Office),
    ANU, Canberra ACT 2601, Australia
  • Darling-Hammond & Snyder, cited by Elton, M. &
    Johnston, B. (2002) Assessment in universities: a
    critical review of research. York: Learning and
    Teaching Support Network.
    http://www.ltsn.ac.uk/embedded_object.asp?id=17161&prompt=yes&filename=ASS013

50
  • Mittermeyer, D. and Quirion, D. (2003)
    Information Literacy: Study of Incoming
    First-Year Undergraduates in Quebec. Québec:
    Conférence des recteurs et des principaux des
    universités du Québec.
    http://crepuq.qc.ca/documents/bibl/formation/studies_Ang.pdf
  • Mogg, R. (2002) An investigation into the
    information literacy skills needs of first-year
    undergraduates and into an appropriate method of
    assessing incoming students' information literacy
    abilities at Cardiff University. MA dissertation.
    University of Sheffield.
    http://dis.shef.ac.uk/dispub/ (search on Mogg)

51
  • Rust, C. (2001) A briefing on assessment of large
    groups. York: Learning and Teaching Support
    Network.
    http://www.ltsn.ac.uk/embedded_object.asp?id=17152&prompt=yes&filename=ASS012
  • Webber, S. and Johnston, B. (2003) "Assessment
    for information literacy: vision and reality."
    In Martin, A. and Rader, H. (Eds) Information
    and IT literacy: enabling learning in the 21st
    Century. London: Facet. pp. 101-111.

52
Learning design
[Diagram: model of learning design for IL, linking
learning purposes; information rich; proactive;
alignment of T/L/A for IL; design of learning &
teaching; evaluation/redesign;
constructivist/relational; developmental;
assessment of learning; credit bearing; complex]
Bill Johnston & Sheila Webber, 2002