Information literacy training: How can we know we are not wasting our time?

Transcript and Presenter's Notes

1
Information literacy training: How can we know we
are not wasting our time?
  • Andrew Booth, Reader in Evidence Based
    Information Practice, ScHARR, University of
    Sheffield

2
A Valentine's Day Present!
The Chief Executive, Allan Sack-arine, has just
proposed that all information skills training in
your organisation be scrapped and staff spend
their time on more productive activities instead.
"What line of argument/piece of evidence would
you use to justify continuation of this service?"
3
Information skills training (IST): a systematic
review of the literature (Brettle, 2003)
  • RESULTS
  • Study designs included randomised controlled
    trials, cohort designs and qualitative studies.
    Most took place in US medical schools. Wide
    variations in course content and training
    methods. 8 studies used objective testing, 2
    compared training methods and 2 examined effects
    on patient care.
  • Limited evidence to show training improves
    skills, insufficient evidence to determine most
    effective methods of training and limited
    evidence to show training improves patient care.
    Further research is needed.

4
What do we typically evaluate?
  • Was this training session a pleasurable
    experience for you?
  • ☺ ☺ ☺
  • We know this because they usually mention the
    biscuits! Or the heat of the room!
  • Compared with...? Blood, faeces, urine,
    gangrene, death?

5
Evaluation bypass
  • We interpret mentions of the biscuits or the heat
    of the room as "at least this means that there
    is nothing else more important to worry about"
    (false positives).
  • Whereas it could mean that this is what they
    consider a successful outcome from the session
    OR
  • That we have designed a good instrument for
    assessing their physical comfort!

6
Why evaluate your training?
  • To help decide whether techniques and methods
    used are worthwhile and making a difference
    (evidence based library and information practice)
  • To help understand whether we are making the best
    use of our resources
  • To demonstrate a need for further funding or a
    redistribution of funding
  • To help us improve our service
  • To help redesign materials and methods
  • Towards assessing library performance

NLH Librarians DL-Net
7
How should we evaluate?
  • We should evaluate
  • The right population
  • The right intervention
  • The right comparison
  • The right outcomes using
  • The right outcome measurement tool at
  • The right measurement interval
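
As a purely illustrative aside (not part of the original deck), the six
"rights" above can be captured as a small data structure so that every
evaluation plan records the same decisions; all field names and example
values below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    """One record per training evaluation, mirroring the six 'rights' above."""
    population: str            # who is evaluated, e.g. all new clinical staff
    intervention: str          # the training being assessed
    comparison: str            # the alternative: nothing, placebo, mediated searching...
    outcomes: list             # knowledge, attitudes, behaviour, patient care...
    measurement_tool: str      # the (ideally validated) instrument used
    measurement_interval: str  # immediately, 3 months, 6 months...

# A hypothetical plan for one session:
plan = EvaluationPlan(
    population="volunteer junior doctors",
    intervention="one-hour hands-on database searching session",
    comparison="no instruction",
    outcomes=["knowledge", "search behaviour"],
    measurement_tool="objective skills test",
    measurement_interval="6 months",
)
```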

8
The right population
  • Most studies are conducted on students (well,
    they couldn't get rats or guinea pigs!)
  • They usually involve volunteers or those
    fulfilling a mandatory requirement, not general
    populations

9
And what about the non-users: how do we reach
them?
10
And one size does not fit all
  • Training courses are designed around an average
    user (the one with 2.4 children!)
  • At any point in time almost half our users are
    getting more than they want and the other
    (almost) half are getting stuff they already know
  • They either decide what they need (and they are
    notorious for not being able to self-assess their
    competency, e.g. as beginner, intermediate or
    advanced) OR we decide what is best for them

11
Not having information skills training can be...
  • "good because you have to learn the hard way.
    You tend to have better skills if you have had to
    find your own way around - but it would've been
    nice to have been eased into it, in a bit more
    structured way as well"
  • Goodall, D. L. and Brophy, P. (1997). A comparable
    experience? Library support for franchised
    courses in higher education. British Library
    Research and Innovation Report 33. Preston:
    CERLIM, University of Central Lancashire.

12
The right intervention
  • Most training is stand-alone and/or opportunistic
  • Often at the beginning of courses or included in
    inductions
  • We have no idea what is the right dosage!
  • Usually in classrooms/laboratories away from the
    workplace
  • Tension between real-life examples and examples
    that work

13
If information skills training were a drug, it
would never get licensed!
  • But at least we aren't doing any harm...

14
Can we be sure?
  • Concept of opportunity cost: what could we be
    doing instead? What could they be doing instead?
  • If the added value of our training is close to
    zero, is it worth doing? Is it ethical to keep
    on doing it if we don't know that it works?
  • Are librarians partly/totally responsible for
    anxieties regarding information
    overload/information explosion?
  • Might our teaching them the "proper way" to
    search actually impair their effectiveness if
    all roads lead to Rome (or Google)?

15
Standalone versus integrated
Coomarasamy A, Khan KS. BMJ 2004;329:1017-9.
16
Effective Methods for Teaching Information
Literacy Skills to Undergraduate Students
(Koufogiannakis & Wiebe, 2006)
  • METHODS
  • To assess which library instruction methods are
    most effective at undergraduate level
  • 4356 citations retrieved from 15 databases. From
    257 full articles, 122 unique studies underwent
    data extraction and critical appraisal. 55 met
    quality criteria. 16 provided sufficient
    information for meta-analysis. 79 studies (65%)
    used experimental/quasi-experimental methods.
    Most focused on traditional teaching, followed by
    computer assisted instruction (CAI), and
    self-directed independent learning. Outcomes
    correlated with Bloom's lower levels of learning
    (Remember, Understand, Apply).

17
Effective Methods for Teaching Information
Literacy Skills to Undergraduate Students
(Koufogiannakis & Wiebe, 2006)
  • RESULTS
  • Traditional instruction vs no instruction: 12/16
    studies found a positive outcome, and
    meta-analysis of 4 studies favoured traditional
    instruction. 14 studies compared CAI vs
    traditional instruction, with a neutral result
    confirmed by meta-analysis. 6 compared
    self-directed independent learning with no
    instruction, with a positive result confirmed by
    meta-analysis.
  • CAI is as effective as traditional instruction.
    Traditional instruction and self-directed
    independent instruction are more effective than
    no instruction. Future research needs to compare
    active learning, computer assisted instruction,
    and self-directed independent learning.
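
The deck does not say which meta-analytic model the review used; as a
minimal sketch under that caveat, the standard fixed-effect,
inverse-variance pooling weights each study's effect size by the
reciprocal of its variance (all numbers below are made up):

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance (fixed-effect) pooled estimate with a 95% CI.

    effects   -- per-study effect sizes (e.g. standardised mean differences)
    variances -- per-study variances of those effect sizes
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled effect
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Made-up numbers standing in for four studies of traditional vs no instruction.
pooled, ci = fixed_effect_pool([0.4, 0.6, 0.2, 0.5], [0.04, 0.09, 0.05, 0.06])
print(f"pooled effect = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```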

18
The right comparison
  • What is the right comparison for our information
    literacy training?
  • Doing nothing? (Great!)
  • Placebo? (We're pretty good at delivering
    something that looks like the real thing but may
    not have an active ingredient!)
  • Other forms of education?
  • Obtaining information for them?
  • Do we just assume that the importance of
    information skills training is self-evident? Is
    this why we now call it information literacy?

19
Information skills training versus mediated
searching
  • To compare effectiveness and costs of mediated
    searches and information-skills training.
  • In terms of satisfaction and use of skills, both
    mediated searches and information skills training
    are effective. Cost-effectiveness depends on
    whether costs are viewed from a library or trust
    point of view. Providing information skills
    training does not reduce the volume of
    mediated-search requests.
  • Neither method is more effective/cost-effective
    than the other. Decisions cannot be made on
    effectiveness or costs alone; the views of library
    staff and professionals should also be taken into
    account. A proactive approach and targeting
    training towards those most likely to benefit may
    be appropriate.

Brettle A et al (2006) HILJ 23(4), 239-247
20
The right outcomes
Knowledge → Attitudes → Behaviour → Outcomes (Health)
Moving rightwards gives more meaningful measures,
but it is more difficult to establish cause and
effect, and more robust designs are required.
21
Information literacy outcomes - questionnaires
  • Knowledge: Which of the following best describes
    the Cochrane Library?
  • Attitudes: Rank the following databases in the
    order you are most likely to use them
  • Behaviour: How many times have you used PsycLit
    since the training session?
  • Outcomes: Please give an instance where your use
    of the Cochrane Library has impacted on patient
    care

22
Information literacy outcomes: other methods
  • Knowledge: Objective Structured Clinical
    Examination (OSCE)
  • Attitudes: Likert scales
  • Behaviour: Observation, transaction-logging
    software
  • Outcomes: Observation, patient notes, hospital
    records, critical incident technique
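
Of these, transaction logging is the easiest to automate. A minimal
sketch of the idea, assuming a hypothetical tab-separated log of user,
date and database (no particular logging product's format is implied):

```python
from collections import Counter
from datetime import date

def sessions_since(log_lines, cutoff):
    """Count sessions per (user, database) on or after a cutoff date.

    Each line is assumed to look like 'user_id<TAB>YYYY-MM-DD<TAB>database'.
    """
    counts = Counter()
    for line in log_lines:
        user, day, database = line.rstrip("\n").split("\t")
        if date.fromisoformat(day) >= cutoff:
            counts[(user, database)] += 1
    return counts

# Hypothetical log entries; the cutoff is the date of the training session.
log = ["u01\t2006-03-01\tPsycLit", "u01\t2006-03-09\tMEDLINE", "u02\t2006-02-01\tPsycLit"]
print(sessions_since(log, date(2006, 2, 15)))
```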

23
From output to outcome to impact
Training received (output) → Changes to clinical
decisions (outcome) → Patients getting better
(impact)
24
What is impact?
  • "any effect of a service, product or other event
    on an individual or group. It
  • may be positive or negative
  • may be what was intended or something entirely
    different
  • may result in changed
  • attitudes
  • behaviours
  • outputs (i.e. what an individual or group
    produces during or after interaction with the
    service)
  • may be short or long term
  • may be critical or trivial."

Brophy, 2005
25
We need evidence on Impact!
  • "Research that can provide rigorous evidence of
    outcomes is needed for managers to make decisions
    that will maximise the impact of library and
    information services... The Evidence Based
    Librarianship movement proposes new standards for
    research that can be applied to outcomes research
    and also to the extensive work being done on
    service quality and satisfaction."
  • Source: Cullen, 2001

26
The right measurement tool
  • "There is a shortage of validated measures
    available. Research could be undertaken to
    develop and validate measures to enable library
    and information professionals to evaluate the
    effects of their training more easily"
  • (Brettle, 2003)
  • "Further studies utilizing appropriate
    methodologies and validated research tools would
    enrich the evidence base."
  • (Koufogiannakis & Wiebe, 2006)
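
One routine step in developing a validated measure is an internal
consistency check, conventionally Cronbach's alpha; the sketch below is
my illustration of that check, not a method prescribed by either review:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a set of questionnaire responses.

    scores -- one row per respondent, each row a list of item scores.
    Alpha of roughly 0.7 or more is a common rule of thumb for
    acceptable internal consistency.
    """
    def variance(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(scores[0])                                  # number of items
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])  # variance of totals
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Made-up data: five respondents answering a four-item knowledge test.
print(cronbach_alpha([[1, 1, 0, 1], [1, 0, 0, 1], [0, 0, 0, 0],
                      [1, 1, 1, 1], [0, 1, 0, 0]]))
```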

27
The right measurement period
  • Straight after the training period? (captive
    audience!)
  • Three months, six months, nine months, twelve
    months?
  • Measuring knowledge or practice?
  • What about refreshment of skills?

28
The training half-life
  • "the time taken for half of knowledge acquired
    through training to undergo decay"
  • "... at the very moment when a learning period is
    finished, the brain has not had enough time to
    integrate the new information it has
    assimilated... It needs a few minutes to complete
    and link firmly all the interconnections within
    the new material to let it 'sink in'."
  • "The decline that takes place after the small rise
    is a steep one: within 24 hours of a one-hour
    learning period at least 80 per cent of detailed
    information is lost."
  • (Buzan, 2005)

29
An external example
  • Participants in sales training forget half of
    what they learn within five weeks, according to a
    survey of more than 6,000 sales professionals.
  • "Without regular reinforcement, sales training's
    half life is a median of just 5.1 weeks, which is
    even shorter than we suspected. Indeed, for 44%
    of participants in the study the half life is
    less than a month." (American Salesman, January
    2004)
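
Taking the half-life metaphor literally, retention after t weeks of
simple exponential decay is 0.5^(t/h) for half-life h. A sketch using
the 5.1-week median quoted above (the exponential form is my
assumption; the survey reports only the half-life itself):

```python
def retention(weeks_elapsed, half_life_weeks=5.1):
    """Fraction of newly trained knowledge retained after a given time,
    assuming simple exponential decay with the stated half-life."""
    return 0.5 ** (weeks_elapsed / half_life_weeks)

for t in (1, 5.1, 12, 26):
    print(f"after {t:>4} weeks: {retention(t):.0%} retained")
```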

30
Discussion Point
  • What strategies could we put in place to aid
    recall and to counteract the learning half-life?

31
In Summary
  • Given the above discussion... is information
    literacy skills training "like a dog's walking on
    his hind legs. It is not done well; but you are
    surprised to find it done at all"?

32
A role for Evidence Based Library and Information
Practice?
  • "...seeks to improve library and information
    services and practice by bringing together the
    best available evidence and insights derived from
    working experience, moderated by user needs and
    preferences. EBLIP involves asking answerable
    questions, finding, critically appraising and
    then utilising research evidence from relevant
    disciplines in daily practice. It thus attempts
    to integrate user-reported, practitioner-observed
    and research-derived evidence as an explicit
    basis for decision-making."
  • Source: Booth, 2006

33
The Way Forward: We need better evidence
  • Comparative
  • Prospective/Longitudinal
  • Clearly Described Intervention
  • Specific and Measured Outcome
  • In a relevant/comparable Study Population

34
The Way Forward: We need to be more reflective
  • "Evidence based practice is about best practice
    and reflective practice, where the process of
    planning, action, feedback and reflection
    contributes to the cyclic process of purposeful
    decision making and action, and renewal and
    development." (Todd, 2003)
  • The stimulus for reflective practice can be
    research, user views, practitioner observation,
    benchmarking, performance measurement, et cetera
  • But it must make a difference

35
Conclusions
  • Simply believing that we are "doing a good job"
    is not sufficient
  • Need to start with a very clear idea of what we
    want to achieve (learning objectives)
  • Need to measure what we have achieved
  • Need to use reliable instruments
  • Above all, need to continually reflect on what we
    are doing and why, before Allan Sack-arine does!

36
Ones to Watch!
  • Evidence based practice in information literacy:
    ANZIIL Research Working Group (Australian and New
    Zealand Institute for Information Literacy).
    Forthcoming, 2006
  • Evidence Based Library and Information Practice
    (open access journal):
    ejournals.library.ualberta.ca/index.php/EBLIP
  • 4th International EBLIP Conference, May 6-11 2007:
    www.eblip4.unc.edu

37
References - 1
  • Booth, A. (2006). Counting what counts:
    performance measurement and evidence-based
    practice. Performance Measurement and Metrics,
    7(2), 63-74.
  • Brettle, A. (2003). Information skills training: a
    systematic review of the literature. Health
    Information and Libraries Journal, 20 (Suppl 1),
    3-9.
  • Brophy, P. (2005). The development of a model for
    assessing the level of impact of information and
    library services. Library and Information
    Research, 29(93), Winter, 43-49.
  • Cullen, R. (2001). Setting standards for library
    and information service outcomes, and service
    quality. 4th International Conference on
    Performance Measurement.
  • Koufogiannakis, D. and Wiebe, N. (2006). Effective
    Methods for Teaching Information Literacy Skills
    to Undergraduate Students: A Systematic Review
    and Meta-Analysis. Evidence Based Library and
    Information Practice, 1(3), 3-43.