Transcript and Presenter's Notes

Title: Measuring the Impact of Electronic Resources:


1
  • Measuring the Impact of Electronic Resources
  • developing simple tools
  • Jill Beard, Jacqueline Chelin, Rachel Geeson,
    Jonathan Hutchins, Dianne Nelson, Jane Redman,
    Pauline Shaw, Chris Spencer
  • Bournemouth University
  • University of the West of England, Bristol
  • 6th Northumbria International Conference on
    Performance Measures in Libraries and Information
    Services
  • 22 - 25 August 2005, Durham

2
  • Introduction
  • LIRG/SCONUL Impact Implementation Initiative
  • Why collaborate?
  • common theme
  • similar challenges
  • compare and contrast different populations / use
    of tools
  • Both differences and similarities were spurs to
    collaboration
  • Impact
  • generally something has an impact when it
    results in a change of some sort
  • "any effect of a service (or other event) on
    an individual or group" (Brophy 2002)
  • Why action research?
  • "a multidisciplinary, experimental research
    method that relates knowledge to practice. It
    involves collaborative partnerships and is
    participative and grounded with practical
    concerns" (Reason & Bradbury 2001)
  • a basis for planning action, initiating change,
    monitoring results and repeating the cycle

3
  • Project Plans

4
  • Methodologies 1
  • Online questionnaire surveys

5
  • Methodologies 2 & 3
  • Semi-structured interviews
  • Participant observation (UWE only)
  • 13 researchers from 6 faculties
  • brief factual questionnaire
  • observation of a set of tasks
  • training from library staff on basis of observed
    behaviour
  • short follow-up interview 1 month later

6
  • Results 1
  • Response rates to questionnaire surveys
  • BU
  • 251 responses from students
  • 53 responses from academic staff
  • UWE
  • 217 responses from students
  • Effect of BU incentive?

7
  • Results 2
  • Awareness and range of e-resources accessed
    (students)
  • UWE: 30 e-resources identified as used by students
  • Student awareness derived from Faculty sources
    in the ratio of 65:35 compared to Library sources
  • BU: 22 e-resources selected as most recently
    accessed from a list of 27
  • The 5 e-resources not recently accessed are all
    listed on the Library's IHCS subject webpages
  • 4 of the e-resources recently accessed are
    neither identified nor recommended by Library
    staff
  • Student behaviour in e-resource choice is only
    partly influenced by librarians' teaching and
    recommendations. Academic staff are key

8
  • Results 3
  • Awareness and range of e-resources accessed
    (staff/researchers)
  • UWE: All had used some library databases
  • c.20 e-resources identified as regularly used
  • Most accessed via library web-pages
  • Viewed as essential/uniquely valuable
    (researchers)
  • BU: All used some library e-resources
  • Little distinction made between subscribed
    e-resources and free, or those sourced from
    elsewhere
  • Promotion of e-resources
  • All staff at both institutions promoted
    e-resources to their students

9
  • Results 4
  • Increased use of and remote access to e-resources
  • Increase in use of e-resources from previous year
  • UWE: 72% of students (48% increased, 24%
    greatly increased)
  • BU: 66.5% of students and 66% of staff
  • Remote access (students)
  • UWE: 63%
  • BU: 72.5%

10
  • Results 5
  • Integration of EIS into VLEs/the curriculum
  • UWE
  • 7 of 9 academic staff were working to integrate
    EIS into the curriculum
  • Pointers to relevant EIS in assignment briefs
  • Requiring EIS to be used when completing
    assignments
  • Including references to EIS in module handbooks
  • 7 of 9 academic staff were using or intending to
    use the VLE to link to EIS
  • BU
  • Staff routinely recommended EIS and included them
    in reading lists
  • Course information on VLE is linked to online
    reading lists

11
  • Results 6
  • Expressed satisfaction with e-resources (BU
    students)
  • 115 free-text responses on e-resources in general
  • Can be grouped into three broad areas
  • Access and searching (37)
  • Time-consuming/difficult/complicated/retrieving
    irrelevant records
  • Full-text availability (29)
  • A&I databases not linking to full text; need for
    more full-text journals
  • Praise (useful/excellent/indispensable) (17)
  • Many responses expressed both positive and
    negative views
  • + desktop availability of full-text articles
  • - access and searching as difficult/"impossibly
    complicated" etc.
  • Library staff need to work both with students
    to identify and clarify issues of access and
    searching, and with e-providers to simplify and
    clarify access issues where possible

12
  • Discussion 1
  • Methodologies: Online Questionnaires
  • Simple to administer and analyse
  • Low in cost
  • Accessible by many potential respondents both on
    and off campus
  • Self-selecting, attracting the more IT-literate!
  • 96% of UWE respondents claimed good/very good IT
    skills
  • 9.2% of BU students claimed no confidence in
    using e-resources
  • Both institutions plan to use similar surveys
    again as part of regular service evaluation

13
  • Discussion 2
  • Methodologies: Interviews/Participant
    Observation
  • Greater range/depth of response than possible
    with questionnaires
  • Misunderstandings can be clarified by interviewer
  • Greatly enhanced relationships with users
  • Deeper understanding of academic processes by
    library staff
  • Opportunity for promotion of specific aspects of
    library service
  • PR value of profile-raising of library in general
  • Time consuming both to conduct and analyse
  • Small population sample
  • Self-selecting!
  • Interviewer bias/comprehension

14

  • Discussion 3
  • Evidence of impact
  • Limited as yet because change can only be
    measured over time; this was the first year of
    study for both institutions
  • Baseline data, both quantitative and qualitative
  • Triangulation needed to establish impact
  • Self-selected response to questionnaires,
    interviews and observational study
  • Growing importance for both staff and students
  • Similarity between results for BU and UWE
    indicates that some consistent measure of impact
    was achieved
  • Future
  • Continue to monitor e-resource usage through
    Athens logins etc.
  • Use online questionnaires to measure change in
    perceptions and use
  • Use interviews/participant observation techniques
    (UWE)
  • Recognise centrality of academic staff as key to
    influencing student behaviour
  • Review and develop teaching and training
    resources in line with student comments