Title: Usability Assessment of Academic Digital Libraries
- Judy Jeng
- Library Assessment Conference
- September 25-27, 2006
What is Usability?
- A multidimensional construct
- Interface effectiveness, usefulness, usableness, ease of use, fit for use, effectiveness, efficiency, satisfaction, learnability, memorability, error tolerance, understandability, appropriate level of interaction, control, helpfulness, adaptability, the quality of being engaging, and flexibility
- User focus
Techniques of Usability Evaluation
- Formal usability testing
- Usability inspection
- Heuristic evaluation
- Card sort
- Category membership expectation
- Cognitive walkthrough
- Claims analysis
- CASSM
- Focus groups
- Questionnaires
- Think aloud
- Analysis of site usage logs
- Paper prototyping
- Field study
What Has Been Studied?
- Design
- Structure
- Interface
- Navigation
- Functionality
- Utility
- Breadth of coverage
- Metadata appropriateness
- Awareness of library resources
- Terminology
Major Usability Problems in Digital Libraries
- User lostness
- A study of the ACM Digital Library, the Networked Computer Science Technical Reference Library, and the New Zealand Digital Library found that 73% of subjects experienced different degrees of lostness
- The Alexandria Digital Library also reported this problem
- Navigation disorientation is among the biggest frustrations for Web users
- Lack of benchmarks
- MIT Libraries had a 75% success rate. Is this high or low?
- The University of the Pacific and the University of Illinois at Chicago also report their performance data
- We need more performance data for comparison
- Ambiguity of terminology and the need for better labeling
- Library Web sites are designed from a librarian's perspective
- When users do not find something in the online catalog, they immediately conclude that the library does not own the item
Usability Evaluation Model
- [Diagram: the Usability Evaluation Model, linking Usability to Effectiveness, Efficiency, Satisfaction, and Learnability, with factors including Ease of Use, Organization of Information, Labeling, Visual Appearance, Content, and Error Correction]
Methods
- Formal usability testing
- Questionnaire
- Interview
- Think aloud
- Log analysis
Test Sites
- Rutgers University Libraries Web site
- Queens College Web site
- A cross-institutional usability study
Stages
- Stage 1 (Feb/Mar 2004): 11 subjects (5 from Rutgers, 6 from Queens)
- Stage 2 (September/October 2004): 30 subjects (15 from Rutgers, 15 from Queens)
Instruments
- 9 tasks, representative of typical uses of a library's Web site
- 3 questions to locate known items
- 4 to find articles
- 2 to locate information
Results
- Effectiveness (by percentage of correctness)
- Overall
- Breakdown by task
- Efficiency (time, step)
- Overall
- Breakdown by task
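As a minimal illustration of how these two measures can be computed from session records, the sketch below uses invented sample data, not the study's actual logs or analysis code:

```python
from statistics import mean

# Hypothetical per-task records: (task_id, answered_correctly, seconds, steps).
# All values are invented for illustration.
records = [
    (1, True, 95, 4),
    (1, False, 210, 9),
    (2, True, 60, 3),
    (2, True, 80, 5),
]

# Effectiveness: percentage of correct answers across all attempts.
effectiveness = 100 * sum(r[1] for r in records) / len(records)

# Efficiency: mean time (seconds) and mean number of steps per attempt.
mean_time = mean(r[2] for r in records)
mean_steps = mean(r[3] for r in records)
```

The same aggregation can be grouped by task id to produce the per-task breakdowns the slide mentions.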
Learnability
- Learnability is in some sense the most fundamental usability attribute. The system should be easy to learn so that the user can rapidly start getting some work done with it
- Rutgers subjects were asked to search the Queens site, and vice versa
Criteria of Learnability
- How soon participants can begin searching
- Correctness of answers
- How much time it takes
Satisfaction
- Likert scales and interviews provide ratings and comments
- Overall satisfaction
- Ease of Use
- Organization of Information
- Terminology
- Attractiveness
- Mistake Recovery
Users' Criteria of Ease of Use
- Easy to get around
- Can follow directions easily
- Easy navigation
- Clear description
- Intuitive
- User-friendly
Users' Criteria of Organization of Information
- Simple
- Straightforward
- Logical
- Easy to look up things
- Place common tasks upfront
Users' Criteria of Terminology
- Simple
- Straightforward
- Understandable
- Generic
- Label sections clearly
- No jargon
- Clear descriptions/explanations
- From the users' perspective
Users' Criteria of Visual Attractiveness
- Appropriate graphics
- Readability
- Appropriate color
- Not too complicated
- Appropriate size of font
Users' Criteria of Mistake Recovery
- Easy navigation
- Navigation bar
- Back button
- Online Help instruction
- The research instruments also solicit comments on the system's best features, worst features, and desired features
- Those comments are helpful for system improvement
User Lostness
- User lostness occurs when users cannot identify where they are, cannot return to previously visited information, cannot go to information believed to exist, and cannot remember the key points covered
- 46% of subjects felt lost in the Rutgers site
- 57% felt lost in the Queens site
- Reasons for lostness:
- Confusing structure of the site design
- Lack of a Back button
- Lack of an appropriate button to start over
- Difficulty of the particular task
- The participants' level of confidence
Effectiveness and Satisfaction
- The dependent variable was satisfaction (1 = high satisfaction, 5 = low satisfaction)
- The independent variable was correctness of answers (0 = wrong answer, 1 = correct)
- The ANOVA was significant
- This means that subjects feel less satisfied with the system when they fail to perform the task correctly
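An analysis of this shape can be sketched with a small, self-contained one-way ANOVA. The ratings below are invented, and this stdlib-only F computation merely stands in for whatever statistical package the study used:

```python
from statistics import mean

def one_way_anova_f(*groups):
    """Return the F statistic of a one-way ANOVA across the given groups."""
    grand = mean(x for g in groups for x in g)
    k = len(groups)                   # number of groups
    n = sum(len(g) for g in groups)   # total observations
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented ratings on the study's scale (1 = high satisfaction, 5 = low):
ratings_after_correct = [1, 2, 1, 2, 1]
ratings_after_wrong = [4, 5, 3, 4, 5]

f_stat = one_way_anova_f(ratings_after_correct, ratings_after_wrong)
# A large F (judged against the F distribution's critical value for
# k-1 and n-k degrees of freedom) indicates the group means differ.
```

With two groups, this F statistic is equivalent to the square of an independent-samples t statistic.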
Efficiency and Satisfaction
- The correlation between the number of steps and satisfaction was significant: the more steps used to answer a question, the lower the satisfaction
- The correlation between the time spent and satisfaction was significant: the more time spent answering a question, the lower the satisfaction
Effectiveness and Efficiency
- Effectiveness and steps
- The independent variable, effectiveness, has two levels: correct and incorrect answers
- The dependent variable was the number of steps
- The ANOVA was significant: incorrect answers involved more steps, while correct answers involved fewer
- This means that when subjects knew how to get the answer, it took them fewer steps; without that knowledge, they struggled
- Effectiveness and time
- The independent variable was effectiveness
- The dependent variable was time
- The ANOVA was significant
- Correct answers involved less time, while incorrect answers took longer
- Based on the results of those statistical analyses, the study found interlocking relationships among effectiveness, efficiency, and satisfaction, ranging from medium to strong
- Frøkjær, Hertzum, and Hornbæk (2000) found that effectiveness and efficiency are either not correlated or correlated so weakly that the correlation is negligible for all practical purposes
- Walker et al. (1998) found that user satisfaction is not determined by efficiency
- Although there are interlocking relationships among effectiveness, efficiency, and satisfaction, these three attributes should be measured separately; one cannot replace the other
Ease of Use and Ease of Learning
- A strong correlational relationship exists between ease of use and ease of learning
- Ease-of-use data are from ratings in the post-test questionnaire
- Ease-of-learning data are from the number of tasks completed correctly on a new site
- Subjects gave better ease-of-use ratings for the new site when they completed more tasks successfully
Navigation
- 71% said the Rutgers site was easy to navigate
- 56% said the Queens site was easy to navigate
- Links should be stable and self-explanatory
- The Queens site's drop-down menu was over-sensitive and disappeared when the mouse moved
- Users need an easy route back to the home page
- The navigation bar should be consistent across all pages
Click Cost
- Users are very reluctant to click unless they are fairly certain they will discover what they are looking for (McGillis and Toms, 2001)
- 73% of participants declared that they expect the click(s) to lead them eventually to the correct answer
Demographic Factors and Performance
- There is no statistically significant relationship between demographic factors (gender, age, status, major, ethnic background, years at the institution, and frequency of using the site) and effectiveness
- Likewise, there is no statistically significant relationship between these demographic factors and efficiency
Gender and Satisfaction
- There are no statistically significant
relationships between gender and the factors of
satisfaction (ease of use, organization of
information, terminology, visual attractiveness,
and mistake recovery)
Ethnic Background and Satisfaction
- Different ethnic groups probably hold different attitudes in their satisfaction ratings
- There are statistically significant relationships between ethnic background and the ratings of the Queens site's ease of use, organization, and visual attractiveness, but not for the Rutgers site
- Cultural usability is an interesting field to explore
Contributions
- The provision of an evaluation model
- The provision of performance data for comparison
- The demonstration of interlocking relationships among effectiveness, efficiency, and satisfaction
- The finding of a correlational relationship between ease of use and ease of learning
- The establishment of operational criteria and strategies to measure effectiveness, efficiency, satisfaction, and learnability
- The identification of the causes of user lostness
- The identification of factors that contribute to ease of navigation
- The confirmation of click cost
- The establishment that demographic factors (gender, age, status, academic major, ethnic background, years at the institution, and frequency of using the site) have no statistically significant effect on performance
- The indication that ethnic background may affect satisfaction ratings
- The identification of users' criteria for evaluating ease of use, organization, terminology, visual attractiveness, and mistake recovery
- A review of how usability has been and should be defined in the context of the digital library
- A review of the usability evaluation methods that have been applied in academic digital libraries