Title: CS 430 / INFO 430 Information Retrieval
1. CS 430 / INFO 430 Information Retrieval
Lecture 23: Usability 2
2. Course Administration
- Wednesday, November 16: No discussion class
- Thursday, November 17: No lecture, no office hours
3. Course Administration
- Assignment 3: Grades were returned yesterday.
- If you have any questions, send email to cs430-l_at_lists.cs.cornell.edu.
- Replies may be delayed until next week.
4. Usability Factors in Searching
Example: Design an interface for a simple fielded search.
- Interface: fill-in boxes, text string, ...?
  - Presentation of results ...?
  - Manipulation of results ...?
- Functions: specify field(s), content, operators, ...?
  - Retain results for manipulation ...?
  - Query options ...?
- Data: metadata formats ...? Data structures and file structures ...?
- Systems: performance ...?
One possible representation of such a query is sketched below.
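To make the "Functions" item concrete, here is a minimal sketch in Python of one way a fielded query could be represented internally. The names (FieldClause, FieldedQuery) and the operator vocabulary are illustrative assumptions, not part of the lecture.

```python
# A minimal sketch (hypothetical names) of a fielded-search query:
# each clause names a field, an operator, and a value.
from dataclasses import dataclass, field
from typing import List

@dataclass
class FieldClause:
    field_name: str   # e.g. "author", "title", "date"
    operator: str     # e.g. "contains", "equals", "before"
    value: str

@dataclass
class FieldedQuery:
    clauses: List[FieldClause] = field(default_factory=list)
    combine: str = "AND"   # how clauses are combined: "AND" or "OR"

# Example: author contains "smith" AND date before "2005"
query = FieldedQuery([
    FieldClause("author", "contains", "smith"),
    FieldClause("date", "before", "2005"),
])
print(query.combine, [c.field_name for c in query.clauses])
```

A fill-in-boxes interface maps naturally onto this structure: one box per clause, with the field and operator chosen from drop-down lists.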
5. The Design/Evaluate Process
[Diagram: an iterative cycle. Start with Requirements (needs of users and other stakeholders), then Design (creative application of design principles), then Implementation (may be a prototype), then Evaluation, which feeds back into Requirements; release follows a satisfactory evaluation.]
6. Evaluation
- What is usability?
- Usability comprises the following aspects:
  - Effectiveness: the accuracy and completeness with which users achieve certain goals. Measures: quality of solution, error rates.
  - Efficiency: the relation between the effectiveness and the resources expended in achieving it. Measures: task completion time, learning time, number of clicks.
  - Satisfaction: the user's comfort with and positive attitudes towards the use of the system. Measures: attitude rating scales.
- From ISO 9241-11
A sketch of how these measures might be computed from test sessions follows.
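Here is a minimal sketch, with hypothetical session data, of how the three ISO 9241-11 aspects could be turned into numbers. The choice of completion rate, time, and a 1-5 rating scale is my own illustration of the measures named above.

```python
# A minimal sketch (hypothetical data) of the three usability measures.
from statistics import mean

# Session records: (task completed?, seconds taken, satisfaction rating 1-5)
sessions = [
    (True, 95.0, 4), (True, 130.0, 5), (False, 240.0, 2), (True, 110.0, 4),
]

# Effectiveness: accuracy/completeness of goal achievement (completion rate)
effectiveness = mean(1.0 if done else 0.0 for done, _, _ in sessions)

# Efficiency: effectiveness relative to resources expended (here, time)
mean_time = mean(t for _, t, _ in sessions)
efficiency = effectiveness / mean_time

# Satisfaction: mean attitude rating on the 1-5 scale
satisfaction = mean(r for _, _, r in sessions)

print(f"effectiveness={effectiveness:.2f}, "
      f"efficiency={efficiency:.4f}, satisfaction={satisfaction:.1f}")
```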
7. Evaluation
- The process of determining the worth of, or assigning a value to, the usability on the basis of careful examination and judgment.
- Making sure that a system is usable before launching it.
- Iterative improvements after launch.
- Categories of evaluation methods:
  - Analytical: evaluation without users
  - Empirical: evaluation with users
  - Measurements of operational systems
8. Evaluation without Users
- Assessing systems using established theories and methods
- Evaluation techniques:
  - Heuristic Evaluation (Nielsen, 1994): evaluate the design using rules of thumb
  - Cognitive Walkthrough (Wharton et al., 1994): a formalized way of imagining people's thoughts and actions when they use the interface for the first time
  - Claims Analysis: based on scenario-based analysis; generating positive and negative claims about the effects of features on the user
9. Evaluation with Users
- Testing the system, not the users!
- Stages of evaluation with users
- User testing is time-consuming and expensive.
10. Evaluation with Users: Preparation
- Determine goals of the usability testing
  - e.g., "The user can find the required information in no more than 2 minutes."
- Write the user tasks
  - e.g., "Answer the question: how hot is the sun?"
- Recruit participants
  - Use the descriptions of users from the requirements phase to identify potential users
11. Usability Laboratory
Concept: monitor users while they use the system.
[Diagram: evaluators observe the user through a one-way mirror.]
12. Evaluation with Users: Conducting the Session
- Conduct the session
  - Usability lab
  - Simulated working environment
- Observe the user
  - Human observer(s)
  - Video camera
  - Audio recording
- Collect satisfaction data
13. Evaluation with Users: Results Analysis
- If possible, use statistical summaries (a sketch follows this list)
- Pay close attention to areas where users:
  - were frustrated
  - took a long time
  - couldn't complete tasks
- Respect the data and users' responses; don't make excuses for designs that failed
- Note designs that worked and make sure they're incorporated in the final product
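As one way to apply statistical summaries, the sketch below (hypothetical data and thresholds) aggregates observations per task and flags the problem areas the list above calls out: low completion rates and long times.

```python
# A minimal sketch (hypothetical data) of per-task results analysis.
from collections import defaultdict
from statistics import mean, median

# Observations: (task id, completed?, seconds taken), one per participant
observations = [
    ("find-hours", True, 45), ("find-hours", True, 60),
    ("renew-book", False, 180), ("renew-book", True, 150),
]

by_task = defaultdict(list)
for task, done, secs in observations:
    by_task[task].append((done, secs))

for task, obs in by_task.items():
    rate = mean(1.0 if done else 0.0 for done, _ in obs)
    med = median(secs for _, secs in obs)
    # Thresholds are illustrative assumptions, not from the lecture.
    flag = "  <-- investigate" if rate < 0.8 or med > 120 else ""
    print(f"{task}: completion {rate:.0%}, median time {med:.0f}s{flag}")
```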
14. Measurements on Operational Systems
- Analysis of system logs (a sketch follows this list)
  - Which user interface options were used?
  - When was the help system used?
  - What errors occurred and how often?
  - Which hyperlinks were followed (click-through data)?
- Human feedback
  - Complaints and praise
  - Bug reports
  - Requests made to customer service
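The sketch below shows one way log analysis like this could be done. The tab-separated log format and the event names (help, error, click) are hypothetical assumptions for illustration.

```python
# A minimal sketch of log analysis over a hypothetical log format:
# timestamp <TAB> session <TAB> event <TAB> detail
from collections import Counter

log_lines = [
    "2005-11-14T10:02\ts1\tclick\t1",
    "2005-11-14T10:03\ts1\thelp\tquery-syntax",
    "2005-11-14T10:04\ts2\terror\ttimeout",
    "2005-11-14T10:05\ts2\tclick\t3",
]

help_uses, errors, clicks_by_rank = 0, Counter(), Counter()
for line in log_lines:
    _, _, event, detail = line.split("\t")
    if event == "help":
        help_uses += 1                    # when/how often help was used
    elif event == "error":
        errors[detail] += 1               # which errors occurred, how often
    elif event == "click":
        clicks_by_rank[int(detail)] += 1  # click-through by rank of hit

print("help used:", help_uses)
print("errors:", errors.most_common())
print("clicks by rank:", sorted(clicks_by_rank.items()))
```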
15. The Search Explorer Application: Reconstructing a User Session
16. Refining the Design Based on Evaluation
- Designers and evaluators need to work as a team.
- Designers are poor evaluators of their own work, but know the requirements, constraints, and context of the design.
- Some user problems can be addressed with small changes; some require major changes.
- Some user requests (e.g., lots of options) are incompatible with other requests (e.g., simplicity).
- Do not allow evaluators to become designers.
17. Usability Experiment: Ordering of Results
The order in which the hits are presented to the user:
- Ranked by similarity of match (e.g., term weighting)
- Sorted by a specified field (e.g., date)
- Ranked by importance of document as calculated by some algorithm (e.g., Google PageRank)
- Duplicates shown separately or merged into a single record
- Filters and other user options
What impact do these choices have on usability? (The sketch below contrasts three of these orderings.)
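The following sketch, with hypothetical result records, makes the point concrete: the same hit set yields three different presentation orders depending on the sort key.

```python
# A minimal sketch (hypothetical records) of alternative result orderings.
results = [
    {"title": "A", "similarity": 0.91, "date": "2004-03-01", "importance": 0.2},
    {"title": "B", "similarity": 0.75, "date": "2005-06-12", "importance": 0.9},
    {"title": "C", "similarity": 0.88, "date": "2003-11-30", "importance": 0.5},
]

by_similarity = sorted(results, key=lambda r: r["similarity"], reverse=True)
by_date       = sorted(results, key=lambda r: r["date"], reverse=True)
by_importance = sorted(results, key=lambda r: r["importance"], reverse=True)

# Same hits, three different orders: A C B / B A C / B C A
for label, order in [("similarity", by_similarity),
                     ("date", by_date),
                     ("importance", by_importance)]:
    print(label, [r["title"] for r in order])
```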
18. Experiment on the Google Interface
Methodology:
- 10 information-seeking tasks in 2 categories
- Users randomized across tasks
- Click-through data to see what the user did
- Eye-tracking data to see what the user viewed
- Google results presented with the top ten ranks reversed (see the sketch below)
An example of interdisciplinary information science research by Cornell's Human-Computer Interaction Group and Computer Science Department.
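The rank-reversal manipulation is simple to state in code; this is a sketch of the idea as described on the slide, not the experimenters' actual implementation.

```python
# A minimal sketch of the rank-reversal condition: the top ten hits are
# shown in reverse order, so displayed position i holds original rank 11-i.
def reverse_top_ten(hits):
    """Return hits with positions 1-10 reversed; later hits unchanged."""
    return list(reversed(hits[:10])) + hits[10:]

original = [f"hit-{rank}" for rank in range(1, 15)]
print(reverse_top_ten(original)[:3])   # ['hit-10', 'hit-9', 'hit-8']
```

Comparing click-through under the normal and reversed conditions separates the effect of presentation order from the effect of result quality.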
19. Evaluation Example: Eye Tracking
20. Evaluation Example: Eye Tracking
21. Google Evaluation: Click-Through Data
[Chart: number of users who clicked on the link, by rank of hit]
22. Google Evaluation: Eye-Tracking Data
[Chart: number of users who viewed the short record before first click, by rank of hit]
23. Google Evaluation: Eye-Tracking Data
Part of short record viewed before first click (% of users):
  Title      17.4
  Snippet    42.1
  Category    1.9
  URL        30.4
  Other       8.2   (includes cached, similar pages, description)
24. Google Experiment: Click-Through Data with Ranks Reversed
[Chart: percentage of users who clicked on the link, by rank of hit]