1
CS 430 / INFO 430 Information Retrieval
Lecture 23: Usability 2
2
Course Administration
Wednesday, November 16: no discussion class.
Thursday, November 17: no lecture, no office hours.
3
Course Administration
Assignment 3: grades were returned yesterday. If
you have any questions, send email to
cs430-l_at_lists.cs.cornell.edu. Replies may be
delayed until next week.
4
Usability Factors in Searching
Example: design an interface for a simple fielded search.
  • Interface: fill-in boxes, text string, ...
  • Presentation of results ...
  • Manipulation of results ...
  • Functions: specify field(s), content, operators, ...
  • Retain results for manipulation ...
  • Query options ...
  • Data: metadata formats ...
  • Data structures and file structures ...
  • Systems: performance ...
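To make the interface/functions split concrete, here is a minimal sketch, with hypothetical names not taken from the lecture, of how fill-in boxes might be combined into a fielded Boolean query string:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FieldClause:
    field: str              # e.g., "title", "author", "date"
    value: str              # the text string typed into the box
    operator: str = "AND"   # how this clause joins the previous one

def build_query(clauses: List[FieldClause]) -> str:
    """Render the filled-in boxes as one Boolean query string."""
    parts = []
    for i, c in enumerate(clauses):
        term = f'{c.field}:"{c.value}"'
        parts.append(term if i == 0 else f"{c.operator} {term}")
    return " ".join(parts)

# Two boxes filled in, joined with the default AND operator
print(build_query([FieldClause("title", "usability"),
                   FieldClause("author", "Nielsen")]))
# -> title:"usability" AND author:"Nielsen"
```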
5
The Design/Evaluate Process
[Diagram: the iterative design/evaluate cycle. Start → Requirements (needs of users and other stakeholders) → Design (creative application of design principles) → Implementation (may be prototype) → Evaluation → back to Requirements; release follows a successful evaluation.]
6
Evaluation
  • What is usability?
  • Usability comprises the following aspects (a minimal sketch of these measures follows the list):
  • Effectiveness: the accuracy and completeness with which users achieve certain goals. Measures: quality of solution, error rates.
  • Efficiency: the relation between effectiveness and the resources expended in achieving it. Measures: task completion time, learning time, number of clicks.
  • Satisfaction: the user's comfort with, and positive attitudes towards, the use of the system. Measures: attitude rating scales.
  • From ISO 9241-11
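As a minimal sketch, assuming invented session records rather than real data, the three ISO 9241-11 aspects might be computed like this:

```python
from statistics import mean

# Invented session records: (completed, errors, seconds, satisfaction 1-5)
sessions = [
    (True, 0, 95, 4), (True, 2, 140, 3), (False, 5, 180, 2), (True, 1, 110, 4),
]

completed = [s for s in sessions if s[0]]
effectiveness = len(completed) / len(sessions)    # completion rate
error_rate = mean(s[1] for s in sessions)         # errors per task
time_on_task = mean(s[2] for s in completed)      # efficiency: time to succeed
satisfaction = mean(s[3] for s in sessions)       # attitude rating scale

print(f"effectiveness {effectiveness:.0%}, {error_rate:.1f} errors/task, "
      f"{time_on_task:.0f}s per success, satisfaction {satisfaction:.1f}/5")
```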

7
Evaluation
  • The process of determining the worth of, or assigning a value to, a system's usability on the basis of careful examination and judgment.
  • Making sure that a system is usable before
    launching it.
  • Iterative improvements after launch.
  • Categories of evaluation methods:
  • Analytical: evaluation without users
  • Empirical: evaluation with users
  • Measurements of operational systems

8
Evaluation without Users
  • Assessing systems using established theories and
    methods
  • Evaluation techniques
  • Heuristic Evaluation (Nielsen, 1994)
  • Evaluate the design using rules of thumb
  • Cognitive Walkthrough (Wharton et al., 1994)
  • A formalized way of imagining people's thoughts and actions when they use the interface for the first time
  • Claims Analysis: scenario-based analysis
  • Generating positive and negative claims about the effects of features on the user

9
Evaluation with Users
  • Testing the system, not the users!
  • Stages of evaluation with users

User testing is time-consuming and expensive.
10
Evaluation with Users: Preparation
  • Determine the goals of the usability testing
  • e.g., "The user can find the required information in no more than 2 minutes"
  • Write the user tasks (a sketch of a task definition follows this list)
  • e.g., Answer the question: "How hot is the sun?"
  • Recruit participants
  • Use the descriptions of users from the requirements phase to identify potential users
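A minimal sketch, using the task wording from the slide and otherwise hypothetical field names, of how goals and tasks might be written down for a test plan:

```python
from dataclasses import dataclass

@dataclass
class UsabilityTask:
    prompt: str         # what the participant is asked to do
    success: str        # how the observer judges completion
    time_limit_s: int   # goal taken from the test plan

tasks = [
    UsabilityTask(prompt='Answer the question: "How hot is the sun?"',
                  success="participant states a temperature found via the system",
                  time_limit_s=120),  # "no more than 2 minutes"
]

for t in tasks:
    print(f"{t.prompt} (pass if done within {t.time_limit_s}s)")
```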

11
Usability Laboratory
Concept: monitor users while they use the system.
[Diagram: evaluators observe the user through a one-way mirror.]
12
Evaluation with Users: Conducting Sessions
  • Conduct the session
  • Usability Lab
  • Simulated working environment
  • Observe the user
  • Human observer(s)
  • Video camera
  • Audio recording
  • Collect satisfaction data

13
Evaluation with Users: Results Analysis
  • If possible, use statistical summaries (a minimal sketch follows this list)
  • Pay close attention to areas where users
  • were frustrated
  • took a long time
  • couldn't complete tasks
  • Respect the data and users' responses, don't make
    excuses for designs that failed
  • Note designs that worked and make sure they're
    incorporated in the final product
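A minimal sketch, over invented observations, of the kind of statistical summary that flags where users took a long time or could not complete tasks:

```python
from statistics import mean, median

# Invented observations: (participant, task, seconds, completed)
obs = [("p1", "t1", 85, True), ("p2", "t1", 200, False),
       ("p1", "t2", 60, True), ("p2", "t2", 75, True)]

by_task = {}
for _, task, secs, done in obs:
    by_task.setdefault(task, []).append((secs, done))

for task, rows in sorted(by_task.items()):
    times = [s for s, _ in rows]
    rate = sum(d for _, d in rows) / len(rows)
    flag = "  <- investigate" if rate < 1.0 else ""
    print(f"{task}: median {median(times)}s, mean {mean(times):.0f}s, "
          f"completion {rate:.0%}{flag}")
```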

14
Measurements on Operational Systems
  • Analysis of system logs (a minimal sketch follows this list)
  • Which user interface options were used?
  • When was the help system used?
  • What errors occurred and how often?
  • Which hyperlinks were followed (click-through data)?
  • Human feedback
  • Complaints and praise
  • Bug reports
  • Requests made to customer service
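A minimal sketch, over invented tab-separated log lines, of counting help usage, errors, and click-through events from a system log:

```python
from collections import Counter

# Invented log lines: timestamp, user, event, detail (tab-separated)
log = [
    "2005-11-14T10:02\tu1\tclick\t/results/3",
    "2005-11-14T10:03\tu1\thelp\tquery-syntax",
    "2005-11-14T10:04\tu2\terror\ttimeout",
    "2005-11-14T10:05\tu2\tclick\t/results/1",
]

rows = [line.split("\t") for line in log]
events = Counter(r[2] for r in rows)                      # help use, errors, ...
clicks = Counter(r[3] for r in rows if r[2] == "click")   # which links followed

print(events)   # e.g. Counter({'click': 2, 'help': 1, 'error': 1})
print(clicks)   # click-through data per hyperlink
```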

15
The Search Explorer Application: Reconstructing a User Session
16
Refining the Design Based on Evaluation
  • Designers and evaluators need to work as a team.
  • Designers are poor evaluators of their own work, but know the requirements, constraints, and context of the design.
  • Some user problems can be addressed with small changes; some require major changes.
  • Some user requests (e.g., lots of options) are incompatible with other requests (e.g., simplicity).
  • Do not allow evaluators to become designers.
17
Usability Experiment: Ordering of Results
The order in which the hits are presented to the user:
  • Ranked by similarity of match (e.g., term weighting)
  • Sorted by a specified field (e.g., date)
  • Ranked by importance of the document as calculated by some algorithm (e.g., Google PageRank)
  • Duplicates shown separately or merged into a single record
  • Filters and other user options
What impact do these choices have on usability?
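A minimal sketch, with invented hit records, showing the orderings listed above and a simple merge of duplicates (a content hash stands in for real duplicate detection):

```python
# Invented hit records; "hash" marks documents with duplicate content
hits = [
    {"id": "d1", "sim": 0.91, "date": "2004-06-01", "pagerank": 0.002, "hash": "a"},
    {"id": "d2", "sim": 0.87, "date": "2005-03-15", "pagerank": 0.010, "hash": "b"},
    {"id": "d3", "sim": 0.80, "date": "2003-01-20", "pagerank": 0.001, "hash": "b"},
]

by_similarity = sorted(hits, key=lambda h: h["sim"], reverse=True)
by_date       = sorted(hits, key=lambda h: h["date"], reverse=True)  # ISO dates sort
by_importance = sorted(hits, key=lambda h: h["pagerank"], reverse=True)

# Duplicates merged into a single record: keep the best hit per content hash
merged, seen = [], set()
for h in by_similarity:
    if h["hash"] not in seen:
        seen.add(h["hash"])
        merged.append(h)
print([h["id"] for h in merged])   # ['d1', 'd2']; d3 folded into d2
```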
18
Experiment on the Google Interface
Methodology:
  • 10 information-seeking tasks in 2 categories
  • Users randomized across tasks
  • Click-through data to see what the user did
  • Eye-tracking data to see what the user viewed
  • Google results presented with the top ten ranks reversed
An example of interdisciplinary information science research by Cornell's Human-Computer Interaction Group and Computer Science Department.
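One plausible way to produce the reversed condition (a sketch; the experiment's actual implementation is not described in the slides):

```python
def reverse_top_ranks(results, k=10):
    """Return the result list with its top-k ranks reversed."""
    return list(reversed(results[:k])) + results[k:]

normal = [f"hit{r}" for r in range(1, 13)]
print(reverse_top_ranks(normal))
# ['hit10', 'hit9', ..., 'hit1', 'hit11', 'hit12']
```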
19
Evaluation Example: Eye Tracking
20
Evaluation Example: Eye Tracking
21
Google Evaluation: Click-Through Data
[Chart: number of users who clicked on a link, by rank of hit.]
22
Google Evaluation: Eye-Tracking Data
[Chart: number of users who viewed the short record before the first click, by rank of hit.]
23
Google Evaluation: Eye-Tracking Data
Part of the short record viewed before the first click (% of users):
  • Title: 17.4
  • Snippet: 42.1
  • Category: 1.9
  • URL: 30.4
  • Other: 8.2 (includes cached, similar pages, description)
24
Google Experiment: Click-Through Data with Ranks Reversed
[Chart: percentage of users who clicked on a link, by rank of hit.]
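A minimal sketch, over an invented sample, of how the percentage of users clicking at each rank could be tabulated from logged first clicks:

```python
from collections import Counter

# Invented data: rank of the first link each user clicked
first_clicks = [1, 1, 2, 1, 3, 1, 2, 10, 1, 4]

counts = Counter(first_clicks)
for rank in sorted(counts):
    share = counts[rank] / len(first_clicks)
    print(f"rank {rank:2d}: {share:.0%} of users")
```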