1
CS430: Information Discovery
Lecture 18: Usability 3
2
Course Administration
3
Information Visualization
The human eye is excellent at identifying patterns in graphical data:
Trends in time-dependent data
Broad patterns in complex data
Anomalies in scientific data
Visualizing information spaces for browsing
4
Example: Tilebars
The figure represents a set of hits from a text search. Each large rectangle represents a document or section of text. Each row represents a search term or subquery. The density of each small square indicates the frequency with which a term appears in a section of a document.
(Hearst 1995)
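As a concrete illustration, here is a minimal Python sketch of the TileBars idea, using ASCII shading characters in place of the grey-scale squares. The function name and input format are assumptions made for this example, not part of Hearst's system.

SHADES = " .:#"   # blank = term absent, '#' = most frequent

def tilebar(term_counts, max_count=None):
    # term_counts: one row per query term; each row holds one count per
    # document segment.
    if max_count is None:
        max_count = max((c for row in term_counts for c in row), default=0) or 1
    lines = []
    for row in term_counts:
        cells = [SHADES[min(len(SHADES) - 1,
                            round(c / max_count * (len(SHADES) - 1)))]
                 for c in row]
        lines.append("".join(cells))
    return "\n".join(lines)

# One document split into 8 segments, queried with two terms.
print(tilebar([
    [0, 0, 3, 5, 1, 0, 0, 0],   # term 1: concentrated mid-document
    [2, 2, 2, 2, 2, 2, 2, 2],   # term 2: spread evenly
]))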
5
Pad
Concept: a large collection of information viewed at many different scales. Imagine a collection of documents spread out on an enormous wall.
Zoom: zoom out and see the whole collection with little detail; zoom in part way to see sections of the collection; zoom in to see every detail.
Semantic zooming: objects change appearance when they change size, so as to be most meaningful. (Compare maps.)
Performance: rendering operations are timed so that the frame refresh rate remains constant during pans and zooms.
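A minimal Python sketch of semantic zooming: the renderer swaps representations by scale rather than scaling one image. The Document class, render function, and scale thresholds are illustrative assumptions, not Pad's actual API.

from dataclasses import dataclass

@dataclass
class Document:
    title: str
    abstract: str
    body: str

def render(doc, scale):
    # Choose the most meaningful representation for the current scale.
    if scale < 0.1:      # zoomed far out: each document is just a dot
        return "."
    if scale < 0.5:      # part way in: title only
        return doc.title
    if scale < 1.0:      # closer: title plus abstract
        return doc.title + "\n" + doc.abstract
    return doc.title + "\n" + doc.abstract + "\n" + doc.body   # every detail

doc = Document("Usability 3", "Notes on evaluation.", "Full lecture text ...")
for s in (0.05, 0.3, 0.8, 2.0):
    print("scale", s, "->", repr(render(doc, s)))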
6
Pad File Browser
7
Pad File Browser
8
Pad File Browser
9
Collection Viewer
Visualization of NSDL collections using Inxight's Star Tree Viewer: http://labview.comm.nsdl.org/cgi-bin/wiki.pl?VisualizationConnect
10
D-Lib Working Group on Metrics
DARPA-funded attempt to develop a TREC-like approach to digital libraries (1997).
"This Working Group is aimed at developing a consensus on an appropriate set of metrics to evaluate and compare the effectiveness of digital libraries and component technologies in a distributed environment. Initial emphasis will be on (a) information discovery with a human in the loop, and (b) retrieval in a heterogeneous world."
Very little progress was made. See http://www.dlib.org/metrics/public/index.html
11
MIRA
Evaluation Frameworks for Interactive Multimedia Information Retrieval Applications
European study, 1996-99
Chair: Keith van Rijsbergen, Glasgow University
Expertise: multimedia information retrieval, information retrieval, human-computer interaction, case-based reasoning, natural language processing
12
MIRA Starting Point
Information Retrieval techniques are beginning to be used in complex goal- and task-oriented systems whose main objectives are not just the retrieval of information. New original research in Information Retrieval is being blocked or hampered by the lack of a broader framework for evaluation.
13
MIRA Aims
Bring the user back into the evaluation process
Understand the changing nature of Information Retrieval tasks and their evaluation
'Evaluate' traditional evaluation methodologies
Consider how evaluation can be prescriptive of Information Retrieval design
Move towards a balanced approach (system versus user)
Understand how interaction affects evaluation
Support the move from static to dynamic evaluation
Understand how new media affect evaluation
Make evaluation methods more practical for smaller groups
Spawn new projects to develop new evaluation frameworks
14
MIRA Approaches
Developing methods and tools for evaluating interactive Information Retrieval; possibly the most important activity of all
User tasks: studying real users and their overall goals
Improving user interfaces to widen the set of users
Developing a design for a multimedia test collection
Bringing together collaborative projects (TREC was organized as a competition)
Pooling tools and data
15
Evaluation of Usability
Observing users (user protocols)
Focus groups
Measurements: effectiveness in carrying out tasks; speed (see the sketch below)
Expert review
Competitive analysis
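As a minimal illustration of the two measurements named above, this Python sketch computes a task completion rate and a mean completion time. The (completed, seconds) record format and the data are assumptions made for the example.

# Each observed task attempt: (completed successfully?, seconds taken).
attempts = [(True, 41.0), (True, 87.5), (False, 120.0), (True, 55.2)]

times = [secs for ok, secs in attempts if ok]
effectiveness = len(times) / len(attempts)   # task completion rate
speed = sum(times) / len(times)              # mean time over successful attempts

print(f"effectiveness: {effectiveness:.0%}, mean time: {speed:.1f}s")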
16
Focus Group
A focus group is a group interview:
Interviewer
Potential users, typically 5 to 12, with similar characteristics (e.g., same viewpoint)
Structured set of questions
May show mock-ups
Group discussions
Repeated with contrasting user groups
17
Usability Laboratory
Concept: monitor users while they use the system. Evaluators watch the user through a one-way mirror.
18
Usability Laboratory
19
Usability Laboratory
Observing techniques: human observer, video camera, tape recording
Study techniques:
Human protocol (user talks aloud while using the system)
User carries out a specified list of tasks
Software designer presents story board (mock-up) to user
20
Eye Tracking
21
Eye Tracking
22
Measurement
Basic concept: log events in the users' interactions with a system.
Examples from a Web system:
Clicks (when, where on screen, etc.)
Navigation (from page to page)
Keystrokes (e.g., input typed on keyboard)
Use of help system
Errors
May be used for statistical analysis or for detailed tracking of an individual user.
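A minimal Python sketch of such event logging, assuming a web system that reports the event kinds listed above. The log_event function, field names, and file name are illustrative, not from any particular system.

import json
import time

def log_event(session_id, kind, **details):
    # Append one timestamped interaction event as a JSON line.
    record = {"session": session_id, "time": time.time(), "kind": kind, **details}
    with open("events.log", "a") as f:
        f.write(json.dumps(record) + "\n")

# One event of each kind listed above, for a single user session.
log_event("s42", "click", x=120, y=344, target="search button")
log_event("s42", "navigate", src="/results", dest="/doc/17")
log_event("s42", "keystrokes", field="query", text="information discovery")
log_event("s42", "help", topic="advanced search")
log_event("s42", "error", message="no results found")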
23
The Search Explorer Application: Reconstructing a User Session
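Following on from the logging sketch above, a sketch of how a session might be reconstructed: filter the log to one session identifier and order the events by timestamp. The file name and record fields match the illustrative logger above, not the actual Search Explorer internals.

import json

def load_session(path, session_id):
    # Collect the events of one session and put them in time order.
    with open(path) as f:
        events = [json.loads(line) for line in f]
    mine = [e for e in events if e["session"] == session_id]
    return sorted(mine, key=lambda e: e["time"])

for event in load_session("events.log", "s42"):
    print(event["kind"], {k: v for k, v in event.items()
                          if k not in ("session", "time", "kind")})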
24
The Importance of Design
Good support for users is more than a cosmetic flourish.
Elegant design, appropriate functionality, responsive system => a measurable difference to users' effectiveness.
A system that is hard to use => users may fail to find important results, or misinterpret what they do find => users may give up in disgust.
A computer system is only as good as the interface it provides to its users.