1
Evaluating Interfaces
  • Goals of evaluation
  • Lab versus field-based evaluation
  • Evaluation methods
  • Design-oriented
  • Implementation-oriented

2
What are the goals of evaluation?
  • To ensure that the interface behaves as we expect
    and meets user needs
  • Assess the extent of its functionality
  • Assess its impact on the user
  • Identify specific problems
  • Assess the usability of the interface

3
Laboratory studies versus field studies
  • Laboratory studies
  • The user comes to the evaluator
  • Well-equipped laboratory may contain
    sophisticated recording facilities, two-way
    mirrors, instrumented computers etc.
  • Can control or deliberately manipulate the
    context of use
  • The only option for some dangerous or extreme
    interfaces
  • But cannot reproduce the natural working context
    of a user's environment, especially social
    interaction and contingencies
  • Difficult to evaluate long-term use

4
  • Field studies
  • The evaluator goes to the user
  • Captures actual context
  • Captures real working practice and social
    interaction
  • Not possible for some applications
  • It can also be difficult to capture data
  • Lack of control makes it difficult to test
    specific hypotheses

5
Different kinds of evaluation are appropriate at
different stages of design
  • Early on, formative evaluation of the design
    may involve only designers and other experts
  • Later on, evaluation of the implementation is
    detailed, rigorous and involves end-users
6
Evaluation Methods
  • Design-oriented evaluation methods
  • Cognitive walkthrough
  • Heuristic/expert inspections
  • Theory and literature review
  • Implementation-oriented methods
  • Observation
  • Controlled experiments
  • Query techniques: interviews and surveys

7
Cognitive Walkthrough
  • A predictive technique in which designers and
    possibly experts simulate the user's
    problem-solving process at each step of the
    human-computer dialogue
  • Originated in code walkthrough from software
    engineering
  • Used mainly to consider ease-of-learning issues,
    especially how users might learn by exploring
    the interface

8
Cognitive Walkthrough The Stages
  • Begins with
  • A detailed description of the prototype (e.g.,
    menu layouts)
  • Description of typical tasks the user will
    perform
  • A written list of the actions required to
    complete the tasks with the prototype
  • An indication of who the users are and what kind
    of experience and knowledge they may have

9
  • For each task, evaluators step through the
    necessary action sequences, imagining that they
    are a new user and asking the following
    questions
  • Will the user know what to do next?
  • Can the user see how to do it?
  • Will they know that they have done the right
    thing?
  • It is vital to document the walkthrough
  • Who did what and when
  • Problems that arose and severity ratings
  • Possible solutions
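
A minimal sketch of one way to record these findings, assuming a
simple 1-4 severity scale; the field names and the example values
(taken from the photocopier walkthrough that follows) are
illustrative, not part of any standard walkthrough tool.

    # Hypothetical record structure for one walkthrough step
    from dataclasses import dataclass, field

    @dataclass
    class WalkthroughStep:
        action: str              # the action being evaluated
        knows_what_to_do: bool   # Will the user know what to do next?
        can_see_how: bool        # Can the user see how to do it?
        gets_feedback: bool      # Will they know they did the right thing?
        problems: list = field(default_factory=list)   # (description, severity)
        solutions: list = field(default_factory=list)

    step = WalkthroughStep(
        action="Turn on the power",
        knows_what_to_do=False, can_see_how=False, gets_feedback=False)
    step.problems.append(("No power indicator; unlabelled switch on the back", 4))
    step.solutions.append("Front-mounted, labelled switch plus a power light")
    print(f"{step.action}: {len(step.problems)} problem(s), "
          f"worst severity {max(s for _, s in step.problems)}")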

10
A short fragment of cognitive walkthrough
  • Evaluating the interface to a personal desktop
    photocopier
  • A design sketch shows a numeric keypad, a "Copy"
    button, and a push button on the back to turn on
    the power.
  • The specification says the machine automatically
    turns itself off after 5 minutes of inactivity.
  • The task is to copy a single page, and the user
    could be any office worker.
  • The actions the user needs to perform are to turn
    on the power, put the original on the machine,
    and press the "Copy" button
  • Now tell a believable story about the user's
    motivation and interaction at each action
  • From Philip Craiger's page at
    http://istsvr03.unomaha.edu/gui/cognitiv.htm


11
The user wants to make a copy and knows that the
machine has to be turned on. So they push the
power button. Then they go on to the next action.
But this story isn't very believable. We can
agree that the user's general knowledge of office
machines will make them think the machine needs
to be turned on, just as they will know it should
be plugged in. But why shouldn't they assume that
the machine is already on? The interface
description didn't specify a "power on"
indicator. And the user's background knowledge is
likely to suggest that the machine is normally
on, like it is in most offices.

Even if the user figures out that the machine is
off, can they find the power switch? It's on the
back, and if the machine is on the user's desk,
they can't see it without getting up. The switch
doesn't have any label, and it's not the kind of
switch that usually turns on office equipment (a
rocker switch is more common).

The conclusion of this single-action story leaves
something to be desired as well. Once the button
is pushed, how does the user know the machine is
on? Does a fan start up that they can hear? If
nothing happens, they may decide this isn't the
power switch and look for one somewhere else.
12
Heuristic/Expert Inspections
  • Experts assess the usability of an interface
    guided by usability principles and guidelines
    (heuristics)
  • Jakob Nielsen suggests that five experts may be
    enough to uncover 75% of usability problems (see
    the worked example after this list)
  • Best suited to early design and when there is
    some kind of representation of the system e.g.,
    storyboard
  • It's only as good as the experts, and you need
    experts in both the problem domain and usability
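
The 75% figure is usually traced to Nielsen and Landauer's
problem-discovery model, in which the proportion of problems found by
i independent evaluators is 1 - (1 - λ)^i, with λ the chance that one
evaluator spots a given problem. A minimal worked example; the value
λ = 0.24 is an illustrative assumption chosen to reproduce the
slide's figure, since λ varies from study to study.

    # Nielsen & Landauer's discovery model: proportion of usability
    # problems found by i independent evaluators.
    lam = 0.24  # illustrative per-evaluator detection rate (study-dependent)
    for i in range(1, 8):
        print(f"{i} evaluator(s): {1 - (1 - lam) ** i:.0%} of problems found")
    # -> 5 evaluator(s): 75% of problems found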

13
The Process of Heuristic Expert Inspections
  • Briefing session
  • Experts are all given an identical description of
    the product, its context of use and the goals of
    the evaluation
  • Evaluation period
  • Each expert spends several hours independently
    critiquing the interface
  • At least two passes through the interface, one
    for overall appreciation and others for detailed
    assessment
  • Debriefing session
  • Experts meet to compare findings, prioritise
    problems and propose solutions
  • They report/present their findings to decision
    makers and other stakeholders

14
Theory and literature review
  • We have seen before that you can apply existing
    theory to evaluate a design
  • The Keystroke-Level Model
  • Fitts' law (illustrated in the sketch after this
    list)
  • HCI and experimental psychology already contain a
    wealth of knowledge about how people interact
    with computers
  • Scour the literature (ACM Digital Library,
    Google, Citeseer and others)
  • But think carefully about whether the results
    transfer
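
As a concrete example, Fitts' law predicts the time to point at a
target of width W at distance D; in the widely used Shannon
formulation, MT = a + b log2(D/W + 1). A minimal sketch; the
constants a and b are device- and user-dependent, and the values
below are illustrative assumptions rather than measured parameters.

    # Fitts' law (Shannon formulation): predicted pointing time.
    from math import log2

    def fitts_mt(distance, width, a=0.1, b=0.15):
        """Movement time in seconds; a and b are illustrative constants."""
        return a + b * log2(distance / width + 1)

    # Doubling the target's width cuts the index of difficulty by ~1 bit:
    print(f"20 px target at 600 px: {fitts_mt(600, 20):.2f} s")  # ~0.84 s
    print(f"40 px target at 600 px: {fitts_mt(600, 40):.2f} s")  # ~0.70 s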

15
Observation
  • Observe users interacting with the interface in
    the laboratory or field
  • Record interactions using
  • Pen and paper
  • Audio
  • Video
  • Computer logging (see the sketch after this list)
  • User notebooks and diaries
  • Think-aloud techniques
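
A minimal sketch of what computer logging might look like:
timestamped interaction events appended to a plain-text log that can
later be synchronised with video or notes. The file name and event
vocabulary here are hypothetical.

    import json, time

    LOG_PATH = "session.log"  # hypothetical output file

    def log_event(event, **details):
        """Append one timestamped interaction event as a JSON line."""
        record = {"t": time.time(), "event": event, **details}
        with open(LOG_PATH, "a") as f:
            f.write(json.dumps(record) + "\n")

    log_event("session_start", participant="P01", setting="field")
    log_event("click", target="Copy button")
    log_event("task_complete", task="copy one page", duration_s=42.7)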

16
  • Analysis
  • Illustrative fragments
  • Detailed transcription and coding
  • Post-task walkthroughs
  • Specialised analysis software can replay video
    alongside system data and help the analyst
    synchronise notes and data

17
Savannah
  • An educational game for six players at a time
  • A virtual savannah is overlaid on an empty school
    playing field

18
Studying Savannah
  • Six trials over three days
  • Two video recordings from the field
  • Game replay interface

19
Impala Sequence
20
The Impala Sequence Revealed
  • Elsa suddenly stops
  • Circular formation
  • Counting aloud
  • Nala and Elsa cannot see the impala
  • Replay shows them stopped on edge of locale
  • GPS drift carries them over the boundary
  • The boy who passed through attacked first

21
Controlled Experiments
22
Query Techniques
  • Elicit the users' view of the system
  • Can address large numbers of users
  • Key techniques are
  • Interviews
  • Surveys
  • Relatively simple and cheap to administer
  • But not so detailed or good for exploring
    alternative designs

23
DECIDE: A Framework to Guide Evaluation (Preece
et al., 2001)
  • Determine the overall goals that the evaluation
    addresses
  • Explore the specific questions to be answered
  • Choose the evaluation approach and specific
    techniques to answer these questions
  • Identify the practical issues that must be
    addressed
  • Decide how to deal with ethical issues
  • Evaluate, interpret and present the data