Heuristic Evaluation

1
Heuristic Evaluation
213 User Interface Design and Development
  • Professor Tapan Parikh (parikh@berkeley.edu)
  • TA Eun Kyoung Choe (eunky@ischool.berkeley.edu)
  • Lecture 7 - February 14th, 2008

2
Today's Outline
  • Nielsen's Heuristics
  • Heuristic Evaluation
  • Severity and Fixability
  • Example

3
Heuristic Evaluation
  • A cheap and effective way to find usability
    problems
  • A small set of expert evaluators examine the
    interface and judge its compliance with
    recognized usability principles
  • Discount usability testing - find problems
    earlier and relatively cheaply, without involving
    real users

4
What Heuristics?
  • Your readings provide a number of high-level and
    low-level design guidelines
  • Donald Norman, Design of Everyday Things
  • Jeff Johnson, GUI Bloopers
  • Jakob Nielsen, Usability Engineering
  • Other heuristics can be provided by your own
    intuition and common sense
  • We will focus on Nielsen's list from Usability
    Engineering

5
Nielsen's Heuristics
  • Simple and Natural Dialog
  • Speak the User's Language
  • Minimize User Memory Load
  • Consistency
  • Feedback
  • Clearly Marked Exits
  • Shortcuts
  • Good Error Messages
  • Prevent Errors
  • Help and Documentation

6
Simple and Natural Dialog
  • Match the users task
  • Minimize navigation
  • Present exactly the information the user needs,
    when she needs it
  • Use good graphic design
  • Less is more

7
Adapted from Saul Greenberg
8
Speak the User's Language
  • Use the same terms the user would
  • Avoid unusual word meanings
  • Support synonyms and aliases
  • Don't impose naming conventions
  • Understand users and how they view their domain

9
Adapted from Jake Wobbrock
10
Minimize User Memory Load
  • Recognize rather than Recall
  • Edit rather than Enter
  • Choose rather than Input
  • Provide a small number of basic commands

11
Adapted from Saul Greenberg
12
Adapted from Saul Greenberg
13
Consistency
  • Ensure that the same action always has the same
    effect (or, avoid modes)
  • Present the same information in the same location
  • Follow established standards and conventions

14
Adapted from Saul Greenberg
15
Provide Feedback
  • Continuously inform the user about what is going
    on
  • Restate and rephrase user input
  • Provide warnings for irreversible actions
  • Give informative feedback even if the system fails

16
What mode am I in now?
What did I select?
How is the system interpreting my actions?
Adapted from Saul Greenberg
17
Response Times
  • 0.1 second - perceived as instantaneous
  • 1 second - user's flow of thought stays
    uninterrupted, but the delay is noticed
  • 10 seconds - limit for keeping the user's
    attention focused on the dialog
  • >10 seconds - the user will want to perform other
    tasks while waiting

18
Waiting
  • Provide a progress indicator for any operation
    longer than ten seconds
  • Reassure the user that the system hasn't crashed
  • Indicate how long the user has to wait
  • Provide something to look at
  • If you can't provide specific progress, use a
    generic "working" indicator
  • e.g., the spinning ball in Mac OS X
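The response-time thresholds from the previous slide can be encoded directly as a policy for choosing feedback. A minimal sketch; the function name and return labels are illustrative, not a real toolkit API:

```python
def feedback_for(estimated_seconds):
    """Pick a feedback style for an operation, using Nielsen's
    response-time thresholds (0.1 s, 1 s, 10 s)."""
    if estimated_seconds <= 0.1:
        return "none"            # perceived as instantaneous
    if estimated_seconds <= 1.0:
        return "cursor change"   # delay noticed, flow of thought intact
    if estimated_seconds <= 10.0:
        return "busy indicator"  # generic "working" indicator, e.g. a spinner
    return "progress bar"        # show progress and estimated time remaining
```

The key design point is that the progress bar (specific progress) is reserved for the long operations where the user needs reassurance and an estimate of the wait.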

19
Adapted from Saul Greenberg
20
Clearly Marked Exits
  • Don't trap the user
  • Provide an easy way out of trouble
  • Encourage exploratory learning
  • Mechanisms
  • Cancel
  • Undo, Revert, Back
  • Interrupt
  • Exit
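The Undo/Redo mechanism above is commonly implemented as a pair of stacks, where each action records how to reverse itself. A minimal sketch under that assumption, not any particular toolkit's API:

```python
class UndoStack:
    """Minimal undo/redo support: each action carries its own inverse."""

    def __init__(self):
        self._undo, self._redo = [], []

    def do(self, action, inverse):
        action()
        self._undo.append((action, inverse))
        self._redo.clear()  # a new action invalidates the redo history

    def undo(self):
        if self._undo:
            action, inverse = self._undo.pop()
            inverse()
            self._redo.append((action, inverse))

    def redo(self):
        if self._redo:
            action, inverse = self._redo.pop()
            action()
            self._undo.append((action, inverse))
```

Because every action is reversible, users can explore the interface freely, which is exactly what "encourage exploratory learning" asks for.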

21
Adapted from Saul Greenberg
22
Adapted from Jake Wobbrock
23
Shortcuts
  • Allow expert users to go fast
  • Avoid GUI operations
  • Mechanisms
  • Keyboard shortcuts
  • Macros, scripts
  • Type ahead
  • Bookmarks, History
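One common way to support keyboard shortcuts is to route menu items and key chords to the same command objects, so the fast path and the slow path cannot drift apart. A sketch with hypothetical command names:

```python
class CommandRegistry:
    """Route menu picks and keyboard shortcuts to the same commands,
    so expert users can bypass GUI navigation."""

    def __init__(self):
        self._commands = {}   # command name -> callable
        self._shortcuts = {}  # key chord -> command name

    def register(self, name, fn, shortcut=None):
        self._commands[name] = fn
        if shortcut is not None:
            self._shortcuts[shortcut] = name

    def invoke(self, name):
        """Slow path: the user picks the command from a menu."""
        return self._commands[name]()

    def on_key(self, chord):
        """Fast path: a key chord runs the same code as the menu item."""
        name = self._shortcuts.get(chord)
        return self.invoke(name) if name else None
```

Macros and scripts fall out of the same design: a macro is just a recorded sequence of command names to replay.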

24
Keyboard accelerators for menus
Customizable toolbars and palettes for frequent
actions
Split menu, with recently used fonts on top
Double-click raises toolbar dialog box
Double-click raises object-specific menu
Scrolling controls for page-sized increments
Adapted from Saul Greenberg
25
Good Error Messages
  • Phrased in clear language
  • Avoid obscure codes
  • Precisely indicate the problem
  • Restate user input
  • Do not blame the user
  • Constructively suggest a solution
  • Opportunity to help user in time of need
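The guidelines above can be followed mechanically when constructing messages. A sketch with hypothetical reason codes and wording; the point is that each message restates the input, names the problem precisely, and suggests a fix:

```python
PROBLEMS = {
    "not_found": "could not be found",
    "no_permission": "cannot be opened because you lack permission",
}
SUGGESTIONS = {
    "not_found": "Check the spelling, or browse to locate the file.",
    "no_permission": "Ask the file's owner for read access.",
}

def file_error_message(filename, reason):
    """Build a message that restates the user's input, states the
    problem precisely, suggests a fix, and avoids codes and blame."""
    return f'The file "{filename}" {PROBLEMS[reason]}. {SUGGESTIONS[reason]}'
```

Contrast with the classic blooper: "Error 0x80004005: operation failed", which names no file, no cause, and no remedy.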

26
Adapted from Jake Wobbrock
27
Adapted from Jake Wobbrock
28
Prevent Errors
  • Bounds-checking
  • Select rather than Enter
  • Judicious use of confirmation screens
  • Avoid modes, unless they are clearly visible or
    require action to maintain
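Bounds-checking means validating input at the point of entry rather than letting a bad value propagate. A minimal sketch (the month example is an assumption, not from the lecture); note that "Select rather than Enter" would prevent this error entirely by offering a month picker:

```python
def parse_month(text):
    """Bounds-check a typed month instead of failing later.
    Returns (month, error); error is None on success."""
    try:
        month = int(text)
    except ValueError:
        return None, "Please enter a number from 1 to 12."
    if not 1 <= month <= 12:
        return None, f"{month} is outside the range 1 to 12."
    return month, None
```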

29
Adapted from Saul Greenberg
30
Adapted from Saul Greenberg
31
Adapted from Jake Wobbrock
32
Help and Documentation
  • Easy to search
  • Task-oriented
  • List concrete steps
  • Provide context-specific help
  • Shouldn't be too large
  • Is not a substitute for good design

33
Kinds of Help
  • Tour / Demo
  • Tutorials
  • User Guide / Reference manual
  • Searchable index
  • Tooltips, Balloon Help
  • Reference cards
  • Keyboard templates

Adapted from Saul Greenberg
34
Conducting a Heuristic Evaluation
  • Can use hi-fi or lo-fi prototype
  • Each session should last 1-2 hours
  • Evaluator should go through the interface several
    times, with specific tasks in mind
  • First pass: overall feel and scope; identify
    obvious violations
  • Second pass: focus on specific elements

35
Conducting a Heuristic Evaluation
  • 3-5 evaluators are enough to uncover most
    important problems
  • Each evaluator should inspect the interface alone
    (to reduce bias)
  • After the session, the evaluators aggregate
    observations
  • Output is a list of usability problems

36
Conducting a Heuristic Evaluation
  • If the system is intended to be "walk up and
    use," then evaluators should be provided with
    minimal help
  • If the system requires training, then evaluators
    should be trained and given an example scenario
  • Evaluators can be helped after they have made an
    attempt and articulated their difficulties

37
Steps in Heuristic Evaluation
  • Pre-evaluation training
  • Evaluation
  • Severity / Fixability rating
  • Debriefing

38
Severity Ratings
  • Provided by each evaluator
  • Based on frequency, impact, persistence
  • Combined into a single numeric index
  • Average taken across evaluators
  • Allows for prioritization of fixes

39
Severity Ratings
  • 0 - don't agree that this is a problem
  • 1 - cosmetic problem
  • 2 - minor problem
  • 3 - major problem; important to fix
  • 4 - catastrophe; imperative to fix
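Combining the per-evaluator ratings into a single index is a simple average, as described two slides back. A sketch with hypothetical problems and ratings:

```python
from statistics import mean

# Hypothetical ratings (0-4) from three evaluators for two problems.
ratings = {
    "unlabeled toolbar icon": [2, 3, 3],
    "no confirmation before delete": [4, 3, 4],
}

# Average each problem's ratings into one severity index, then rank
# so the most severe problems are addressed first.
severity = {problem: mean(rs) for problem, rs in ratings.items()}
ranked = sorted(severity, key=severity.get, reverse=True)
```

The averaging matters because individual evaluators disagree; Nielsen's data suggests a single rating is much less reliable than the mean of three or more.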

40
Debriefing
  • Conducted with evaluators, observers, and
    development team members
  • Discuss general characteristics of UI
  • Suggest improvements to address major usability
    problems
  • Dev team provides fixability ratings
  • Make it a brainstorming session

Adapted from Saul Greenberg
41
Fixability
  • Describes how easy each problem would be to fix
  • Requires some technical knowledge of the system
  • With severity, allows for estimating
    cost-benefit
  • Evaluators also provide possible fix as guidance
    to development team

42
Fixability
  • 0 - Impossible to Fix
  • 1 - Nearly Impossible to Fix
  • 2 - Difficult to Fix
  • 3 - Easy to Fix
  • 4 - Trivial to Fix
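Severity and fixability together support the cost-benefit estimate mentioned on the previous slide. One simple way to sketch the ranking (the product scoring rule and the issue list are assumptions for illustration, not from the lecture):

```python
# Hypothetical problems as (description, severity 0-4, fixability 0-4).
issues = [
    ("confusing menu label", 2, 4),       # minor, but trivial to fix
    ("no exit from setup wizard", 3, 1),  # major, but nearly impossible to fix
    ("data lost on Back button", 4, 3),   # catastrophic and easy to fix
]

def priority(issue):
    """Crude cost-benefit score: severe, easy-to-fix problems rank highest."""
    _, severity, fixability = issue
    return severity * fixability

ordered = sorted(issues, key=priority, reverse=True)
```

A dev team working down this list fixes the catastrophic-but-easy problem first, while the major-but-nearly-impossible one drops to the bottom despite its severity.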

43
Output
  • A list of problems with heuristics, severity,
    fixability and possible fixes

Adapted from Jake Wobbrock
44
For Next Time
  • Start working on Assignment 2!
  • Reading about Usability Testing