User Interface Design

Transcript and Presenter's Notes

1
User Interface Design Evaluation
Darn these hooves! I hit the wrong switch
again! Who designs these instrument panels,
raccoons?!
2
Administrivia
  • M3 (Oct 7)
  • Need to demo for TAs next week!
  • Design Roundtable (Oct 7)
  • Volunteers? Email me or TAs
  • We may solicit volunteers if we think your design
    is interesting
  • Midterm Exams

3
Two Challenges of UI
  • Create user interface software that you can
    maintain well, that is truly object-oriented, and
    that allows changes to parts without impacting
    everything
  • Create user interfaces that people can actually
    use
  • The latter is much harder and is what we'll
    discuss today
  • Briefly! No substitute for a real HCI class

4
Goal: Usability
  • Combination of
  • Ease of learning
  • High speed of user task performance
  • Low user error rate
  • Subjective user satisfaction
  • User retention over time

5
UI Design: The Good
6
The Bad
7
The Bad
8
The Ugly
9
The (really) Ugly
10
Outline
  • Design
  • Know thy users
  • Principles
  • User-Centered Design: A UI Design Process
  • User Testing / Evaluation
  • Testing without users
  • Testing with users
  • Testing with groups

11
Know Thy Users! (For they are not you)
  • Physical & cognitive abilities (& special needs)
  • Personality & culture
  • Knowledge & skills
  • Motivation
  • Two Fatal Mistakes
  • Assume all users are alike
  • Assume all users are like the designer

You Are Here
12
Designers vs. Users
  • "I and my friends can use it, so it must be easy
    to use"
  • The user may not be anything like you
  • Making an interface that works for you may not
    work at all for your users
  • Most users are not programmers
  • Most users have expertise, but not in computers
  • You probably don't have their expertise!
  • They may think about things very differently than
    you!

13
A Simple Exercise
  • The Shoe Store..

14
Design Principles
  • Intended to prevent many bad designs before they
    begin
  • Guidelines based on previous designs and
    experimental findings
  • Rules can all be broken (but usually in order
    to satisfy another principle)
  • These are only a subset

15
Predictability
  • "I think that this action will do …"
  • Operation visibility: can see available actions
  • e.g., menus vs. command shell
  • grayed menu items
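The grayed-menu idea above can be sketched as a pure function from application state to the set of enabled actions; the state fields and action names here are illustrative assumptions, not any particular toolkit's API.

```python
# Sketch of operation visibility: compute which menu actions are
# enabled (and which should be grayed out) from the current state.
# The state fields and action names are invented for illustration.

def available_actions(state):
    """Return a dict mapping action name -> whether it is enabled."""
    return {
        "Cut":   state["has_selection"],
        "Copy":  state["has_selection"],
        "Paste": state["clipboard_nonempty"],
        "Undo":  state["undo_depth"] > 0,
    }

actions = available_actions(
    {"has_selection": False, "clipboard_nonempty": True, "undo_depth": 0}
)
print(actions)  # Paste enabled; Cut, Copy, and Undo grayed out
```

Keeping this decision in one place means the menu can never show an action the system will reject, which is the predictability win the slide describes.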

16
Familiarity
  • Does UI task leverage existing real-world or
    domain knowledge?
  • Really relevant to first impressions
  • Use of metaphors
  • Potential pitfalls
  • Are there limitations on familiarity?

17
Generalizability
  • Can knowledge of one system/UI be extended to
    other similar ones?
  • Example: cut & paste in different applications
  • Does knowledge of one aspect of a UI apply to
    rest of the UI?
  • Aid to UI developers: guidelines

18
Observability
  • Can user determine internal state of system from
    what she perceives?
  • Problems with Modes
  • Browsability
  • Explore current state (without changing it)
  • Reachability
  • Navigate through observable states
  • Persistence
  • How long does observable state persist?

19
Recoverability
  • Ability to take corrective action upon
    recognizing error
  • Difficulty of recovery procedure should relate to
    difficulty of original task
  • Forward recovery
  • Ability to fix when we cant undo
  • Backward recovery
  • Undo previous error(s)
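Backward recovery is commonly implemented with an undo stack; a minimal sketch, where the `TextBuffer` class and its methods are invented for illustration.

```python
# Sketch of backward recovery via an undo stack: each action pushes
# a closure that knows how to reverse it.

class TextBuffer:
    def __init__(self):
        self.text = ""
        self._undo = []          # stack of reverse-action closures

    def insert(self, s):
        pos = len(self.text)
        self.text += s
        self._undo.append(lambda: self._delete(pos, len(s)))

    def _delete(self, pos, n):
        self.text = self.text[:pos] + self.text[pos + n:]

    def undo(self):
        if self._undo:           # backward recovery: reverse last action
            self._undo.pop()()

buf = TextBuffer()
buf.insert("hello ")
buf.insert("wrold")              # user makes an error
buf.undo()                       # backward recovery undoes it
print(buf.text)                  # "hello "
```

Forward recovery, by contrast, would add a new corrective action (e.g., inserting the right word) when the erroneous one cannot be reversed.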

20
Responsiveness
  • User's perception of rate of communication with
    system
  • Response time
  • Time for system to respond in some way to user
    action(s)
  • Consistency important
  • Why the Internet is poorly designed
  • bandwidth vs. latency
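Response time and its consistency can both be measured directly; a minimal sketch, with `handle_action` standing in for a real event handler.

```python
import time

# Sketch: measure response time for a UI action handler, and its
# spread across runs -- consistency matters as much as the mean.

def handle_action():
    time.sleep(0.01)    # stand-in for real work

samples = []
for _ in range(5):
    t0 = time.perf_counter()
    handle_action()
    samples.append(time.perf_counter() - t0)

mean = sum(samples) / len(samples)
jitter = max(samples) - min(samples)   # rough consistency measure
print(f"mean {mean * 1000:.1f} ms, jitter {jitter * 1000:.1f} ms")
```

A system whose mean is fast but whose jitter is large can still feel poorly designed, which is the bandwidth-vs-latency point above.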

21
Outline
  • Design
  • Know thy users
  • Principles
  • User-Centered Design: A Design Process
  • User Testing / Evaluation
  • Testing without users
  • Testing with users
  • Testing with groups

22
User-Centered Design
  • A way to force yourself to identify and consider
    the relevant human factors in your design
  • Helps reduce the number of decisions made out of
    the blue, and helps focus design activities
  • Helps document and defend decisions that may be
    reviewed later

23
The Tao of UCD
(cycle diagram: DESIGN → IMPLEMENT → USE & EVALUATE → back to DESIGN)
24
UCD: 9-Step Overview
  1. Define the Context
  2. Describe the User
  3. Task Analysis
  4. Function Allocation
  5. System Layout / Basic Design
  6. Mockups & Prototypes
  7. Usability Testing
  8. Iterative Test & Redesign
  9. Updates & Maintenance

25
Design Implications
  • At each stage, consider how the details of your
    discovery process affect your design

26
1. Define the Context
  • Context: the type of uses, applications
  • Life critical systems, applications
  • Industrial, commercial, military, scientific,
    consumer
  • Office, home, entertainment
  • Exploratory, creative, cooperative
  • Market
  • Customer (not the same as the User)

27
2. Describe the User
  • Physical attributes (age, gender, size, reach,
    visual angles, etc.)
  • Physical work places (table height, sound levels,
    lighting, software version)
  • Perceptual abilities (hearing, vision, heat
    sensitivity)
  • Cognitive abilities (memory span, reading level,
    musical training, math)
  • Personality and social traits (likes, dislikes,
    preferences, patience)
  • Cultural and international diversity (languages,
    dialog box flow, symbols)
  • Special populations, (dis)abilities

28
3. Task Analysis
  • Talk to and observe users (NOT customers) doing
    what they do
  • List each and every TASK
  • Break tasks down into STEPS
  • ABSTRACT into standard tasks(monitor, diagnose,
    predict, control, inspect, transmit, receive,
    decide, calculate, store, choose, operate, etc.)

29
4. Function Allocation
  • Consider the whole system!
  • Decide who or what is best suited to perform each
    task (or each step)
  • e.g., system remembers login id, and reminds the
    user, but user remembers the password
  • Base this on knowledge of system hardware,
    software, human users' abilities, culture,
    communications protocols, privacy, etc.
  • Allocation constraints: effectiveness,
    cognitive/affective, cost, mandatory

30
5. System Layout / Basic Design
  • Summary of the components and their basic design
  • Cross-check with any: requirements documents,
    human factors refs, hardware specs, budgets, laws
    (ADA), etc.
  • Ensure that the system will support the design
    and comply with constraints
  • (Verification and Validation, in the language of
    software engineering)

31
6. Mockups & Prototypes
  • Informed Brainstorming
  • RAPIDLY mock up the user interfaces for testing
    with real people
  • Pen and paper or whiteboard to start
  • Iterate, iterate, iterate!!
  • Increasingly functional & veridical
  • List audio & visual details at same levels of
    detail in the prototypes
  • (i.e., don't forget either of them)

32
7. Usability Testing
  • Get real (or representative) users to do what
    they do, using the prototypes
  • Subjective and objective feedback. Sometimes
    users want features that actually yield poor
    performance
  • Video tape, lots of notes
  • Be rigorous wherever possible (stats, etc.)
  • Feedback into the iterative evaluation & redesign
    of the system
  • "Discount" usability testing can be very
    effective, using fewer subjects, more rapid
    results

33
8. Iterative Test & Redesign
  • Repeat cycles of testing and reworking the
    system, subject to cost/time constraints
  • Plan for several versions during development

34
9. Updates & Maintenance
  • In-the-field feedback, user data, logs, surveys,
    etc.
  • Analyze and make iterative redesign/test
    recommendations
  • Updates and maintenance plan as part of the
    design!
  • (design it so it can be fixed or updated)

35
Outline
  • Design
  • Know thy users
  • Principles
  • User-Centered Design: A Design Process
  • User Testing / Evaluation
  • Testing without users
  • Testing with users
  • Testing with groups

36
How do you decide between interface alternatives?
(screenshots: ClockMorph vs. the Ch5 ClockUI)
37
User Testing / Evaluation
  • How to choose between interface alternatives, or
    improve an existing design
  • Both subjective and objective metrics
  • Some things we can measure
  • Time to learn
  • Speed of performance
  • Rate of errors by user
  • Retention over time
  • Subjective satisfaction
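The measurable quantities above can be computed from per-trial test records; a sketch over invented data (the record fields are assumptions, not a standard format).

```python
# Sketch: compute two of the measurable usability quantities --
# speed of performance and rate of errors -- from per-trial records
# gathered during a usability test. The data are invented.

trials = [
    {"task": "save", "seconds": 4.2, "errors": 0},
    {"task": "save", "seconds": 6.1, "errors": 1},
    {"task": "save", "seconds": 3.8, "errors": 0},
]

speed = sum(t["seconds"] for t in trials) / len(trials)       # mean task time
error_rate = sum(t["errors"] for t in trials) / len(trials)   # errors per trial

print(f"mean time {speed:.1f}s, error rate {error_rate:.2f}")
```

Time to learn, retention, and subjective satisfaction need longitudinal or questionnaire data rather than per-trial timing, but reduce to the same kind of tally.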

38
Evaluating Without Users
  • Heuristic evaluation
  • Carefully analyzing a user interface in terms of
    a set of standard questions or issues
  • E.g., Is the necessary knowledge always
    visible?
  • Review guidelines (consistency)
  • Using a standard for UI guidelines (e.g., Apple,
    IBM, etc.), determine if the UI meets the guidelines
  • E.g., Is the Cancel button always in the right
    place?
  • Cognitive walkthrough
  • Will the system make sense to the user?

39
Sample Useful Heuristics
  • Can the user figure out their current state?
  • Is everything that the user needs to know about
    visible on the screen?
  • Is the language on the screen in the language of
    the user (not the programmer)?
  • Is help available?
  • Are error messages useful?

40
Cognitive Walkthrough
  • Start out with a description of the system, the
    user, the user's goals, and a process description
    for a useful task
  • Walk the process description asking at each step
  • Does this make sense for this user? Will they
    understand it?
  • Will the user be able to figure out what to do
    next?
  • Will the users be able to understand the feedback
    that they get?
  • If something goes wrong, can they tell? Can they
    figure out how to fix it?

41
Evaluating with Users
  • In the end, you can never really be sure that
    your interface works until you face the users
  • It's always better (less expensive!) to identify
    and correct errors earlier rather than later, so
    pre-user methods are useful
  • But real users have ways of interpreting things,
    doing things, and breaking things that you might
    never expect.
  • Know thy users, for they are not you!
  • Examples

42
Deciding How to Evaluate with Users
  • What do you want to learn?
  • Goal: How can I improve my UI?
  • Observation
  • Questionnaires
  • Goal: Is my software better than X
    (X = a competitor's software, doing it by hand)?
  • You need enough users to conduct a real
    experiment
  • You need experimental and control groups
  • Have them perform some standard set of tasks and
    measure time, accuracy, number of errors, etc.
  • Beyond the scope of this lecture
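The experiment-vs-control comparison on task-completion times can be sketched as below; the data are invented, Welch's t is used as one reasonable choice of statistic, and a real study needs enough users (and a significance threshold) for the number to mean anything.

```python
from math import sqrt
from statistics import mean, stdev

# Sketch of an experiment/control comparison on task times (seconds).
# The measurements are invented for illustration.

experiment = [31.0, 28.5, 25.0, 30.2, 27.3]   # new UI
control    = [40.1, 38.0, 44.2, 36.5, 41.7]   # old UI / doing it by hand

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

t = welch_t(experiment, control)
print(f"t = {t:.2f}")   # strongly negative t: experiment group was faster
```

Comparing t against a critical value (with the Welch-Satterthwaite degrees of freedom) is what turns this into an actual hypothesis test, which is the part beyond this lecture's scope.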

43
Observing Users
  • Yogi Berra: "You can see a lot just by watching!"
  • Observation can buy you a lot
  • But is usually not enough: "Why are you doing
    that same error-generating thing over and
    over?!"
  • Think-alouds
  • Get the users to say out loud what they're
    seeing, what they're doing, and why they're doing
    it, while they're doing it
  • Important that designers don't interfere!
  • "I'm sorry I made this so hard to use, but …"

44
User Testing / Evaluation is Actually a Misnomer
  • You test / evaluate software, not users
  • Tell the user you need their help to make the
    software better
  • Users are inherently nice: they don't want to
    hurt your feelings by pointing out flaws
  • You'll make them more comfortable
  • Encourages more exploration, more freedom to
    critique, and more openness

45
Questionnaires
  • Can't always observe
  • Natural use is probably not in a laboratory
    setting
  • Your software may only work with lots of people
    using it
  • May want to state something more rigorously
  • Alternative: Questionnaire
  • Phrase the questions to get at what you want, and
    what you don't know yet
  • "Best 3, Worst 3"

46
Designing questionnaires
  • Multiple choice and scalar questions ("on a scale
    of 1 to 5, where …") are easier to analyze
  • Ask the same thing in multiple ways
  • Determine inconsistencies that may identify
    confusion
  • May find that there are subtleties you were
    missing earlier
  • Ask for volunteers for follow-up interviews
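Asking the same thing in multiple ways can be checked mechanically; a sketch assuming a 1-to-5 scale and a pair of mirrored questions (the question names and responses are invented).

```python
# Sketch: flag respondents whose answers to the same question, asked
# two ways, disagree. "q_easy" and "q_hard" are assumed to be mirror
# images on a 1-5 scale (easy=5 should pair with hard=1).

responses = [
    {"id": 1, "q_easy": 5, "q_hard": 1},   # consistent
    {"id": 2, "q_easy": 4, "q_hard": 2},   # consistent
    {"id": 3, "q_easy": 5, "q_hard": 5},   # inconsistent -> confusion?
]

def inconsistent(r, tolerance=1):
    # On a 1-5 scale, the reversed item should be about (6 - answer).
    return abs((6 - r["q_easy"]) - r["q_hard"]) > tolerance

flagged = [r["id"] for r in responses if inconsistent(r)]
print(flagged)   # [3]
```

Flagged respondents are natural candidates for the follow-up interviews mentioned above: the inconsistency may mean they misread a question, or that the question itself was confusing.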

47
Evaluating Groupware
  • With the Web's growth, more and more software
    designed for use in groups
  • Consider an auction site as an example
  • Groupware is especially hard to evaluate
  • Users may be distributed
  • Different users have different roles Buyers,
    sellers, brokers, admins
  • Success may depend on context, culture
  • Often, goals are more than just usability
  • E.g., a social agenda as well as a usability agenda

48
One Approach Logging
  • If you can't observe, get the software to do it
    for you
  • Record all the users actions in a file (with
    permission!)
  • Analyze after the fact
  • What errors occurred often?
  • What were the common activities?
  • Problem: can only gather what they did, not why
    they did it
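Such logging can be sketched as follows: each action is recorded as a JSON line (a list stands in here for an append-only file) and tallied after the fact. The event names are illustrative assumptions.

```python
import json
import time
from collections import Counter

# Sketch of instrumented logging: record each user action, with the
# user's permission, as one JSON line; analyze after the fact.

LOG = []   # stands in for an append-only log file

def log_action(action, **detail):
    LOG.append(json.dumps({"t": time.time(), "action": action, **detail}))

log_action("open_menu", menu="File")
log_action("error", kind="invalid_filename")
log_action("open_menu", menu="File")

# After-the-fact analysis: which actions and errors occurred often?
# (The log records what users did, never *why* they did it.)
counts = Counter(json.loads(line)["action"] for line in LOG)
print(counts.most_common())
```

One JSON object per line keeps the log trivially appendable and parseable, which matters when the analysis happens long after collection.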

49
Summary
  • Design
  • Know thy users
  • Principles
  • User-Centered Design: A Design Process
  • User Testing / Evaluation
  • Testing without users
  • Testing with users
  • Testing with groups

50
Curious to Learn More?
  • Take CS 4750: User Interface Design
  • I'll be teaching a section this spring