User Centered Design


1
  • User Centered Design
  • Intro to UCD
  • Explain its motivation
  • Discuss key stages in the process
  • Present basic methods and techniques

2
  • UCD is about designing interactive
    technologies to meet users' needs.
  • Different stages:
  • understanding user needs
  • establishing requirements
  • prototyping alternative designs
  • evaluating designs

3
  • Key characteristics of any UCD
  • process
  • Focus on users early in the design and
    evaluation of the artefact
  • Identify, document and agree specific usability
    and user experience goals
  • Iteration is inevitable. Designers never get it
    right first time

4
  • Why involve users?
  • Around 63% of software projects exceed their cost
    estimates
  • The top four reasons are:
  • - Frequent requests for changes from users
  • - Overlooked tasks
  • - Users' lack of understanding of their own
    requirements
  • - Insufficient user analysis, communication, and
    understanding

5
  • If you involve end users in the design process,
  • More likely to design/build something useful!
  • Improve productivity
  • Reduce human error
  • Reduce maintenance
  • Reduce employee turnover
  • Manage expectations
  • Encourage ownership of the solution
  • Understand shortcomings/tradeoffs
  • Increase satisfaction

6
  • Principles of UCD approach
  • Users' behaviour and context of use are studied,
    and the product is designed to support them
  • User needs/pain points are understood as
    opportunities for design
  • User characteristics are captured and designed
    for.

7
  • Users are consulted from the early concept
    phases, throughout design, to the final product
  • Responses to concepts, prototypes, etc are taken
    seriously
  • All design decisions are taken within the context
    of the user, their work and their environment
  • All design iterations can be traced back to user
    goals

8
  • - not just what users say, but what they do
  • - how action and interaction are achieved
  • - interest in the mundane, taken-for-granted,
    moment-by-moment interactions of people
  • Outputs: rich descriptions
  • - these need interpreting through the use of
    conceptual frameworks, models etc.

9
  • A range of user research methods
  • observation
  • interview
  • questionnaire
  • focus groups
  • participant analysis

10
  • Interviews
  • - Forum for talking to people
  • - Structured, unstructured or semi-structured
  • - Props, e.g. sample scenarios of use or
    prototypes, can be used in interviews
  • - Good for exploring specific issues
  • - But they are time consuming, and it may
    be unfeasible to visit everyone

11
  • Questionnaires
  • - A series of questions designed to elicit
    specific information
  • Questions may require different kinds of answers:
  • simple YES/NO
  • choice of pre-supplied answers
  • comment
  • - Often used in conjunction with other techniques
  • - Can give quantitative or qualitative data
    (a small sketch of these answer types follows
    this slide)
  • - Good for answering specific questions from a
    large, dispersed group of people
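  For illustration only: a minimal Python sketch of how the three answer
  types above could be represented in a questionnaire. The class and field
  names are hypothetical, not from the slides.

    from dataclasses import dataclass, field

    # Hypothetical model of the three answer types: YES/NO, pre-supplied
    # choices, and free-text comment.
    @dataclass
    class Question:
        text: str
        kind: str                                    # "yes_no", "choice" or "comment"
        options: list = field(default_factory=list)  # only used when kind == "choice"

    questionnaire = [
        Question("Do you use the system daily?", "yes_no"),
        Question("How easy were the menus to use?", "choice",
                 ["Very easy", "Easy", "Difficult", "Very difficult"]),
        Question("Any other comments?", "comment"),
    ]

    # YES/NO and choice answers yield quantitative data; comments yield
    # qualitative data.
    for q in questionnaire:
        print(q.kind, "-", q.text)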

12
  • Focus Groups
  • Workshops or focus groups
  • - Group interviews
  • - Good at gaining a consensus view and/or
    highlighting areas of conflict

13
  • Why establish requirements?
  • Requirements arise from understanding users'
    needs
  • Requirements should be justified and related to
    the data gathered

14
  • Establishing requirements
  • What do users want?
  • What do users need?
  • Requirements need clarification, refinement and
    completion over several iterations of the design
    process.
  • A focused problem definition, established by
    analysing user data, will lead to a stable list
    of requirements.

15
  • Types of Requirements
  • Users: who are they?
  • Usability and user experience qualities
  • Environment or context of use
  • Functional
  • Data

16
  • Users
  • Characteristics:
  • abilities, physical characteristics, background,
    attitude to computers, etc.
  • System use:
  • Novice: step-by-step, constrained, clear
    information
  • Expert: flexibility, access/power
  • Frequent: short cuts
  • Casual/infrequent: clear instructions, e.g. menu
    paths

17
  • Usability and User Experience Requirements
  • Effectiveness, efficiency, safety, privacy,
    utility, learnability, memorability
  • (and fun, helpfulness, etc.)

18
  • Environment or Context of Use
  • Physical:
  • dusty? noisy? vibration? light? heat? humidity?
    On the move? Layout of workspace?
  • Social:
  • sharing of files and displays, on paper, across
    great distances, working individually, privacy
    for clients, and informal information
    distribution among users
  • Organisational:
  • hierarchy, the IT department's attitude and
    remit, user support, communications structure and
    infrastructure, availability of training

19
  • Functional
  • Historically the main focus of requirements
    activities
  • What should the system do?
  • Example: train a new employee to carry out a
    task.
  • And also: memory size, response time, platform
    constraints...

20
  • Data
  • What data or input is required to make the system
    function for the user, and how is it accessed?

21
  • Self-service cafeteria system at UL
  • Functional:
  • the system will calculate the cost of purchases
    without the help of cafeteria staff
  • Data:
  • the system must have access to the prices of
    products (a small sketch of these two
    requirements follows this slide)
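  For illustration only: a minimal Python sketch of how the functional and
  data requirements above fit together. The product names and prices are
  invented, not part of the UL cafeteria example.

    # Data requirement: the system must have access to the price of products.
    PRICES = {"coffee": 1.80, "sandwich": 3.50, "soup": 2.20}  # assumed price list

    # Functional requirement: calculate the cost of purchases without the
    # help of cafeteria staff.
    def cost_of_purchases(items):
        return sum(PRICES[item] for item in items)

    print(cost_of_purchases(["coffee", "sandwich"]))  # 5.30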

22
  • Environmental:
  • most users will be carrying a tray, in a rush, in
    a noisy environment, talking/distracted,
    queueing, etc.
  • User:
  • most users are comfortable with technology
  • Usability:
  • simple for new users,
  • memorable for frequent users,
  • quick to use, with no waiting around
    for processing

23
  • Evaluation
  • (a) an existing system
  • or
  • (b) a new design.
  • A continuous, iterative process examining:
  • early prototypes of the new system
  • later, more complete prototypes of the product

24
  • Looking for:
  • extent of functionality, effect of the interface
    on the user, specific problems/issues
  • usability, user experience, and other objectives,
    e.g., performance
  • Designers need to check that they understand
    users' requirements and are meeting key
    objectives

25
  • When to evaluate
  • At all stages throughout design
  • From the first descriptions, sketches etc. of
    users' needs through to the final product
  • Iterative cycles of
  • design - EVALUATE - redesign

26
  • Two main types of evaluation, reflecting
    different stages and goals:
  • Formative
  • Summative
  • (Involving users)
  • quick
  • usability testing
  • field studies
  • (Involving experts)
  • predictive evaluation

27
  • Quick
  • What is it?
  • informally asking users/consultants for
    feedback
  • Advantages
  • Can be done any time, anywhere
  • Emphasis on fast input to the design process
  • Sense-checking: do early design ideas make
    sense?
  • early concept testing

28
  • Quick
  • Disadvantages/issues
  • Not necessarily accurate or extensive
  • No careful documentation of findings

29
  • Usability Testing
  • Recording typical users' performance on typical
    tasks
  • In a controlled setting; can be in the lab or
    in the field
  • Data: write-ups, video, logs of key presses, time
    to complete tasks, etc. (a logging sketch follows
    this slide)
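  For illustration only: a small Python sketch of the kind of computer
  logging listed above. The event names and task wording are hypothetical.

    import time

    # Log timestamped events during a test: key presses and task boundaries.
    log = []

    def record(event, detail=""):
        log.append((time.time(), event, detail))

    record("task_start", "Find today's menu")
    record("key_press", "Enter")
    record("task_end", "Find today's menu")

    # Time to complete the task = last timestamp minus first.
    task_time = log[-1][0] - log[0][0]
    print(f"{len(log)} events logged; task took {task_time:.2f} seconds")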

30
  • Usability Testing
  • Advantages
  • Uninterrupted: can assess performance, identify
    errors and help explain why users did what they
    did
  • Can be used with satisfaction questionnaires
    and interviews to elicit user opinions

31
  • Usability Testing
  • Disadvantages/issues
  • Lack of context; skill is needed to determine
    typical users and typical tasks
  • Time to set up tests, recruit participants, and
    run tests
  • Need access to resources/equipment

32
  • Field Studies
  • What is it?
  • Observations and interviews in natural settings
  • Advantages
  • Helps understand what users do naturally and how
    technology impacts them in context
  • For product design:
  • identify opportunities; determine design
    requirements; decide how best to introduce the
    technology; evaluate it in use

33
  • Field Studies
  • Disadvantages/issues
  • Access to settings
  • Lack of control: noise, distractions,
    timetabling, etc.

34
  • Think-Aloud and Cooperative Evaluation

35
  • Think-Aloud Evaluation
  • "think-aloud", is an evaluation technique in
    which the user performs a number of tasks and is
    asked to think aloud to explain what they are
    doing at each stage, and why.

36
  • The evaluator records the user's actions using:
  • tape recordings
  • video
  • computer logging
  • user notes

37
  • Advantages
  • Think-aloud has the advantage of simplicity; it
    requires little expertise to perform, and can
    provide useful insight into any problems with
    an interface.

38
  • Disadvantages
  • The information is necessarily subjective, and
    can be selective depending on the tasks chosen.
    Being observed and having to describe what you
    are doing can also affect the way in which you do
    something: ask a juggler to describe how she
    juggles...

39
  • Cooperative evaluation
  • "Cooperative evaluation" is a variant of think
    aloud, in which the user is encouraged to see
    himself as a collaborator in the evaluation
    rather than just a subject.
  • As well as getting the user to think aloud, the
    evaluator can ask such questions as "Why?" and
    "What if...?"; likewise, the user can ask the
    evaluator for clarification if problems arise.

40
  • Advantages
  • It is less constrained and therefore easier for
    the evaluator, who is not forced to sit in solemn
    silence
  • the user is encouraged to actively criticise the
    system rather than simply suffer it
  • the evaluator can clarify points of confusion so
    maximising the effectiveness of the approach.
  • Note that it is often not the designer who is
    the evaluator, but an independent person.

41
  • One of the problems with both these techniques
    is that they generate a large volume of
    information which has to be painstakingly and
    time-consumingly analysed.
  • Such a record of an evaluation session is known
    as a protocol, and there are a number of ways to
    capture one: pen and paper, audio and video
    recording, computer logging and user diaries.
  • e.g. a pen-and-paper protocol

42
  • How to run a session
  • As an evaluator, spend a few minutes thinking of
    some scenarios and tasks for the user to perform.
    Include some complex tasks as well as some simple
    ones. Decide on whether you are going to use
    think-aloud or cooperative evaluation. Then run
    the evaluation, keeping full notes of the user's
    actions, behaviour and any problems. The
    fuller the detail in the protocol, the better.
    (A small session-plan sketch follows this slide.)
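  For illustration only: a hedged Python sketch of a simple written session
  plan of the kind described above. The task wording is invented.

    # A mix of simple and complex tasks, the chosen technique, and space for
    # the protocol notes taken during the session.
    session = {
        "technique": "cooperative evaluation",  # or "think-aloud"
        "tasks": [
            {"description": "Check the opening hours", "complexity": "simple"},
            {"description": "Order lunch for three people and pay",
             "complexity": "complex"},
        ],
        "notes": [],  # filled in during the session: actions, behaviour, problems
    }

    for task in session["tasks"]:
        print(f"[{task['complexity']}] {task['description']}")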

43
  • Recruit Users
  • Define the Target Group
  • - Recruit users who are similar to the target
    group.
  • - Describe the users: background, age, sex,
    familiarity with computers, etc.

44
  • Prepare tasks
  • Are the tasks specific?
  • Will the task focus the user on the part of the
    system you are most interested in?
  • How do you know that the task you have chosen
    represents the work the product is designed to
    support?
  • Have you written the task down in a way that a
    novice user can understand?

45
  • Before the user arrives
  • Have you got the evaluation space prepared:
    system/device, tasks, debriefing questions, etc.?
  • Have you worked through the tasks yourself?

47
  • When the user arrives
  • Put users at ease.
  • Explain the co-operative design process
  • Explain the user's role as a collaborator
    CLEARLY.

48
  • While the user is using the system.
  • Keep Them Talking!!!
  • Make sure you know what they are doing and why
    they are doing it.
  • Do not contradict them.
  • Make notes.

49
  • People tend to say less when they are unsure
    what to do, but this is the time that the
    evaluator needs to know most.
  • You must encourage the user to go through it,
    explaining which buttons they would press and
    when, showing you what, if anything, would happen
    on the display and what they expect should
    happen.
  • (if you are using cooperative evaluation, you
    can discuss it with them, but for think aloud you
    have to just accept what is presented).

50
  • Debriefing the users
  • Ask the users what the best and worst features
    are.
  • Engage the user in a discussion on the system;
    remember that they are your collaborator in
    evaluating this system.
  • Ask the users what they thought of the evaluation
    session.

51
  • Summarise your observations
  • Collate your notes; the evaluation team should
    do this together.
  • Focus on unexpected behaviour and comments on the
    usability of the system.
  • Think of users' problems as symptoms of bad
    design.
  • Which problems should be fixed?
  • Make recommendations. (A small collation sketch
    follows this slide.)
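  For illustration only: a purely illustrative Python sketch of collating
  observations into recommendations. The severity labels are an assumed
  convention, not from the slides.

    # Treat observed problems as symptoms of bad design and decide which to fix.
    observations = [
        {"problem": "User could not find the pay button", "severity": "high"},
        {"problem": "Menu labels were unclear", "severity": "medium"},
        {"problem": "Minor typo on the receipt screen", "severity": "low"},
    ]

    for o in observations:
        if o["severity"] in ("high", "medium"):
            print("Recommend fixing:", o["problem"])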

52
  • Real World Situation
  • Sometimes it is necessary to set up evaluations
    in a context similar to that in which the system
    is designed to be used, e.g. mobile technology.
  • Where appropriate, involve two or three users in
    one session, if this reflects a real-world
    situation,
  • e.g. watching a DVD.

53
  • Project
  • Four persons per group
  • Due date: Week 9
  • (Monday 22nd Nov., 5pm)
  • An Evaluation Report
  • 5,000-6,000 words
  • 40% of the overall module mark.

54
  • Submit group names and topic to me by email
    (once per group)
  • Subject heading: Co-Operative Evaluation
  • Marilyn.lennon@ul.ie
  • By Friday 5th November.
  • Anyone without a group, please attend the
    tutorial on Thursday 4th Nov at 3pm in SG20.

55
  • Co-Operative Evaluation of an Interactive System
    or Object
  • Write a report outlining:
  • 3 evaluation sessions, evaluating one system.
  • Describe:
  • The System or Object to be evaluated
  • The evaluation technique
  • The Users
  • The Session Set-up
  • The Tasks
  • The Observations
  • The Recommendations.

56
  • Included in the Appendix
  • I want a brief outline from each student on
    their role in the evaluation sessions and
    write-up.
  • An example of the material from an evaluation

57
  • Try to evaluate something relevant to your
    FYP!
  • (just a suggestion!)

58
  • PLEASE READ THE EXAMPLE SUPPLIED BEFORE STARTING
    YOUR PROJECT.
  • Nokia N-Gage Game Deck posted at
  • http://richie.idc.ul.ie/hci2004/
