Title: Chapter 13: Asking Users, Asking Experts
Chapter 13: Asking Users, Asking Experts
Asking users and experts: the techniques
- Asking users
  - Interviews
  - Questionnaires
- Asking experts
  - Heuristic evaluation
  - Cognitive walkthroughs
Asking users: Interviews
- Unstructured, structured, or semi-structured
- Which approach? It depends on the evaluation goals and the questions to be addressed. For example, if the goal is to gain first impressions of how users react to a new idea, an informal, open-ended interview is often the best approach; but if the goal is to get feedback about a particular design feature, such as the layout of a new web browser, then a structured interview is often better.
- Remember the DECIDE framework
- Goals and questions guide all interviews
Asking users: Interviews
- Unstructured: more like an open conversation; the evaluator has a set of topics in mind but pursues threads of discussion depending on the user's responses.
- Structured: the evaluator has a fixed list of questions which every user must answer.
- Semi-structured: guided by a list of questions, but interesting issues can be explored in more depth.
Things to avoid when preparing interview questions
- Long questions
- Compound sentences: split them into two
- Jargon and language that the interviewee may not understand
- Leading questions that make assumptions, e.g., "Why do you like ...?"
- Unconscious biases, e.g., gender stereotypes
Asking users: Group interviews
- Also known as focus groups
- Typically 3-10 participants
- Provide a diverse range of opinions
- Need to be managed to:
  - ensure everyone contributes
  - prevent the discussion being dominated by one person
  - make sure the agenda of topics is covered
- Can be used as a debriefing session after the users have had some exposure to the system being evaluated.
- The facilitator has to prepare a list of topics for discussion, introducing the topics one at a time.
Asking users: Group interviews
- e.g. Topic 1: How did they get started with the first task?
- Topic 2: Were they able to complete the first task, and if not, what difficulties did they have?
- Such open questions should trigger the group to discuss problems and share experiences.
- Focus groups give the designer insights into how users think and what is important to them, which is difficult to achieve with other techniques.
Asking users: Questionnaires
- Questionnaires need to be carefully designed.
- The types of questions vary between questionnaires, depending on the evaluation goals and the questions to be addressed.
- Question formats can include:
  - yes/no checkboxes
  - checkboxes that offer many options
  - Likert rating scales
  - open-ended responses
- Pilot test the questions: are they clear, and is there sufficient space for responses?
- Decide how the data will be analyzed; consult a statistician if necessary (a simple analysis sketch follows below).
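A minimal sketch of how Likert-scale responses could be summarized once collected, using plain Python; the item wordings and scores below are hypothetical, and ordinal Likert data is usually reported with medians or frequency counts rather than means.

    from statistics import mean, median

    # Hypothetical Likert responses (1 = strongly disagree ... 5 = strongly agree),
    # one list per questionnaire item; the item names are made up for illustration.
    responses = {
        "The menu layout was easy to understand": [4, 5, 3, 4, 2, 5, 4],
        "I could find the search feature quickly": [2, 3, 2, 4, 3, 2, 3],
    }

    for item, scores in responses.items():
        print(f"{item}:")
        print(f"  n={len(scores)}  median={median(scores)}  mean={mean(scores):.2f}")
        # Ordinal Likert data is usually summarized with medians or frequency
        # counts; the mean is printed only as a rough indicator.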
Asking users: Web-based questionnaires
- Can include checkboxes, pull-down and pop-up menus, help screens and graphics.
- Can provide immediate data validation.
- Can enforce rules such as "select only one response", or require certain types of answers, such as numerical values, which cannot be done in email or paper questionnaires (see the validation sketch below).
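To illustrate the kind of immediate validation a web-based questionnaire can enforce, here is a minimal, framework-independent sketch in Python; the field names and rules are hypothetical examples, not part of any particular survey tool.

    # Sketch of validation rules a web questionnaire might enforce;
    # field names and rules are hypothetical examples.
    def validate_response(form: dict) -> list[str]:
        errors = []

        # Rule: exactly one option may be selected for the browser question.
        browsers = form.get("preferred_browser", [])
        if len(browsers) != 1:
            errors.append("Select exactly one preferred browser.")

        # Rule: age must be numerical and within a plausible range.
        age = form.get("age", "")
        if not str(age).isdigit() or not (10 <= int(age) <= 120):
            errors.append("Age must be a number between 10 and 120.")

        # Rule: Likert answers must be integers from 1 to 5.
        for key, value in form.items():
            if key.startswith("likert_") and value not in {1, 2, 3, 4, 5}:
                errors.append(f"{key}: choose a rating from 1 to 5.")

        return errors

    # Example usage with a deliberately invalid submission.
    print(validate_response({
        "preferred_browser": ["Firefox", "Chrome"],  # two selections: invalid
        "age": "abc",                                # not numerical: invalid
        "likert_layout": 6,                          # out of range: invalid
    }))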
Asking users: Web-based questionnaires
- Other advantages of online questionnaires:
  - Responses are usually received quickly
  - No copying and postage costs
  - Data can be collected in a database for analysis
  - Time required for data analysis is reduced
  - Errors can be corrected easily
- Devise the questionnaire on paper first; only once it has been reviewed and the questions refined adequately, translate it into a web-based version.
Problems with web-based questionnaires
- Preventing individuals from responding more than once
Questionnaire tools
- SUMI (Software Usability Measurement Inventory)
- MUMMS (Measuring the Usability of Multi-Media Systems)
- QUIS (Questionnaire for User Interaction Satisfaction)
- http://www.ucc.ie/hfrg/questionnaires
Asking experts
- Experts use their knowledge of users and technology to review software usability.
- Heuristic evaluation is a review guided by a set of heuristics, evaluating whether user interface elements such as dialog boxes, menus, navigation structure, etc. conform to the principles.
- Walkthroughs involve stepping through a pre-planned scenario, noting potential problems that users will face.
Heuristic evaluation
- Developed by Jakob Nielsen in the early 1990s.
- The original set of heuristics was derived from an analysis of 249 usability problems.
- However, some of these core heuristics are too general for evaluating new products coming onto the market.
- There is a strong need for heuristics that are more closely tailored to specific products.
Heuristic evaluation
- Different sets of heuristics are needed for evaluating toys, WAP devices, online communities, etc.
- Evaluators must develop their own by tailoring Nielsen's heuristics and by referring to design guidelines, market research, and the requirements document.
- The heuristics can be expanded to include some of the questions addressed when doing the evaluation.
3 stages for doing heuristic evaluation
- Briefing session to tell experts what to do.
- Evaluation period of 1-2 hours in which:
  - each expert works separately
  - one pass is taken to get a feel for the product
  - a second pass focuses on specific features
- Debriefing session in which the experts work together to prioritize the problems found (see the sketch below for one way to record and merge their findings).
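One possible way to support the debriefing session is to collect each expert's problem reports with a severity rating and merge duplicates before prioritizing. The sketch below assumes a 0-4 severity scale and hypothetical problem reports; it is an illustration, not a prescribed part of the method.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical problem reports from three experts. Each report names the
    # heuristic violated and a severity rating
    # (0 = not a problem ... 4 = usability catastrophe).
    reports = [
        {"expert": "A", "problem": "No undo after deleting a bookmark",
         "heuristic": "User control and freedom", "severity": 3},
        {"expert": "B", "problem": "No undo after deleting a bookmark",
         "heuristic": "User control and freedom", "severity": 4},
        {"expert": "C", "problem": "Error message shows internal error code only",
         "heuristic": "Help users recognize and recover from errors", "severity": 2},
    ]

    # Debriefing: merge duplicate reports and rank problems by average severity.
    merged = defaultdict(list)
    for r in reports:
        merged[(r["problem"], r["heuristic"])].append(r["severity"])

    for (problem, heuristic), severities in sorted(
            merged.items(), key=lambda kv: mean(kv[1]), reverse=True):
        print(f"[{mean(severities):.1f}] {problem}  ({heuristic}, "
              f"reported by {len(severities)} expert(s))")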
Problems of heuristic evaluation
- It can be difficult and expensive to find experts.
Cognitive walkthroughs
- Walkthroughs are an alternative approach to heuristic evaluation for predicting users' problems without doing user testing.
- They involve walking through a task with the system and noting problematic usability features.
- Cognitive walkthroughs involve simulating a user's problem-solving process at each step in the human-computer dialog, checking to see if the user's goals and memory for actions can be assumed to lead to the next correct action (Nielsen and Mack, 1994).
- They focus on evaluating designs for ease of learning.
Cognitive walkthroughs
- Steps involved:
  - The characteristics of typical users are documented and sample tasks are developed that focus on the aspects of the design to be evaluated.
  - A prototype of the interface is produced, along with a clear sequence of the actions needed for users to complete the task.
  - The evaluators walk through the action sequences for each task, placing it within the context of a typical scenario; as they do this they try to answer three questions.
The 3 questions
- Will the user know what to do to achieve the task?
- Will the user notice that the correct action is available? (Will users see how to do it?)
- Will the user know from the feedback whether they have made a correct or incorrect choice of action?
- As the experts work through the scenario they note problems (one way to record the answers per step is sketched below).
- The design is then revised to fix the problems.
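As an illustration only, the walkthrough answers could be recorded per step so that every "no" is flagged as a problem for the revision pass; the task steps and notes below are hypothetical.

    from dataclasses import dataclass

    # Sketch of a record for one walkthrough step: the three questions are
    # answered yes/no, with an optional note; any "no" is flagged as a problem.
    @dataclass
    class StepAnswers:
        step: str
        knows_what_to_do: bool      # Will the user know what to do?
        sees_how_to_do_it: bool     # Will the user notice the correct action is available?
        understands_feedback: bool  # Will the user know from the feedback if the action was correct?
        note: str = ""

    walkthrough = [
        StepAnswers("Select the correct category on the home page", True, True, True),
        StepAnswers("Complete the search form", True, False, True,
                    "Search box label is ambiguous for first-time users"),
    ]

    # Collect the problems to feed into the design revision.
    for s in walkthrough:
        if not (s.knows_what_to_do and s.sees_how_to_do_it and s.understands_feedback):
            print(f"Problem at step '{s.step}': "
                  f"{s.note or 'one of the three questions answered no'}")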
Example of cognitive walkthrough
- A cognitive walkthrough of buying a book at Amazon.com
- Task: to buy a copy of an interaction design book from Amazon.com
- Typical users: students who use the web regularly
- The steps to complete the task are given below
- Step 1. Selecting the correct category of goods on the home page
  - Q: Will users know what to do?
  - Answer: yes, they know that they must find "books"
  - Q: Will users see how to do it?
  - Answer: yes, they have seen menus before and will know to select the appropriate item and click "go"
  - Q: Will users understand from feedback whether the action was correct or not?
  - Answer: yes, their action takes them to a form that they need to complete to search for the book
Example of cognitive walkthrough
- Step 2. Completing the form
  - Q: Will users know what to do?
  - Answer: yes, the online form is like a paper form, so they know they have to complete it
  - etc.