1
IS214 Recap
2
IS214
  • Understanding Users and Their Work
  • User and task analysis
  • Ethnographic methods
  • Site visits: observation, interviews
  • Contextual inquiry and design
  • Universal usability
  • Evaluating
  • Usability inspection methods, including
    heuristics and guidelines
  • Surveys, interviews, focus groups
  • Usability testing
  • Server log analysis
  • Organizational and Managerial Issues
  • Ethics; managing usability

3
Methods: assessing needs vs. evaluating

Method                                          Needs   Evaluation
User and task analysis                            x
Ethnographic methods                              x
Observation, interviews                           x        x
Contextual inquiry and design                     x
Universal usability                               x        x
Usability inspection (heuristics, guidelines)     x        x
Surveys, interviews, focus groups                 x        x
Usability testing                                          x
Server log analysis                                        x
4
Intro to usability and UCD
  • Usability concepts
  • Usability as more than interface
  • Functionality, content, and design
  • User-Centered Design
  • Usability begins with design
  • At every stage in the design process, usability
    means using appropriate methods to perform
    user-based evaluation
  • Placing users (not cool technology) at the
    center of design
  • Iterative design

5
Understanding Users and Their Work
  • To inform design and evaluation

6
User and Task Analysis
  • Can't ask "How good is this?" without asking "For
    whom and for what purpose?"
  • Users
  • Selecting users: whom do you need to include? How
    many?
  • Categorizing users
  • Getting people's cooperation
  • Trust
  • Tasks
  • Identifying and describing the tasks they
    (currently) perform
  • Technology design is work re-design
  • User-task matrix (sketched below)
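
A user-task matrix cross-tabulates user categories against the tasks they perform. A minimal sketch in Python, with invented user groups, tasks, and frequencies (a real matrix would come out of the task analysis itself):

```python
# Hypothetical user-task matrix: user groups, tasks, and frequencies
# are invented for illustration only.
users = ["librarian", "student", "faculty"]
tasks = ["search catalog", "renew loan", "manage records"]

# How often each user group performs each task; absent pairs mean "never".
matrix = {
    ("librarian", "search catalog"): "often",
    ("librarian", "manage records"): "often",
    ("student", "search catalog"): "often",
    ("student", "renew loan"): "sometimes",
    ("faculty", "search catalog"): "sometimes",
    ("faculty", "renew loan"): "rarely",
}

# Print the matrix as a grid to spot coverage gaps at a glance.
print(f"{'':<12}" + "".join(f"{t:<16}" for t in tasks))
for u in users:
    print(f"{u:<12}" + "".join(f"{matrix.get((u, t), '-'):<16}" for t in tasks))
```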

7
Ethnographic methods
  • Methods and principles of social science research
    are fundamental to collecting, analyzing, and
    interpreting data for needs and usability
    assessment
  • Reliability
  • Validity
  • One set of methods: ethnographic
  • Studying users "in the wild"
  • Learning their understanding of their work
    purposes and practices
  • Seeing how they actually do their work (as
    opposed to formal work processes)

8
Site Visits
  • Observing
  • Seeing people doing what they do, how they do it,
    under the conditions that they do it
  • Asking questions as they work
  • Tacit knowledge: people may not be able to
    articulate what they do
  • Recollection: people may not think to mention
    things, or not think them important
  • Interviewing
  • Getting users' understandings and interpretations
  • Ability to probe
  • Interviewing skills!

9
Contextual Inquiry and Design
  • A systematic, ethnographically based method for
  • Collecting, interpreting, and summarizing
    information about work practices and
    organizational factors
  • Incorporating findings into design
  • Structured approach to data collection,
    recording, interpretation
  • Complex: requires that the entire team be trained
    in it

10
Evaluating
  • A design, prototype, or working system
  • Not a clean distinction between design and
    evaluation

11
Usability inspection methods
  • A variety of methods in which experts (not users)
    inspect (not use) a design, prototype, or system
  • Including
  • Competitive evaluation
  • Heuristic evaluation
  • Commonly-used method
  • Easy
  • Lots of information with not much investment
  • Reflects short-term use; limited depth.
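
A heuristic evaluation typically produces a list of findings, each tied to a heuristic and given a severity rating. A minimal sketch of recording and tallying such findings (the heuristic names follow Nielsen's well-known set; the findings and the 0-4 severity scale are illustrative assumptions):

```python
from collections import Counter

# Invented findings; each names the violated heuristic (Nielsen-style)
# and a severity from 0 (not a problem) to 4 (usability catastrophe).
findings = [
    {"heuristic": "visibility of system status", "severity": 3,
     "note": "no progress indicator during long searches"},
    {"heuristic": "error prevention", "severity": 4,
     "note": "delete action has no confirmation step"},
    {"heuristic": "consistency and standards", "severity": 2,
     "note": "two different labels for the same action"},
]

# Tally violations per heuristic, then list the worst problems first.
counts = Counter(f["heuristic"] for f in findings)
print("violations per heuristic:", dict(counts))
for f in sorted(findings, key=lambda f: -f["severity"]):
    print(f"[severity {f['severity']}] {f['heuristic']}: {f['note']}")
```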

12
Surveys
  • Useful for collecting data directly from users
    at various stages of design and development
  • Can reach a large number of users
  • Standardized questions and answer formats: easy to
    analyze
  • Issues of sample composition, sample size, and
    validity
  • Only get answers to the questions you think to
    ask
  • Question (and answer) wording affects results
  • Lack of depth and follow-up
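
Because the questions and answer formats are standardized, summarizing survey results is mechanical. A minimal sketch with hypothetical 1-5 Likert-scale responses (question names and numbers are invented):

```python
from statistics import mean

# Hypothetical 1-5 Likert responses, keyed by question.
responses = {
    "ease of navigation": [4, 5, 3, 4, 2, 5],
    "clarity of labels": [2, 3, 2, 4, 3, 2],
}

# Standardized answers make per-question summaries one-liners,
# but remember: you only get answers to the questions you asked.
for question, scores in responses.items():
    print(f"{question}: mean={mean(scores):.2f}, n={len(scores)}")
```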

13
Usability testing
  • Lab-based tests
  • Usually standardized tasks observed under
    controlled conditions
  • Good for getting performance data unsullied by
    variations in use conditions
  • Bad for getting performance data under real
    conditions of use (ecological validity)
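
Standardized tasks under controlled conditions are what make lab results comparable across participants. A minimal sketch of turning hypothetical session records into the usual success-rate and completion-time summaries:

```python
from statistics import mean

# Hypothetical test sessions: task attempted, time taken (s), outcome.
sessions = [
    {"task": "find hours", "seconds": 42, "success": True},
    {"task": "find hours", "seconds": 95, "success": True},
    {"task": "find hours", "seconds": 120, "success": False},
    {"task": "renew loan", "seconds": 60, "success": True},
    {"task": "renew loan", "seconds": 88, "success": False},
]

# Per-task comparisons are meaningful only because every participant
# attempted the same tasks under the same conditions.
for task in sorted({s["task"] for s in sessions}):
    runs = [s for s in sessions if s["task"] == task]
    rate = sum(s["success"] for s in runs) / len(runs)
    avg = mean(s["seconds"] for s in runs)
    print(f"{task}: success={rate:.0%}, mean time={avg:.0f}s")
```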

14
Focus groups
  • Again, useful at many stages in the process
  • In-depth information from users
  • Interaction among users helpful (or sometimes
    not)
  • Limits
  • small numbers
  • limited time period
  • effects of strong personalities or a sidetrack in
    the conversation
  • Skilled facilitator! Hard to do well, easy to
    mess up

15
Server log analysis
  • Analyzes data collected automatically
  • Large numbers
  • Unobtrusive
  • Does not rely on user cooperation or memory
  • Limits to the data available
  • Inferences must be justified by the data
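
Because server logs arrive in a fixed, machine-readable format, the counting itself is straightforward. A minimal sketch of parsing Common Log Format lines and tallying requested pages (the sample lines are invented; a real analysis would read the server's access log):

```python
import re
from collections import Counter

# Two invented lines in Common Log Format (the Apache default).
log_lines = [
    '127.0.0.1 - - [10/Oct/2005:13:55:36 -0700] "GET /catalog HTTP/1.0" 200 2326',
    '127.0.0.1 - - [10/Oct/2005:13:57:01 -0700] "GET /hours HTTP/1.0" 200 1042',
]

# Extract the requested path from each line; skip lines that do not match.
pattern = re.compile(r'"[A-Z]+ (\S+) HTTP/[\d.]+"')
hits = Counter()
for line in log_lines:
    m = pattern.search(line)
    if m:
        hits[m.group(1)] += 1

# The counts say what was requested, not why or by whom: any inference
# about users' intent still has to be justified by the data.
print(hits.most_common())
```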

16
Organizational and Managerial Issues
17
Analyzing and presenting results
  • Lots of data that has to be summarized in useful
    form
  • What is the purpose of your study?
  • What do you know? What do you need to know?
  • What recommendations can you develop from your
    data?
  • How do you present your findings succinctly and
    clearly, in a way that your audience will
    understand and use?

18
Ethics
  • Do no harm to the people you are studying
  • Choices of projects?

19
Managing usability
  • How usability fits into organizations
  • "We don't get no respect"

20
Universal usability
  • International usability
  • Accessibility
  • Removing unnecessary barriers
  • Being aware of and designing for the variety of
    people's capabilities
  • Incorporating multimodal information presentation
    and functionality

21
Topic we might have covered: credibility
  • Larger issue: when presenting content, not (just)
    functionality, we need to understand how people use
    and evaluate information
  • Factors that affect web site credibility
  • Source
  • Institutional, personal
  • Expertise; bias or interest
  • Currency (how up to date the info is)
  • Observable factors used as indicators of
    unobservable ones
  • Language, (absence of) mistakes
  • Links, imprimaturs

22
Some final questions
  • How do we understand users' activities, needs,
    interpretations, preferences?
  • Especially for things that don't yet exist
  • Users and uses are varied
  • People can't always articulate what we would like
    to know from them
  • The observer is not a perfectly objective tool
  • How do we translate these understandings into
    recommendations and designs?
  • How do we decide what trade-offs to make?
  • Among users (including organization vs.
    individuals)
  • Between cost of design and priority of needs