Software Usability: Course notes for CSI 5122 - University of Ottawa

1
Software Usability: Course notes for CSI 5122 -
University of Ottawa
  • Section 4
  • Usability in the Software Engineering Process
  • Timothy C. Lethbridge
  • <tcl@eecs.uottawa.ca>
  • http://www.eecs.uottawa.ca/tcl/csi5122

2
I assert: Lack of usability is the most critical
problem facing software engineering
  • In other words, the biggest gains in
  • software quality
  • productivity
  • cost-reduction
  • user satisfaction
  • profitability, etc.
  • would come from focusing on usability and
    related issues.

3
Evidence for the assertion
  • Failures of projects due in significant part to
    usability
  • E.g. FAA air traffic control systems
  • Observations of great software vs.
    not-so-great with respect to usability
  • Great
  • Google (many of its products, but some getting
    worse)
  • Microsoft Office (some of it)
  • Mac-OS (much of it, but weaknesses creep in)
  • Not-so-great
  • Much in-house custom software
  • Software development environments (many aspects)
  • Web sites (far too many of them)
  • I have observed many users formally and
    informally
  • My survey of practitioners (next slide)
  • Analysis of benefits and cost savings

4
Practitioner survey: about 75 topics taught to
computer science students
  • Lethbridge, T.C. (2000), What Knowledge is
    Important to a Software Engineer?, IEEE
    Computer, May, pp. 44-50.

5
More on the survey
  • HCI was second in terms of knowledge gap
  • Where importance most exceeds current knowledge
  • The top 5 (out of 75)
  • Negotiation
  • HCI/user interfaces
  • Leadership
  • Real-time system design
  • Management
  • I believe not much has changed since 1998

6
Sources of resistance to adopting usability
practices
  • Primarily: inadequate education of students,
    practitioners and managers
  • Which leads to
  • Lack of introspection ability of software
    developers
  • Unable to think like users
  • Lack of integration of usability into core
    development processes
  • Persistent beliefs
  • Most software is usable enough
  • Usability can be fixed at the end
  • Usability should be left to the HCI experts

7
Causes of the problems in education
  • Faculty don't have the background either
  • The field of HCI is seen as distinct and separate
  • The field is perceived as too soft
  • There is a tendency to focus teaching and
    research on hard deterministic areas
  • There is little industry push for research in
    this area
  • There is little employer pull for greater
    education

8
Some partial solutions
  • The SE and HCI communities need to work more
    closely together
  • Core UI development and evaluation topics should
    be considered jointly part of the two fields
  • Improve education by ensuring HCI permeates
    curriculum models, certifications, accreditation,
    etc.
  • Enable corporations to be certified for their
    usability capabilities
  • Would help pull education up by creating a need
    for professionals
  • Clients might come to learn that better software
    comes from such companies

9
Should aspects of HCI also be considered integral
to Software Engineering?
  • I argue yes
  • Usability should have no special status as
    compared to reliability, efficiency,
    maintainability, etc.
  • Design involving users and their needs must drive
    software development

10
Should aspects of HCI also be considered integral
to Software Engineering?
  • Yes, but this does not preclude HCI specialists
  • Analogies
  • A software engineer must be capable of designing
    architectural elements for storing data and
    information
  • But there will always need to be database
    specialists
  • Similar argument can be made for
  • Security experts, performance experts,
    requirements experts, etc.
  • Core UCD techniques (coming slides) can be part
    of the SE field and part of a broader HCI field

11
User-Centred Design Revisited
12
User centered design (UCD)
  • A term encompassing a range of software
    engineering approaches that focus on the user
  • We have talked about many of the techniques

13
UCD main stages
  • Pre-design
  • Design stage
  • Post-design

14
UCD Pre-design techniques
  • Understand classes of user
  • Understand tasks
  • Perform a competitive analysis
  • i.e. usability testing of competing products
  • Set usability goals
  • using the metrics we discussed earlier

15
UCD Design-stage techniques
  • Parallel design
  • Many developers generate UI ideas in parallel on
    their own
  • They meet together and pick the best ideas
  • Has maximum chance of discovering good ideas,
    without constraining the possibilities too soon

16
UCD Design-stage techniques - 2
  • Involving users
  • Critiquing design
  • Discovering problems
  • with the UI
  • with task descriptions
  • through user testing
  • Contributing ideas
  • But users are not designers, so don't ask them
    "what do you want?"

17
UCD Design-stage techniques - 3
  • Coordinating the interfaces of all parts of the
    system
  • Including documentation, training etc.
  • Applying guidelines and heuristic rules
  • Prototyping

18
UCD Design-stage techniques - 4
  • Performing empirical testing using simulations
    and prototypes
  • To discover problems
  • Measuring usability of the simulations and
    prototypes
  • Performing iterative design
  • Design
  • Evaluate
  • Redesign

19
UCD Post-Design stage
  • Gather on-going feedback about potential
    improvements

20
Another term related to UCD
  • Outside-in development

21
Economics of Usability
22
An example of economic analysis in UCD:
Economics of heuristic evaluation
  • We will look at
  • Number of people needed
  • Cost-benefit analysis

23
How many evaluators should you use in a heuristic
evaluation?
  • Given
  • N = total number of problems to be found
  • Clearly depends on
  • Quality of the original design or work performed
    so far
  • Size and complexity of the system
  • λ = proportion of all usability problems found by
    a single evaluator
  • Studies have shown this averages 34% (range 19%
    to 51%)
  • The percent of the total number of different
    usability problems found by aggregating reports
    from i evaluators is
  • PercentProblemsFound(i) = 1 - (1 - λ)^i
    (see the sketch below)
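
As a minimal sketch (not from the original slides), the formula above can be evaluated directly; the only input is the average λ = 0.34 reported in the studies cited:

```python
# Sketch: expected proportion of all usability problems found by i evaluators,
# using the average lambda = 0.34 reported in the studies cited above.
def percent_problems_found(i, lam=0.34):
    return 1 - (1 - lam) ** i

for i in range(1, 8):
    print(f"{i} evaluator(s): {percent_problems_found(i):.0%}")
# Prints approximately 34%, 56%, 71%, 81%, 87%, 92%, 95%,
# matching the table two slides later.
```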

24
We can plot the effect of this formula for
different λ
25
We can also look at specific figures for the
average case where λ = 34%
  • Evaluators | Percent of problems found
  • 1 | 34%
  • 2 | 56%
  • 3 | 71%
  • 4 | 81%
  • 5 | 87%
  • 6 | 92%
  • 7 | 95%
  • Given a desired percentage of problems to be
    found, we can determine the number of evaluators
    needed
  • by inverting the formula
  • i = ln(1 - P) / ln(1 - λ)
  • It is not economical to try to find every problem,
    but where should we stop? (See the sketch below.)
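
A small sketch of the inverted formula (my own code), rounding up to a whole number of evaluators:

```python
import math

# Sketch: smallest i such that 1 - (1 - lam)^i >= target,
# obtained by inverting the formula above and rounding up.
def evaluators_needed(target, lam=0.34):
    return math.ceil(math.log(1 - target) / math.log(1 - lam))

print(evaluators_needed(0.80))  # 4  (four evaluators find about 81%)
print(evaluators_needed(0.90))  # 6  (six evaluators find about 92%)
```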

26
How much cost savings can we achieve by finding a
given problem?
  • This depends on many factors, but
  • The worst problems are the easiest to find and
    will be found first
  • As we put more effort into finding problems, we
    will find problems that are less and less severe
  • We can model the cost savings from fixing a
    problem as seconds of wasted time per user
    session
  • Very approximate, but useful
  • Some problems will only occur once, then never
    again because the user will learn to avoid them

27
Visualizing the diminishing returns
  • Estimated number of seconds per session saved by
    fixing the nth worst problem found in a
    particular project.

28
Deciding what to fix and the resulting savings
  • From the previous chart, find the total savings
    if we just solve the first n problems
  • We can divide this by n to find the average
    savings per problem, had we fixed only the first
    n problems
  • A typical answer for this would be 2 seconds

29
Deciding what to fix and the resulting savings
  • Some data from the previous chart, where about
    200 problems were eventually found (modelled
    approximately in the sketch below)
  • nth problem found | s/session saved | average savings had we just found the first n problems
  • 1   | 10.6s | 10.6s
  • 10  | 7.4s  | 8.9s
  • 20  | 4.9s  | 7.4s
  • 40  | 2.2s  | 5.4s
  • 60  | 1.0s  | 4.1s
  • 80  | 0.45s | 3.2s
  • 100 | 0.20s | 2.6s
  • In a large organization, we can gather data to
    calibrate our own charts and tables like the
    above.
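
The chart itself is not reproduced in this transcript. As a rough illustrative sketch, the sampled figures above happen to be consistent with an exponential decay of about 10.6·e^(-0.04(n-1)) seconds for the nth worst problem; the fitted constants are my own approximation, not part of the course notes:

```python
import math

# Illustrative model only (my own exponential fit to the sampled figures above,
# not data from the course notes).
def savings(n):
    """Approximate s/session saved by fixing the nth worst problem."""
    return 10.6 * math.exp(-0.04 * (n - 1))

def average_savings(n):
    """Average s/session saved per problem if only the first n are fixed."""
    return sum(savings(k) for k in range(1, n + 1)) / n

for n in (1, 10, 20, 40, 60, 80, 100):
    print(f"n={n:3d}: {savings(n):5.2f}s saved, average {average_savings(n):4.1f}s")
# Reproduces the table above to within rounding, e.g. n=100 -> 0.20s and 2.6s.
```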

30
Deciding how many evaluators to include: Costs
  • Fixed costs to organize the evaluation (no
    matter how many evaluators)
  • Planning
  • Preparing materials
  • Writing report and communicating results to
    developers
  • Typical estimate for a small system
  • 75 hours work
  • $4500

31
Costs continued
  • Variable costs for each evaluator
  • Weighted salary for time spent
  • Cost of analyzing the evaluator's report
  • Typical estimate for an average system
  • 8 hours work for each evaluator ($480)
  • 2 hours work analyzing the report ($120)
  • Total: $600
  • Total costs = $4500 + $600 × i (see the sketch
    below)
  • Where i is the number of evaluators
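
A one-line sketch of this cost model (my own code), using the typical estimates above:

```python
# Sketch: total inspection cost for i evaluators, from the estimates above.
def inspection_cost(i):
    return 4500 + 600 * i  # dollars

print(inspection_cost(5))  # 7500, matching the cost column in the later tables
```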

32
Deciding how many evaluators to include: Benefits
  • Total benefits vary substantially depending on
    nature of system.
  • In some published studies, the benefit averaged
    $15,000 per problem
  • On the following slides are two benefit
    calculation examples

33
Benefits example 1: small mass-market system
  • Three factors to consider
  • 1. Time saved per session per problem
  • Will obviously vary from problem to problem and
    session to session
  • Reasonable average: 2s saved per session per
    problem (similar to data shown earlier)
  • 2. Typical number of sessions of use per system
  • Assume an average of 50 sessions
  • 3. Number of users (licenses)
  • Assume an average of 3000 users

34
Benefits example 1 - continued
  • Therefore total time = 2s × 50 sessions × 3000
    users
  • = 300,000 seconds per problem!
  • = 83.3 hours of saved time
  • If weighted at $60 per hour
  • Potential benefit is $5000 per problem
  • Assuming only half the problems will get fixed
  • Benefit = $2500 per problem found (see the sketch
    below)
  • Or, added value to a license per problem fixed:
  • 2s × 50 sessions
  • = 100s
  • ≈ $1.66
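
A minimal sketch of the benefit arithmetic above; the function and parameter names are my own, and the defaults encode the $60/hour weighting and the assumption that half the problems get fixed:

```python
# Sketch of the benefit arithmetic above (names and defaults are my own framing).
def benefit_per_problem_found(sec_per_session, sessions, users,
                              dollars_per_hour=60, fraction_fixed=0.5):
    hours_saved = sec_per_session * sessions * users / 3600
    return hours_saved * dollars_per_hour * fraction_fixed

print(benefit_per_problem_found(2, 50, 3000))  # 2500.0 (example 1)
print(benefit_per_problem_found(2, 200, 40))   # ~133 (example 2, next slide)
```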

35
Benefits example 2: small in-house system
  • The same three factors to consider
  • 1. Same average time savings per session (2s)
  • 2. More average sessions over lifetime (200)
  • 3. Fewer users (40)
  • Therefore total time
  • = 2s × 200 sessions × 40 users
  • = 16,000 s / problem
  • = 4.4 hours of saved time
  • Using the same weighted value of $60 per hour
  • Potential benefit is $266 per problem
  • or $133 per problem actually fixed.

36
Cost benefit analysis
  • So will a usability study be economical?
  • Let's put the costs and benefits together (see
    also the code sketch after example 2)

37
Example 1: Mass-market system
  • Inspectors | Total cost | Benefit/problem | Problems found | Total benefits | Benefit/cost ratio
  • 2 | $5700 | $2500 | 56% | $280,000 | 49
  • 3 | $6300 | $2500 | 71% | $355,000 | 56
  • 4 | $6900 | $2500 | 81% | $405,000 | 59
  • 5 | $7500 | $2500 | 87% | $435,000 | 58
  • 6 | $8100 | $2500 | 92% | $460,000 | 57
  • 7 | $8700 | $2500 | 95% | $475,000 | 55

38
Example 2: In-house system
  • Inspectors | Total cost | Benefit/problem | Problems found | Total benefits | Benefit/cost ratio
  • 1 | $5098 | $132 | 34% | $8,976 | 1.8
  • 2 | $5700 | $133 | 56% | $14,896 | 2.6
  • 3 | $6300 | $133 | 71% | $18,886 | 3.0
  • 4 | $6900 | $133 | 81% | $21,546 | 3.1
  • 5 | $7500 | $133 | 87% | $23,142 | 3.1
  • 6 | $8100 | $133 | 92% | $24,472 | 3.0
  • 7 | $8700 | $133 | 95% | $25,270 | 2.9
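
As a sketch (my own code), both tables can be regenerated from the parameters given earlier: λ = 0.34, roughly 200 total problems, costs of $4500 + $600 per evaluator, and $2500 or about $133 of benefit per problem found. Results differ slightly from the tables above because the slides use rounded percentages:

```python
# Sketch: regenerate the two cost-benefit tables from the earlier parameters.
# Small differences from the slides come from rounding.
FIXED_COST, COST_PER_EVALUATOR = 4500, 600
LAM, TOTAL_PROBLEMS = 0.34, 200

def cost(i):
    return FIXED_COST + COST_PER_EVALUATOR * i

def percent_found(i):
    return 1 - (1 - LAM) ** i

for label, benefit_per_problem in (("mass-market", 2500), ("in-house", 133)):
    print(label)
    for i in range(1, 8):
        total_benefit = percent_found(i) * TOTAL_PROBLEMS * benefit_per_problem
        print(f"  {i} evaluators: cost ${cost(i)}, "
              f"benefit ${total_benefit:,.0f}, ratio {total_benefit / cost(i):.1f}")
```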

39
Odds and Ends
40
Prototyping
  • A prototyping tool should
  • Be easy to learn and use
  • Possess good visual capabilities
  • Allow easy extension
  • Give full support for the types of interface
    being developed
  • Support modular programming

41
Types of prototyping
  • Horizontal prototyping
  • Complete UI, but no underlying functionality
  • Vertical prototyping
  • In-depth functionality of only certain parts
  • Scenario prototyping
  • Selected paths only

42
The design rationale document
  • Allows you to keep control of the various
    alternatives considered and decisions made
  • Explain your usability decisions so others know
    the reasoning you followed

43
Discount usability engineering
  • User and task observation
  • Cheap prototyping and scenarios
  • Simplified thinking aloud
  • Heuristic evaluation
  • Key goal: communicate qualitative information to
    developers in order to improve interfaces
  • as opposed to quantitative measures
  • A study was performed of iterative design
    involving discount usability engineering
  • Improvements of 38% after each stage

44
Consumability - Closely allied to usability
  • Ability for customers to get rapid return on
    investment.
  • Factors include
  • all aspects of usability
  • installability / uninstallability
  • lack of inter-product dependency/interaction
  • low footprint / lack of run-time resource
    consumption
  • backward-compatibility
  • integratability
  • appropriate prioritization of iterations and
    features to yield business value
  • low purchase cost

45
Rating corporate maturity with respect to
usability
46
The Model for Usability Maturity (MUM) (proposed
by Lethbridge!)
  • A 5-level CMM-like categorization of
    organizations
  • Based on their interaction with human end-users
    in software development
  • (Each level beyond 1 builds on the previous
    levels)
  • Level 1: Haphazard
  • Level 2: Defined input from users / usability
    awareness
  • Level 3: Iterative interaction with users /
    design for usability
  • Level 4: Controlled and measured involvement of
    users
  • Level 5: Continually improving usability

47
Level 1: Haphazard
  • If you can get users to use the system, then it
    is considered good enough!

48
Level 2: Defined input from users / usability
awareness
  • Users involved in requirements reviews
  • Feedback from users at reviews is incorporated
    into the next stage
  • Design team members have basic training in
    usability
  • Design team adheres to usability standards for
    look and feel
  • Reuse of well-understood controls, styles etc.

49
Level 3: Iterative interaction with users /
design for usability
  • Users actively involved in decision making
  • Use case / task analysis
  • Competitive analysis
  • Design with careful attention to usability
    guidelines
  • Usability design decisions are carefully analyzed
    and the decisions are recorded
  • E.g. options analysis, tables of pros and cons

50
Level 3: Continued
  • User input and feedback with repeated prototypes
  • From paper prototypes to functional prototypes
  • Informal qualitative observations of users or
    heuristic evaluations
  • Discount usability engineering

51
Level 4: Controlled and measured involvement of
users
  • User input and feedback at all stages
  • Design team has trained usability experts or
    cognitive scientists
  • Careful cost-benefit optimization including
    usability
  • Setting of quantitative usability objectives
  • For all parts of the system that will be subject
    to regular or critical-situation use
  • For learnability, efficiency of use, etc.

52
Level 4: Continued
  • User-centered design with formal usability
    studies
  • Measurement of usability so as to determine
    progress towards goals

53
Level 5: Continually improving usability
  • Active development of new UI understanding and
    innovations
  • Formal experiments to validate new UI modalities
  • metaphors, controls, widgets, styles etc.
  • Anthropological studies of human tasks
  • to enable optimized UI design
  • Scientific study of
  • Users and their work practices
  • Usability and the software engineering process

54
Potential benefits of MUM
  • It provides a visible framework that companies
    can use to incrementally improve their practices
  • If even a few companies adopted it and were
    certified, they might have a competitive
    advantage in some situations