1
Realit-E: Usability Testing of Your Website's
Effectiveness
  • Sarah J. Hammill
  • Inna Ilinskaya
  • Florida International University

2
Overview
  • Defining Usability and Usability Testing
  • Whys and Hows of Usability Testing
  • Usability Survey Data
  • Case Studies

3
What is Usability?
  • Depends on who you ask
  • Jakob Nielsen, the father of usability, defines it
    as learnability, memorability, error rate,
    efficiency, and satisfaction.
  • Steve Krug: "Usability really just means making
    sure that something works well."
  • ISO 9241 definition:
  • effectiveness: the accuracy and completeness with
    which specified users can achieve specified goals
    in particular environments
  • efficiency: the resources expended in relation to
    the accuracy and completeness of goals achieved
  • satisfaction: the comfort and acceptability of
    the work system to its users and other people
    affected by its use

4
Why Do It?
  • Product / Service
  • Placement
  • Price
  • People
  • Promotion

5
How? Usability Assessment Methods
  • User Testing
  • Focus Groups 
  • Observation  
  • Questionnaires and Interviews
  • User Feedback 
  • Card sorting
  • Logging Actual Use
  • Cognitive walkthrough
  • Heuristic Evaluation  

6
Usability Testing Highlights
  • Start at the beginning of Web site development
  • Test throughout Web site development. Iterative
    design
  • Conduct several smaller tests - one for each
    major iteration of web interface
  • Ongoing testing
  • Pitfalls:
  • Reliability: individual differences between test
    users
  • Validity: using the wrong users or giving them
    the wrong tasks

7
Usability Testing
  • Clarify purpose of the test
  • Formative evaluation: thinking-aloud method
  • vs.
  • Summative evaluation: measurement test
  • Pilot Testing
  • One or two pilot subjects
  • Testing with real users
  • Testing five users is typically enough for
    homogeneous groups (see the sketch below).
  • Novice vs. Expert users
  • Don't make your test users cry
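A note on the five-user guideline: it is commonly justified by the
problem-discovery model popularized by Nielsen and Landauer. The sketch
below assumes an average per-user discovery rate of about 31%, which is
an illustrative assumption rather than a fixed constant.

  # Expected share of usability problems found by n test users, using
  # the discovery model P(n) = 1 - (1 - L)**n, where L is the average
  # probability that a single user uncovers a given problem.
  # L = 0.31 is the figure often cited from Nielsen and Landauer's data;
  # treat it as an assumption for illustration.
  def share_of_problems_found(n, per_user_rate=0.31):
      return 1 - (1 - per_user_rate) ** n

  for n in (1, 3, 5, 10, 15):
      print(f"{n:2d} users -> {share_of_problems_found(n):.0%} of problems")
  # With these assumptions, 5 users surface roughly 84% of the problems,
  # which is why several small iterative tests are favored over one
  # large test.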

8
Usability Testing
  • Experimenters matter
  • Knowledge of test method is a plus
  • Extensive knowledge of web site is crucial
  • Can use designers as experimenters
  • University of South Florida Library: "resistance
    among those responsible for changing the
    site... has been higher than expected... many
    of the recommended changes have gone
    unimplemented."

9
Usability Testing - Tasks
  • Real tasks from real use scenarios
  • Tasks should cover the most important parts of the
    web site
  • Tasks should be small - set time limits
  • Tasks should not be humorous or offensive
  • First test task should always be simple in order
    to guarantee the user an early success experience
  • Give test tasks in writing

10
Usability Testing
  • Hand out test tasks one at a time
  • Avoid disruptions
  • Minimize number of observers
  • Do not indicate that user is slow or makes
    mistakes
  • Experimenter should not interfere
  • Debrief user after the test, solicit comments,
    satisfaction questionnaire
  • Answer users' questions about the web site after
    the test.

11
Performance Measurement
  • Users perform a set of tasks
  • Collect quantitative data (see the sketch after
    this list)
  • time data: average time it takes a user to perform
    a task
  • error data (number of user errors, time spent
    recovering from errors)
  • Clear definition of when a task starts and when
    it stops
  • Employ other quantifiable usability measurements
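To illustrate how these quantitative measures might be tallied, here is
a minimal sketch; the task names and the (start, end, errors) log layout
are hypothetical and not taken from any tool or study discussed in this
presentation.

  # Hypothetical task logs: one (start_seconds, end_seconds, error_count)
  # tuple per test user for each task.
  task_logs = {
      "find opening hours": [(0.0, 42.5, 0), (0.0, 61.0, 2), (0.0, 38.2, 1)],
      "locate a database":  [(0.0, 95.0, 3), (0.0, 80.4, 1), (0.0, 120.9, 4)],
  }

  def summarize(logs):
      """Print the average completion time and average error count per task."""
      for task, attempts in logs.items():
          times = [end - start for start, end, _ in attempts]
          errors = [errs for _, _, errs in attempts]
          print(f"{task}: avg time {sum(times) / len(times):.1f}s, "
                f"avg errors {sum(errors) / len(errors):.1f}")

  summarize(task_logs)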

12
Thinking Aloud
  • Having test user perform test tasks by
    continuously thinking out loud
  • Shows how users interpret each individual
    interface item
  • Disadvantages: does not provide quantitative
    data, can slow the user down
  • Advantage: wealth of qualitative data
  • Explicit and striking quotes from users

13
Our Survey Data
  • Sent to more than 10 mailing lists
  • 2 week period
  • 16 questions
  • Average time to complete: 7 minutes
  • 111 individuals started/completed the survey
  • 1,166 viewed the survey
  • Possible design flaws
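  • For context, 111 completions out of 1,166 views
    works out to a response rate of roughly 9.5 percent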

14
Techniques Used to Administer Study
15
Special Software or Other Technology Used?
16
Types of Technology Used
  • Video Camera - 3
  • Morae Screen Capture Software - 6
  • Computer hook-up to an overhead Projector to
    avoid looking over shoulder - 1
  • Smart Board - 1
  • WinWhatWhere Software - 1 (no longer available
    for purchase or evaluation)
  • Captivate - 2
  • My Screen Recorder Software - 1
  • Camtasia Studio - 11
  • WebCat Online Card sorting application - 1
  • Hired Usability Lab - 1
  • NetMeeting - 1

17
Morae from TechSmith
http://www.morae.com
  • Morae License (includes Manager, 1 Recorder, 1
    Viewer, 1-year Essential Plan technical support,
    upgrades); License for Academic Institutions - $1,103
  • No free trial version
  • Three components:
  • Morae Recorder: collects test data; records
    screen activity, video, and audio input; compatible
    only with IE. Captured data can get into the
    gigabyte range for tests longer than a few minutes
  • Morae Remote Viewer: allows observers to watch and
    annotate user activity in real time over a network
  • Morae Manager: analyzes captured data and
    assembles relevant pieces into a presentation.

18
Captivate from Macromedia (formerly RoboDemo)
http://www.macromedia.com/software/captivate/
  • Free 30-day trial
  • Cost: $499
  • Automatically records all onscreen actions and
    instantly creates an interactive Flash
    simulation.
  • Point and click to add text captions, narration
  • However, primary uses are e-learning
    presentations, how-to tutorials, product
    demonstrations

19
Camtasia Studio by TechSmith
http://www.CamtasiaStudio.com
  • Records Screen Captures, Audio, Captioning
  • Used to improve PowerPoints, make FLASH videos,
    publish on CD, the Web, or DVD
  • $299.00 per copy
  • $995.00 for a 5-user copy
  • Free 30 Day Trial with chance to win a free copy

20
NetMeeting by Microsoft
http://www.microsoft.com/windows/netmeeting/
  • Video and Audio Conferencing, Chat, Whiteboard,
    Program Sharing, and File Transfer
  • Included in Windows Packages
  • Start, Programs, Accessories, Communications,
    NetMeeting

21
My Screen Recorder
http://www.deskshare.com/msr.aspx
  • Capture desktop activity including video with
    sound into standard AVI video files
  • Sound via microphone
  • Built-in opportunity to review video
  • Thumbnail views
  • Free 30 day Trial
  • $29.95

22
CamStudio
http://www.atomixbuttons.com/
  • Open-Source Software
  • FREE!
  • Records Screen Activity into Standard AVI Video
    Files
  • Soundtrack Option Available

23
Number of Individuals Conducting Study
24
Recruitment
We put up a call for help on the main library
homepage. NO ONE responded, however!
25
Incentives
Testing ranged from 5 minutes to 90 minutes per
test-taker
26
Budget and Administration's Role
  • Whose Idea?
  • 64 Librarians
  • 19 Administration
  • 28 Other
  • Budget
  • We were told by administration what to do and
    how to do it--THEIR way

27
Main Problems Identified
28
(No Transcript)
29
(No Transcript)
30
(No Transcript)
31
Interesting Case Studies
32
University of Illinois at Chicago
  • 20 Tasks
  • Participants were given 3 Minutes per Task
  • Learning Curve Variable: Tasks were Executed in
    Different Order
  • Desire for Human Contact

33
Hunter College
  • 28 Students
  • Camtasia Rounds of Testing
  • 15 weeks of Testing
  • Shift in questions from quantitative to
    qualitative
  • More library experience meant better articulation
    of the changes needed

34
Washington State University
  • 45 questions
  • Novice vs. expert users: no correlation
  • To debrief or not to debrief
  • Confidence ? Navigation

35
University of the Pacific
  • 8 Tasks
  • 134 Students
  • Usability of site was of secondary concern
  • Aim to understand student awareness of library
    resources
  • Non-librarian administration of test
  • Library computers configured to open to MSN or
    Yahoo
  • 2nd survey conducted in residence halls

36
Randolph-Macon College
  • Implications for information literacy
    instruction
  • Cannot abandon instruction
  • Implications for interface design
  • Build instruction into design
  • Simplify interface
  • Integrate spellcheckers (e.g., in OPAC)

37
Western Michigan University Library
  • Few took time to read explanations, descriptions,
    search hints
  • Web search habits

38
Problems Identified
  • Users do not understand the terms "database,"
    "resources," "catalog"
  • Users do not understand the difference between a
    database and the library catalog
  • Users do not understand the difference between the
    terms "citations," "article," "journal"
  • Users do not understand the scope of the library
    catalog: it can search for materials other than
    books
  • Users did not know enough about library (or even
    book) features to persist in their searches until
    they got the right answer.

39
Where Does Usability End and Information Literacy
Begin?
40
Thank You!
  • Questions?
  • Comments?
  • Contact us
  • Sarah J. Hammill - hammills@fiu.edu
  • Inna Ilinskaya - ilinskay@fiu.edu