1
LECTURE 3: INTRODUCTION TO QUALITATIVE RESEARCH
METHODS AND PROTOCOL ANALYSIS
  • Intensive study of a small number of cases
  • Qualitative methods result in largely verbal
    rather than numerical data (e.g., interview
    transcripts, videotapes)
  • The analysis is descriptive or
    interpretivist (rather than statistical,
    resulting in tests of association and
    significance)

2
Qualitative Methods, 1: PROTOCOL ANALYSIS
  • or the "Thinking Out Loud" Method
  • A "protocol" is a record of a step-by-step
    procedure. In this method, one records the
    step-by-step procedures of a user "thinking out
    loud" while trying to use a computer system.
  • A qualitative, direct observation method for
    determining usability, and to some extent,
    functionality of software (or other artifacts)

3
Protocol Analysis References
  • Protocol Analysis: Verbal Reports as Data, K.
    Anders Ericsson and Herbert A. Simon, MIT Press,
    1984 (revised 1993).
  • Verbal Protocol Analysis Comes of Age, Robert L.
    Mack, RC 10655 (47834) 8/1/84, Research Report,
    IBM Thomas J. Watson Research Center.
  • Lisa Learning, John M. Carroll and S.A. Mazur,
    Computer (IEEE Computer Society), November 1986
    (a good example of what can be learned using
    protocol analyses).
  • Exploring Exploring a Word Processor, John M.
    Carroll et al., Human-Computer Interaction,
    1985, Volume 1, pages 283-307.
  • Thinking Aloud: Reconciling Theory and Practice,
    Ted Boren and Judith Ramey, IEEE Transactions on
    Professional Communication, 43(3), Sept. 2000,
    pages 261-278.

4
THEORETICAL FOUNDATIONS (Ericsson & Simon)
  • Assumption: Cognitive processes that generate
    verbalizations are a subset of the cognitive
    processes that generate behavior.
  • Objective: To discover the process a person goes
    through in solving a problem.

5
Two dimensions of verbal reports
  • The time between task and report leads to the
    distinction between concurrent and retrospective
    reports.
  • How directly the information can be expressed in
    verbal language determines whether the report is
    a direct cognitive report or an abstraction.

6
Three types of reports
  • While information is attended: Talk Aloud, Think
    Aloud
  • While information is still in short-term memory:
    Concurrent Probing
  • After the completion of the task-directed
    process: Retrospective Probing
  • You want to get as much of the first type as
    possible, but not if it is interrupting them or
    bothering them when they are "thinking hard."

7
Mental levels
  • One: direct verbalization of information in the
    form in which it was heeded.
  • Two: information not heeded in verbal code; it
    has to be translated (e.g., an icon).
  • Three: a filtering process is used.

8
THINKING OUT LOUD INCLUDES RELATING
  • 1) Intentions - goals and future states ("shall,
    will, must, have to")
  • 2) Cognitions - current attention and situation,
    presence and immediacy.
  • 3) Planning - "If x then y." Inferences. Questions
    raised in their minds.
  • 4) Evaluation - "no," "yes," "darn it," "fine,"
    etc.
  • These four are elements of what you want them to
    verbalize.

9
Protocol Analysis
  • USE IN INTERACTIVE SYSTEMS
  • OBJECTIVE: To understand the user's cognitive
    model of the system. To pinpoint differences
    between the system model and the user model. To
    understand the causes of errors, mistakes, and
    misinterpretations.
  • One-to-one observation of the user learning and
    using the system.

10
OBSERVER BEHAVIOR
  • To increase reliability:
  • The observer does not aid the user; the observer
    is only supposed to ask a question if it is not
    clear what the user is doing or why they are
    doing something.
  • Main probe: "Please keep talking."
  • Can record via audio tape, video, or keystroke
    capture; must create a transcript or complete
    record of the behavior.

11
OBSERVER BEHAVIOR
  • Try to get them to explain:
  • 1) What they're trying to do.
  • 2) What confusions or concerns they have.
  • 3) What they expect to happen next.
  • 4) Generally try to increase verbalizations by,
    for example, telling them to be sure not to go to
    another screen until they have talked out loud.

12
TIMES NOT GOOD TO HAVE THEM VERBALIZE
  • 1) Reading text or attempting to understand a
    written problem description.
  • 2) During intense cognitive activity, such as
    reorganizing.
  • 3) During intermediate steps between a
    sub-problem and the solution.

13
Procedures
  • Only one or two functional areas of a system can
    be observed in a single protocol analysis.
  • Devise different tasks and procedures to test the
    different parts of the system, separately.

14
III. Procedures
  • 1. YOU must be thoroughly familiar with the
    system to be observed/tested, and with the
    equipment on which it is to be used.
  • You must know how the task you will design can in
    fact be "correctly" accomplished on the system.
  • You must be able to get the user out of an error
    condition if it occurs.

15
PROCEDURES
  • 2. Develop a fairly simple, well-laid-out task in
    the user's framework that he or she is to
    accomplish, and use that same task on a number of
    users. Create written instructions and materials
    for this task.
  • For example, if testing a word processor, you
    might mark up some text with corrections and have
    them edit the stored text. If testing a database,
    you might give them the requirement to find
    certain items of data.
  • Adapt or develop documentation.

16
3. Observe and Record
  • Do NOT tell them what to do.
  • Your only verbalizations should be probes and
    reminders to think out loud / talk out loud.

17
NOTES ON METHOD
  • Develop and follow a script (inter-rater
    reliability).
  • Participants: must be representative of users;
    works only with real users.
  • Instructions should be simple: "Tell me what you
    are thinking about as you work." "I have no
    stake in the system." "I am only interested in
    your thinking about the task."
  • You have to make clear that you are evaluating
    the system, NOT THEM.

18
Sample transcript excerpts
  • From Carroll and Mack, Learning to use a word
    processor...
  • L = Learner, E = Experimenter (sometimes O for
    Observer)
  • L: Let's see... Could I turn it off and start all
    over? That's what I would do. Will it hurt
    anything?
  • E: You're in control...

19
Sample transcript excerpts
  • [Learner types ABC right after the operator
    name, which causes an error, signaled by a beep,
    a message at top left, and a reset light]
  • L: WHOOPS!!
  • E: What do you think happened there?
  • L: A bell rang. A buzzer or something...
  • [reads instructions] "... enter to finish."
    Okay, let's see what happens. [presses ENTER]

20
Sample transcript excerpts
  • E: You pressed enter?
  • Other probes:
  • "Tell me what you're thinking."
  • "Why do you suppose it did that?"
  • "What do you think happened there?"

21
NOTES on method
  • Try to keep them from blaming themselves.
  • DO demonstrate THINKING OUT LOUD as part of the
    initial introduction script.
  • Use a CONSENT form.

22
NOTES on method
  • The observer must remain an observer, not an
    interviewer. Do not ask about what you want to
    know (you would be biasing their thinking
    process).
  • Ask: "What are you thinking?" "What is that
    telling you?"
  • Avoid: "Why did you do that?" "What do you think
    that means?"
  • Save specific questions you may have about their
    actions for the end of the session.

23
NOTES on method
  • Give help only if there is a real need or a dead
    end. Allow several minutes to pass before
    deciding that it is a dead end. Try to avoid the
    subject going into stress overload.
  • Gather all complaints and try to tie them to the
    system, task, or documentation. Check across
    subjects; tie results to specifics.

24
NOTES on method
  • 4. The entire session should be recorded and then
    reconstructed as a transcript that includes a
    description of:
  • a. exact verbalizations by experimenter and
    subject
  • b. exactly what the user did
  • c. a separate notation or analysis interpreting
    the causes of errors made
  • Columns: SAYS (verbalization) | DOES (action) |
    INTERPRETATION (by you)
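
A minimal sketch (not from the lecture) of one way to represent such a transcript record for later analysis; the field names and example values are illustrative assumptions:

    # Hypothetical record for one transcript segment: what the
    # subject SAYS, what the subject DOES, and the analyst's
    # INTERPRETATION, kept as parallel fields.
    from dataclasses import dataclass

    @dataclass
    class Segment:
        timestamp: str       # e.g., "00:04:12" into the session
        says: str            # exact verbalization (subject or experimenter)
        does: str            # exactly what the user did
        interpretation: str  # analyst's note on the cause of any error

    example = Segment(
        timestamp="00:04:12",
        says="WHOOPS!! A bell rang. A buzzer or something...",
        does="Typed ABC right after the operator name; error beep",
        interpretation="Did not expect input to be rejected here",
    )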

25
AFTER SESSION
  • Administer a set of post-session questions.
  • Usually open-ended; maybe some structured
    questions ("scales").

26
PROCEDURES
  • 5. The transcripts are then analyzed to find
    COMMON, repeated sources of problems. Exact
    quotes and action sequences are given to
    illustrate the problems encountered by more than
    one of the subjects.
  • First, produce the complete record (transcript,
    recording). (This is the "protocol.")
  • Then classify the places and causes of problems.
    (This is the analysis.)

27
Analysis and Report
  • Lewis of IBM: "We go through our notes and
    collect episodes in which users are having
    trouble or registering complaints. We make a
    listing of each such episode, coded by
    participant transcript, and keyed to the original
    transcript for checking.... By grouping the
    episodes, we can now collect all those that seem
    to involve a given aspect of the system design.

28
Analysis and Report, cont
  • For example, we collect problems of terminology,
    cursor control, menu flow, restarting
    exercises... Episodes in these groups are then
    examined to determine what separate problems are
    occurring in each area, and if desired, what
    proportion of participants encountered a given
    problem.
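
As a rough illustration of this grouping step (an assumption about the bookkeeping, not IBM's actual procedure), the sketch below collects coded episodes by design aspect and computes the proportion of participants who encountered each problem area:

    # Hypothetical episode log: (participant_id, design_aspect, note).
    episodes = [
        ("P1", "terminology", "did not understand 'operator name'"),
        ("P2", "terminology", "confused by 'reset'"),
        ("P2", "menu flow", "could not find the save command"),
        ("P3", "terminology", "confused by 'reset'"),
    ]

    participants = {pid for pid, _, _ in episodes}

    # Group episodes by the aspect of the design they involve.
    by_aspect = {}
    for pid, aspect, note in episodes:
        by_aspect.setdefault(aspect, []).append((pid, note))

    # Proportion of participants who encountered each problem area.
    for aspect, eps in by_aspect.items():
        hit = {pid for pid, _ in eps}
        print(aspect, f"{len(hit)}/{len(participants)} participants")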

29
Analysis and Report- Description
  • (Lewis, cont.) - We also attempt to determine WHY
    a given problem is occurring. Very often the
    participants' comments, as well as the form of
    the errors, will suggest something.
  • REPORTING RESULTS: BE SPECIFIC. What the
    designers need is specific suggestions for
    change. What words weren't understood? Which
    message or instruction was misleading?
  • Which screens led people astray?

30
ANALYSIS- Prescription
  • Separately, the analyst MAY suggest changes in
    the interface or documentation or system
    flow/functionality in order to avoid or correct
    this source of problems with the system.
  • Or, the systems analyst may be left to devise and
    construct possible solutions.

31
Procedure After System Revision
  • Then the "revised" system is AGAIN tested, to see
    if the problems which it caused users have been
    addressed, or if there are NEW problems that
    occur as a result of the "fix."

32
DISADVANTAGES AND LIMITATIONS
  • Realism: the presence of an observer will
    interfere with certain types of reaction, such as
    speed. Subjects may be less careless, giving
    better-than-average performance.

33
DISADVANTAGES AND LIMITATIONS
  • Labor: a lot of work to do the observations and
    analyses.
  • Accuracy: people are not aware of all aspects of
    their mental processes. What they say does not
    always correspond to what they do. (However, it
    is better than postmortem interviewing.)
  • Timing: measurement of task time cannot be done;
    subjects take longer when they have to stop to
    think aloud.

34
DISADVANTAGES AND LIMITATIONS
  • Summarization: cannot easily obtain summary
    measures or statistics.
  • Does not produce "benchmark" kinds of
    assessments. With enough subjects, a measuring
    scheme for that particular system and problem can
    be established.

35
Some pitfalls in prior experiments
  • Requiring subjects to tell what they are going to
    do before they do it reduces the number of
    solutions or approaches subjects will take.

36
ADVANTAGES
  • Pinpointing problems: specific problems in a new
    system are easy to identify.
  • Finding why a problem occurs: causes are easy to
    perceive.
  • Catching problems when they occur: people cannot
    remember them later.
  • Detecting problems users would not notice, such
    as a bad sequence for doing something.

37
ADVANTAGES
  • Learning the facts of cognitive life: how users
    approach and think about their task.
  • Studying attitude: users will reveal a lot about
    how they feel about the system. Very important
    for systems where use is discretionary.
  • Using small samples: the best feedback possible
    when only pilot or limited use is available (6-8
    subjects per task).
  • Using mockups: can test a small piece of a
    system, even paper-and-pencil mockups, or a
    Wizard of Oz approach.

38
GO to Protocol Analysis in Practice
  • Ted Boren & Judith Ramey,
    Thinking Aloud: Reconciling Theory and Practice,
    IEEE Transactions on Professional Communication,
    September 2000.

39
(No Transcript)
40
End of Lecture 3, Part 1, Protocol Analysis
41
USABILITY STUDIES: OTHER OBSERVATIONAL METHODS
  • These are qualitative methods which involve
    use/exercise/observation of the system, to try
    to identify usability problems.
  • Protocol analysis is the most systematic and
    valid of these methods, but related methods are
    quicker and cheaper.

42
USABILITY INSPECTION METHODS
  • Overview in Report on a CHI '92 Workshop, by
    Robert Mack and Jakob Nielsen, SIGCHI Bulletin,
    Jan. 1993.
  • More detailed accounts of individual methods in
    the CHI '92 proceedings, pp. 373-404. Note that
    most of those using such methods are industry
    folks with actual products, not "academics"
    concerned with research quality rather than
    "cost-benefit" tradeoffs.

43
THE METHODS
  • A small group of "experts" systematically "walk
    through" the screens and critique them based on
    the general "heuristics" of interactive system
    design, such as those reviewed in the Levi &
    Conrad reading (e.g., assess the consistency of
    command names or menu terms; look for clear
    feedback to the user after every input).

44
EFFECTIVENESS
  • Comparisons to data gathered from actual users
    show that systematic usability inspection by
    experts finds 30% to 80% of the problems that
    would be identified by users through protocol
    analysis or similar methods.
  • "However, inspections can be effective for
    obtaining usability assessments very early in
    design" (before a prototype is available), and
    they are "better than nothing."

45
FINDING USABILITY PROBLEMS THROUGH HEURISTIC
EVALUATION
  • Jakob Nielsen (Bellcore), CHI '92 Proc.
  • In this method, a small set of evaluators examine
    the interface and judge its compliance with
    recognized usability principles (the
    "heuristics").
  • Findings (Nielsen on heuristic evaluation): works
    much better with "usability experts" (people with
    graduate training and experience).

46
FINDING USABILITY PROBLEMS THROUGH HEURISTIC
EVALUATION
  • 3-5 experts should be used, but 2-3 are
    sufficient if they are "double experts."
  • Applying cognitive walkthroughs to more complex
    user interfaces:

47
WHARTON AND BRADFORD (HP LABS) AND FRANZKE (US
WEST), CHI '92 Proc., 381-388.
  • Cognitive walkthrough originally devised for
    simple "walk up and use" interfaces.
  • Method focuses on a user's goals and knowledge
    while performing a task.

48
COGNITIVE WALKTHROUGHS, 2
  • Designed to be used iteratively, early in the
    life cycle, by individuals or groups. A set of
    forms is generated, one for each step in a "use"
    process, which asks questions about that step
    (e.g., "Describe the action that a user should
    take at this point." "Is this action clearly
    identified for the user?"). A sketch of such a
    form appears below.
  • These forms include many terms from cognitive
    science, such as "goal structures, activation of
    goals..."
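
A toy sketch of one way such per-step forms might be generated; the question wording and structure here are assumptions for illustration, not the method's actual forms:

    # Hypothetical per-step walkthrough forms: one form for each
    # step in a "use" process, each asking the same questions.
    QUESTIONS = [
        "Describe the action that a user should take at this point.",
        "Is this action clearly identified for the user?",
        "Will the user connect the action with their current goal?",
    ]

    def make_forms(steps):
        """Return one blank form (question -> answer) per step."""
        return [{"step": s, "answers": {q: None for q in QUESTIONS}}
                for s in steps]

    forms = make_forms(["Open document", "Select text", "Apply format"])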

49
COGNITIVE JOGTHROUGH
  • ROWLEY & RHODES, CHI '92, pp. 389-
  • Whereas a rigorous structure in a "walkthrough"
    forces consideration of all steps involved in a
    task, the jogthrough may skip some.
  • Instead of transcribing records, sessions are
    simply videotaped and tabulated on a computer
    during the session (logging software).

50
A Heuristic Evaluation of a World Wide Web
Prototype
  • ACM Interactions, Jul-Aug 1996
  • M. Levi & F. Conrad, Bureau of Labor Statistics,
    US govt.
  • An example of a usability inspection method

51
Procedure
  • 1. Prepared two documents for subjects: a project
    overview and a summary of usability principles
    (heuristics).
  • 2. Used four user-interface experts from within
    BLS.
  • 3. 45 minutes of prep, including review of the
    two documents.

52
Procedure, cont
  • 4. Evaluators were instructed to browse through
    the Web prototype, identifying potential
    usability problems and tying each problem to the
    heuristic it violated.
  • 5. Each individual session lasted about 1.5
    hours.

53
Procedure, cont
  • 6. The group of 4 met for about 1.5 hours,
    facilitated by the researchers, to construct a
    composite list of violations and the heuristics
    each violated.
  • 7. The experimenters formatted the composite list
    of violations as a rating form and sent it via
    e-mail the next day to each evaluator. Evaluators
    were asked to assign a severity rating to each
    violation on a 5-point scale.

54
The Heuristics
  • 1. Speak the user's language - use words,
    phrases, and concepts familiar to the user.
  • 2. Consistency - indicate similar concepts
    through identical terminology and graphics.
    Adhere to uniform conventions for layout,
    formatting, typefaces, labeling, and so on.
  • 3. Minimize the user's memory load - take
    advantage of recognition rather than recall.
    (Give feedback after any action.)

55
The Heuristics
  • 4. Flexibility and efficiency of use -
    accommodate a range of user sophistication and
    diverse user goals.
  • 5. Aesthetic and minimalist design - create
    visually pleasing displays; eliminate irrelevant
    and distracting things on screen to leave white
    space.
  • 6. Chunking - write material so that documents
    are short and contain exactly one topic. Group
    lists to aid recognition and recall.

56
The Heuristics
  • 7. Progressive levels of detail - organize
    information hierarchically, with more general
    info appearing before more specific details. (Not
    mentioned here: Miller's 7 ± 2.)
  • 8. Navigational feedback - allow the user to
    determine her or his current position in the
    document structure. Make it easy to return to a
    prior state.

57
Severity Rating Scale
  • 0: I don't agree that this is a usability
    problem.
  • 1: Cosmetic problem only; need not be fixed.
  • 2: Minor usability problem; low priority.
  • 3: Major usability problem; important to fix.
  • 4: Usability catastrophe; imperative to fix this
    before the product can be released.
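
To illustrate step 7 above (the emailed rating form), here is a small sketch, with made-up violations and ratings, of how mean severity per violation might be computed on this 0-4 scale:

    # Hypothetical severity ratings: violation -> one 0-4 rating
    # per evaluator, on the scale above.
    ratings = {
        "jargon in table headers": [3, 2, 3, 3],
        "no way back to home page": [4, 3, 4, 4],
        "inconsistent link colors": [1, 2, 1, 0],
    }

    # Rank violations by mean severity so the worst are fixed first.
    for violation, scores in sorted(
            ratings.items(), key=lambda kv: -sum(kv[1]) / len(kv[1])):
        mean = sum(scores) / len(scores)
        print(f"{mean:.2f}  {violation}")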

58
Analysis
  • Experimenters grouped related problems into
    larger classes.
  • Evaluation was fast: the group completed its work
    in less than three days.
  • However, much more time was consumed planning and
    scheduling the sessions, compiling and typing
    lists of problems, and analyzing the final
    results for recommendations.

59
Part 3: Ethnography, Case and Field Studies
  • Ethnography, or participant observation, is one
    data collection method
  • A case study is an intensive study of a specific
    system, organization, community, etc. It usually
    combines several methods of data collection,
    including participant observation and interviews.
  • A field study takes place in a natural
    environment, rather than a laboratory. Case
    studies are one type of field study.

60
Using Ethnography in Contextual Design
  • Simonsen & Kensing, CACM, July 1997
  • Ethnography is the primary method of
    anthropologists, who spend extended time doing
    field work: living in, observing, and recording
    the detailed practices of a society or group
    (participant observation).

61
Using Ethnography in System Design
  • The aim is to develop thorough understanding of
    current work practices as a basis for the design
    of computer support.
  • Premise: people may act much differently than
    they say they do in interviews.

62
Using Ethnography in System Design
  • Ethnographic methods are used in natural settings
    (the field), are holistic (particular behaviors
    must be understood in the context of the total
    culture and social structure), and are
    descriptive of the points of view of the members
    of the group.
  • As applied to CIS, they have been used in
    participatory design, in which users and
    designers engage in mutual learning activities to
    understand users' work and generate shared
    visions for change.

63
Case Study of the (European) Film Board
  • About 800 applications and 100 actual films made
    each year.
  • Editors' activities were observed by following
    them for several days at their office and on site
    visits, meetings, etc. Some were videotaped.

64
Case Study of the (European) Film Board
  • Realized there was a power struggle between the
    production managers and the editors; e.g.,
    production managers want fewer productions with
    more money per film.
  • The final system design was different as a result
    of the ethnographic data than it would have been
    otherwise. (E.g., after following editors all
    over the place, they realized the system had to
    be portable.)

65
Case Studies: Orlikowski, MISQ '93
  • Qualitative research results in descriptive data
    (nominal level at best)
  • Methods include interviews, content analysis of
    documents, participant observation
    (ethnography)
  • A case study is an empirical inquiry that
    investigates a phenomenon within its social and
    organizational context (in real life)

66
Orlikowski's study of CASE tools
  • Uses a grounded theory approach to studying and
    contrasting two organizations:
  • An inductive, iterative theory-discovery
    methodology that allows the researcher to develop
    a theoretical account of the general features of
    a topic while simultaneously grounding the
    account in empirical observations.

67
Site selection in case and field studies-
Orlikowski
  • Two organizations were selected to be quite
    different:
  • One software company that developed its own CASE
    tools (119 interviews)
  • One petrochemical company that purchased them (40
    interviews)
  • Data were collected at both sites through
    semi-structured and unstructured interviewing,
    direct observation, and document review.

68
Theoretical Saturation
  • The iteration between data collection and concept
    development ended when enough categories and
    concepts had been defined to explain what had
    been observed at both sites, and no additional
    data suggested adding any more concepts.

69
Results
  • Descriptions of both individual cases are
    presented first, then comparative analysis tied
    to theory.
  • Differences are attributed to variations in the
    change process, the organizational context, and
    the intentions and actions of key players around
    the adoption and use of the CASE tools.

70
Conclusion
  • CASE implementations involve a process of
    organizational change over time, and not merely
    the installation of a new technology.

71
SEARCHING AND SCANNING- EIS STUDY
  • Vandenbosch & Huff, MISQ, March 1997
  • An example of qualitative methods (protocol
    analysis, direct observation, structured
    interviews) mixed with quantitative methods
    (surveys using scales to measure constructs) in
    field research.
  • A field study employs SYSTEMATIC means for
    obtaining, recording, and analyzing the variables
    of interest, within the organizational context of
    use, but the sample is small and not necessarily
    representative of anything.

72
What is EIS?
  • Executive Information Systems, sometimes called
    Executive Support Systems, are intended to
    support top-level management in making strategic
    decisions. Product names include Commander and
    PowerPlay.
  • One component is a means for summary and analysis
    of the information contained in the lower-level
    organizational information systems.
  • Another necessary function is to provide external
    info about the environment (e.g., changes in laws
    and regulations, competitor info, economic and
    demographic projections, etc.)

73
This study
  • A field study of seven organizations using EIS,
    to explore how they are actually used by
    management and possible reasons for and outcomes
    of different types of use: searching for specific
    information, or scanning (browsing) to see what
    interesting things turn up.
  • Key questions: How and why do executives'
    EIS-based information retrieval behaviors vary,
    and what is the impact of those variations on the
    efficiency and effectiveness of the organization?

74
Research Questions
  • Note that qualitative research starts with
    research questions, but not hypotheses.
  • Qualitative research may generate hypotheses that
    could subsequently be tested with quantitative
    research methods using random samples.

75
SAMPLE
  • 7 organizations in Canada (2 financial, 4 other
    companies, one large govt. agency).
  • From a list of 19 gathered from EIS vendors and
    consultants. A minimum of five members of the
    executive team with access to the EIS had to be
    willing to participate.
  • On average, subjects were one level away from
    the president.

76
Methods
  • Interview guides for the executives and for the
    systems developers. Interviews lasted 45 minutes;
    taped and transcribed for analysis.
  • Post-interview questionnaire for executives to
    measure individual characteristics and perceived
    effectiveness.
  • Direct user observations using "thinking out
    loud" (the protocol analysis procedure), with
    taping and transcribing.
  • Written documentation collected and analyzed.

77
The Pilot Study
  • The initial study site was treated in part as a
    pretest of the research method. This resulted in
    minor changes in the methods and the evolution of
    the theoretical model.
  • 1. When executives were asked about their ability
    to deal with uncertainty (e.g., the impact that
    environmental change would have on their
    organizations), they became either confused or
    agitated. This question was deleted from
    subsequent interview guides.

78
Pilot Study- EIS
  • 2. A new construct was suggested: predisposition
    towards scanning. The interview guide was
    adjusted to begin with a new general question:
    how the interviewee generally used information in
    his or her job.

79
Confidentiality
  • To maintain confidentiality, the companies were
    arbitrarily, but consistently, named A, B, C,
    etc.
  • Note: no individuals were identified in any way;
    they are described in terms such as "a satisfied
    user said..."

80
Analysis Procedures- Four Steps
  • 1. Coding of transcripts separately by two
    analysts; in the few cases of disagreement,
    discussion to agreement. (An agreement check such
    as Cohen's kappa is sketched below.)
  • 2. Interpretation using the theoretical model.
  • 3. Rating of systems by the two
    judges/investigators.
  • 4. Comparisons and statistical summary where
    appropriate. Percentages were reported and even
    some statistical relationships tested in
    contingency tables, but the limitations of the
    small and unrepresentative sample were explicitly
    pointed out at the end.
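
For step 1, a common way to quantify how well two coders' labels agree before discussion is Cohen's kappa. The lecture mentions only discussion to agreement, so this sketch (with invented codes) is an assumption about one way such a check could be run:

    from collections import Counter

    def cohens_kappa(codes_a, codes_b):
        """Chance-corrected agreement between two coders' labels."""
        n = len(codes_a)
        observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
        freq_a, freq_b = Counter(codes_a), Counter(codes_b)
        # Expected agreement if each coder labeled at random using
        # their own marginal label frequencies.
        expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
        return (observed - expected) / (1 - expected)

    # Hypothetical codes assigned by two analysts to the same 8 units.
    a = ["search", "scan", "scan", "search", "scan", "search", "scan", "scan"]
    b = ["search", "scan", "search", "search", "scan", "search", "scan", "scan"]
    print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.75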

81
Exploratory Sequential Data Analysis
  • For Continuous Observational Data (such as
    protocol analysis or semi-structured interviews)
  • Fisher & Sanderson, ACM Interactions, 1996
  • Observational methods allow one to preserve the
    sequence of events, if they are adequately
    supported by logging (detailed notes) and audio
    or audio/visual recording.
  • Analysis techniques for sequential analysis
    include conversational analysis, interaction
    analysis, verbal and nonverbal protocol analysis,
    etc.

82
Coding
  • All analysis techniques include chunking,
    dividing segments of adjacent data into units for
    analysis, and
  • Coding, the placing of category labels on the
    units. (A toy sketch of both steps follows.)
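
A toy sketch of chunking and coding on a sequential record; the chunk boundaries and category labels here are invented for illustration:

    # Chunking: divide an ordered record of events into units.
    # Coding: attach a category label to each unit.
    raw_events = ["opens menu", "pauses", "reads help", "types command",
                  "error beep", "re-reads help", "types command"]

    # Hypothetical chunk boundaries (indices where a new unit starts).
    boundaries = [0, 3, 5]
    chunks = [raw_events[b:e] for b, e in
              zip(boundaries, boundaries[1:] + [len(raw_events)])]

    # Hypothetical coding scheme applied to each unit.
    labels = ["orientation", "first attempt", "recovery"]
    for label, unit in zip(labels, chunks):
        print(label, "->", unit)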

83
Labor intensity of analysis
  • The ratio of analysis time to raw sequence or
    interaction time is reported to vary from 2:1 all
    the way up to 1000:1 (meaning 1000 minutes, or
    roughly 16.7 hours, spent in the analysis steps
    for every minute of the initial raw interaction
    that was recorded).
  • There are now many software tools available to
    aid in the analysis of qualitative sequential
    interaction data (e.g., digitized video makes it
    possible to skip the transcript stage).

84
End of Lecture 3