1
Engineering Psychology PSY 378F
  • University of Toronto
  • Fall 2002
  • L5 Information Theory

2
Outline
  • Information Theory
  • Reducing uncertainty
  • Defining the theory
  • How much information: three factors
  • Redundancy
  • Information transmission
  • Limitations
  • A design application

3
Four Squares
  • Guess which square I'm thinking of
  • I will give you a Y/N answer (like 20 questions)
  • What's the fewest number of questions?

4
Flipping a Coin
  • Volunteer for coin toss
  • Before the coin tosser announces the result, we are in a position of uncertainty
  • There is a reduction in uncertainty when we find
    out the outcome
  • If I tell you the result is "heads" then I have
    given you some information

5
Rolling the Bones
  • Let's consider that I roll a die, and tell you
    the result
  • There is more information in a statement that the die came up four than that a coin came up tails, because six things can happen with the die but only two with the coin
  • Amount of information increases with the number of things that might have occurred
  • How much information?

6
Rolling the Bones
  • Reconsider the coin: with one coin, two things can occur
  • With two coins, four things can occur
  • With three coins, eight things
  • As the number of coins increases by one, the number of alternatives increases by a power of two
  • This relationship can be expressed with base-2 logarithms (see the sketch below)
  • Since many systems have only two states (e.g., on/off, heads/tails), base 2 is convenient, although any base could be used
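
A minimal sketch in Python of the coins-to-alternatives relationship (the variable names are ours):

```python
# With k coins there are 2**k equally likely head/tail patterns, so the
# information needed to identify one pattern is log2(2**k) = k bits.
import math

for num_coins in range(1, 4):
    alternatives = 2 ** num_coins      # possible head/tail patterns
    bits = math.log2(alternatives)     # information in one outcome
    print(f"{num_coins} coin(s): {alternatives} alternatives = {bits:.0f} bits")
```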

7
Bits
  • Units of information: bits
  • bit = binary digit
  • can be only 1 of 2 possible values (0 or 1, true or false, yes or no)
  • the amount of information in the answer to one two-alternative forced-choice (2AFC) question
  • signal detection represents a 1-bit choice (signal present or absent)
  • Would you like to play 20 Bits?

8
Information Theory
  • What is it?
  • A method for quantifying the flow of information
    across tasks of varying complexity
  • A metric for measuring the information processing
    efficiency of the human operator
  • What is information?
  • Reduction of uncertainty
  • how much more do you know about the state of the
    world after exposure to a piece of information
    than you knew before the exposure?

9
How Much Information?
  • What affects the quantity of information?
  • The number of possible events that could occur, N
  • The likelihood (base probabilities) of those
    events, P
  • The sequential constraints (context) of the
    events
  • contingent probabilities
  • P(i|X), the probability of event i given that X (the context or sequential constraint) has occurred

10
Number of Possibilities
  • The number of possible events, N
  • Assume you need to know a particular piece of
    information (e.g., What is the current weather?)
  • Quantity of that information equals the average
    minimum number of true-false (yes-no) questions
    that would need to be asked
  • the answer to each question equals one bit of
    information, if, a priori, both answers are
    equally likely
  • Total stimulus information (HS) in bits: HS = log2(N) (demonstrated in the sketch below)
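
A small sketch (the halving strategy and names are ours) showing that identifying one of N equally likely items with yes/no questions takes log2(N) questions, matching HS = log2(N):

```python
import math

def questions_needed(n_alternatives: int, target: int) -> int:
    """Count the yes/no questions a halving strategy asks."""
    low, high = 0, n_alternatives - 1
    questions = 0
    while low < high:
        mid = (low + high) // 2
        questions += 1                 # "Is it in the lower half?"
        if target <= mid:
            high = mid
        else:
            low = mid + 1
    return questions

for n in (4, 8, 64):
    q = questions_needed(n, target=n - 1)
    print(f"N = {n}: {q} questions, log2(N) = {math.log2(n):.0f} bits")
```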

11
Event Likelihood
  • Likelihood of events and information, HS
  • rare events convey more information
  • likely events convey less information
  • HS = log2(1/Pi), where Pi is the probability of occurrence of event i (see the sketch below)
  • if N events are equally likely, then Pi = 1/N and HS = log2(N)
  • optimal order of questions is to ask about common events first; otherwise you waste questions
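
A brief sketch of HS = log2(1/Pi); the example probabilities are illustrative:

```python
import math

def info_bits(p: float) -> float:
    """Information conveyed by an event with probability p."""
    return math.log2(1 / p)

print(info_bits(0.5))    # coin flip outcome: 1 bit
print(info_bits(1 / 6))  # one face of a fair die: ~2.58 bits
print(info_bits(0.01))   # a rare event: ~6.64 bits
```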

12
Weighted Average
  • Average information, Have, conveyed by a group of events (warning lights, verbal communications)
  • For N equally probable events: Have = log2(N)
  • For N events with unequal probabilities: Have = Σ Pi log2(1/Pi), a weighted average based on probability
13
Weighted Average
  • Average information, Have, conveyed by a group of events (cont.)
  • When event probabilities are unequal, Have decreases
  • Example: 4 events with equal probabilities (N = 4): Have = log2(4) = 2 bits
  • Example: 4 events with unequal probabilities, P = .5, .25, .125, .125: Have = .5(1) + .25(2) + .125(3) + .125(3) = 1.75 bits (both worked in the sketch below)
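
A short sketch that reproduces both examples with the weighted average Have = Σ Pi log2(1/Pi):

```python
import math

def h_ave(probs):
    """Average information (bits) over events with probabilities probs."""
    return sum(p * math.log2(1 / p) for p in probs)

print(h_ave([0.25] * 4))                 # equal: log2(4) = 2.0 bits
print(h_ave([0.5, 0.25, 0.125, 0.125]))  # unequal: 1.75 bits
```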

14
Context
  • Sequential Constraints and Context
  • Transient expectancies (probabilities)
  • a particular stimulus might be more or less
    informative given the context
  • EXAMPLE: a football touchdown signal on a close play is much more informative than on a clear score
  • formulas based on contingent probabilities, P(i|X) (illustrated in the sketch below)
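
A hedged illustration of the idea; the contingent probabilities below are invented for the football example:

```python
# The same signal conveys log2(1/P(i|X)) bits, where P(i|X)
# depends on the context X.
import math

def info_bits(p: float) -> float:
    return math.log2(1 / p)

p_td_close_play = 0.5    # touchdown uncertain on a close play
p_td_clear_score = 0.99  # touchdown almost certain on a clear score
print(info_bits(p_td_close_play))   # 1.0 bit: very informative
print(info_bits(p_td_clear_score))  # ~0.014 bits: hardly informative
```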

15
Information Theory-Summary
  • Three variables influence amount of information
    in a series of events
  • 1) number of possible events, N
  • 2) probability of each event
  • 3) context

16
Information Redundancy
  • Occurs when events not equally probable or have
    sequential constraints
  • Average amount of information, Have, is reduced
    compared to theoretical maximum based on equal
    probabilities, Hmax
  • Redundancy = percentage loss of information

17
English Language
  • English language is highly redundant because
  • All letters are not equally probable (e vs. x)
  • Sequential constraints ("ed", "th", etc.): certain letters go together
  • R = 1 - (Have/Hmax)
  • Have is approx. 1.5 bits for letters of the alphabet; Hmax = log2(26) = 4.7 bits
  • Hence redundancy is .68, or 68% (computed in the sketch below)
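
A one-screen sketch of the redundancy calculation using the figures quoted above:

```python
import math

h_ave = 1.5               # approx. average information per English letter
h_max = math.log2(26)     # ~4.7 bits if all letters were equally likely
redundancy = 1 - h_ave / h_max
print(f"Redundancy = {redundancy:.0%}")  # ~68%
```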

18
  • Suggests that
  • mny of letrs r not ncesary for cmprehsin

19
Information Transmission
  • Party game
  • "So we beat on, boats against the current, borne back ceaselessly into the past."
  • (F. Scott Fitzgerald, The Great Gatsby)

20
Information Transmission
  • Transmission of information
  • Information theory helps us think of the human operator as an information channel: a conduit for information
  • Secretary taking dictation; soldier interpreting Morse code
  • definitions of information (H) along the channel:
  • HS = stimulus information
  • HR = response information
  • HT = information transmitted by operator
  • HL = information lost by operator
  • HSR = information dispersion
  • Noise = non-relevant information

21
Ideal Channel
  • Ideal human information channel
  • No information lost
  • HS = HT = HR

22
Realistic Channels
  • More realistic human information channels

23
Another View
  • Venn diagram

24
Computing HT
  • For a group of events, HS and HR are determined
    using the equation for Have
  • HT is determined by HT = HS + HR - HSR
  • HSR represents the dispersion of stimulus-response relationships (does a particular stimulus always elicit the same response?)

25
Computing HSR
  • Determining HSR (and hence HT): compute the average of the values within the stimulus-response matrix (see the sketch below)

Example matrices: HSR = log2(4) = 2 bits; HSR = log2(8) = 3 bits
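
A minimal sketch, assuming a stimulus-response count matrix (rows = stimuli, columns = responses); the example matrices are invented:

```python
# H_T = H_S + H_R - H_SR, computed from the row marginals, column
# marginals, and joint cell probabilities of the matrix.
import math

def h_ave(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

def transmitted_info(matrix):
    total = sum(sum(row) for row in matrix)
    joint = [cell / total for row in matrix for cell in row]
    h_s = h_ave(sum(row) / total for row in matrix)        # stimulus marginals
    h_r = h_ave(sum(col) / total for col in zip(*matrix))  # response marginals
    h_sr = h_ave(joint)                                    # joint dispersion
    return h_s + h_r - h_sr

perfect = [[10, 0], [0, 10]]  # each stimulus always gets its own response
noisy   = [[5, 5], [5, 5]]    # responses unrelated to stimuli
print(transmitted_info(perfect))  # 1.0 bit transmitted
print(transmitted_info(noisy))    # 0.0 bits transmitted
```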
26
Transmission Rate
  • Can talk of an information transmission rate (bits/s) if we know how long the subject took
  • E.g., with control design alpha, the operator transmits 3 bits/second; with control design beta, the operator transmits 4 bits/second
  • A measure of bandwidth

27
Limitations
  • Limitations of Information Theory
  • HT reflects only the consistency of mappings between stimuli and responses, not the accuracy or appropriateness of the stimulus-response mappings
  • Example: imagine that an operator always responded "no" when they detected a stimulus
  • HT also does not take into account the size of an error

28
Example Application
  • Raskin, J. (2000). The Humane Interface. Boston: Addison-Wesley.
  • Raskin shows how information theory can be used
    in design to help choose among design
    alternatives
  • Measurement of information efficiency of interface

29
Information Efficiency
  • Information efficiency of an interface, E
  • E = (minimum amount of information necessary to do a task) / (amount of information that has to be specified by the user)
  • Analogous to efficiency as measured in thermodynamics (furnace efficiency)
  • E ranges between 0 and 1

30
Information Efficiency
  • E is 0 when the user is required to provide information that is unnecessary
  • e.g., a dialog box that allows the user only one possible action (pushing OK produces 0 bits)

31
Hal Example
  • Hal works at a computer typing reports
  • Occasionally asked by a researcher to convert a
    temperature reading from Fahrenheit to Celsius
  • Constraints:
  • An average of four typed characters is needed
  • About 25% of temperatures are negative
  • A decimal point occurs in 90% of inputs (assume a decimal point)

32
Hal Example
  • (Calculation slide: derives the minimum information the task requires, about 11.4 bits)
33
Hal Example
  • A character on the keyboard supplies about 5 bits of information
  • Average length of input = 4 keystrokes
  • 4 x 5 = 20 bits
  • But only about 11.4 bits are required to do the task
  • Produces an information efficiency of 57% (11.4/20)

34
Hal Example
  • Consider that the user also needs to specify the direction of conversion and to indicate that they have finished entering values
  • Brings the amount of information that has to be specified by the user to 20 + 5 + 1 = 26 bits
  • Information efficiency is now 11.4/26 = 44%

35
Hal Example
  • Can improve things by eliminating these extra keystrokes using a bifurcated, no-delimiter interface
  • Brings the amount of information that has to be specified by the user back to 20 bits
  • Information efficiency is now back to 11.4/20 = 57% (all three scenarios are worked in the sketch below)
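
A sketch tying the three Hal scenarios together; the 11.4-bit minimum and the keystroke figures come from the slides, while the helper function is ours:

```python
# E = (minimum information needed) / (information the user must specify).
MIN_BITS = 11.4  # minimum information the task actually requires

def efficiency(specified_bits: float) -> float:
    """Information efficiency E of an interface."""
    return MIN_BITS / specified_bits

print(f"Plain entry (4 keystrokes x 5 bits):  {efficiency(20):.0%}")  # ~57%
print(f"With direction + delimiter (+6 bits): {efficiency(26):.0%}")  # ~44%
print(f"Bifurcated, no-delimiter interface:   {efficiency(20):.0%}")  # ~57%
```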

36
Summary
  • Information Theory
  • Reducing uncertainty
  • Defining the theory
  • How much information: three factors
  • Redundancy
  • Information transmission
  • Limitations
  • A design application