1
Engineering Psychology PSY 378S
  • University of Toronto
  • Spring 2006
  • L6 Information Theory

2
Outline
  • Information Theory
  • Reducing uncertainty
  • Defining the theory
  • How much information: three factors
  • Redundancy
  • Information transmission
  • Limitations

3
Four Squares
  • Guess which square I'm thinking of
  • I will give you a Y/N answer (like 20 questions)
  • What's the fewest number of questions?

4
Flipping a Coin
  • Volunteer for coin toss
  • Before the coin tosser announces the result, we are in a position of uncertainty
  • There is a reduction in uncertainty when we find
    out the outcome
  • If I tell you the result is "heads" then I have
    given you some information

5
Rolling the Bones
  • Let's consider that I roll a die, and tell you
    the result
  • There is more information in the statement that a die came up four than in the statement that a coin came up tails
  • Amount of information increases with the number
    of things that might have occurred

6
Flipping the Coin
  • Reconsider the coin: with one coin, two things can occur
  • With two coins, four things; with three coins, eight things
  • As the number of coins increases by one, the number of alternatives increases by...
  • Powers of two!
  • Base 2 logarithms
  • Since many systems have only two states (e.g.,
    on/off, heads/tails), base 2 is convenient,
    although any base could be used

7
Bits
  • Units of information: bits
  • bit = binary digit
  • Can be only 1 of 2 possible values (0 or 1, true
    or false, yes or no)
  • Amount of information in answer to one
    two-alternative-forced-choice (2AFC) question
  • Signal detection represents a 1-bit choice
    (signal present or absent)
  • 20 Questions? 20 bits
  • Can you identify the object I have in mind with
    just 20 bits of information?

8
Information Theory
  • What is it?
  • A method for quantifying the flow of information
    across tasks of varying complexity
  • A metric for measuring the information processing
    efficiency of the human operator
  • What is information?
  • Reduction of uncertainty
  • How much more do you know about the state of the
    world after exposure to a piece of information,
    than you knew beforehand?

9
How Much Information?
  • What affects the quantity of information?
  • The number of possible events that could occur, N
  • The likelihood (base probabilities) of those
    events, P
  • The sequential constraints (context) of the
    events
  • Contingent probabilities
  • P(i|X), the probability of event i given that X (the context or sequential constraint) has occurred

10
Number of Possibilities
  • The number of possible events, N
  • Assume you need to know a particular piece of
    information (e.g., What is the current weather?)
  • Quantity of that information equals the average
    minimum number of true-false (yes-no) questions
    that would need to be asked (e.g., is it
    snowing?)
  • The answer to each question equals one bit of
    information (if, a priori, both answers are
    equally likely)
  • Total stimulus information (HS) in bits
  • HS = log2 N
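Illustrative aside, not part of the original slides: a minimal Python sketch of HS = log2 N, using the four-squares, coin, and die examples from the earlier slides.

import math

def stimulus_information(n_alternatives: int) -> float:
    # Total stimulus information HS = log2(N), in bits
    return math.log2(n_alternatives)

print(stimulus_information(4))  # Four Squares game: 2.0 bits, i.e., two yes/no questions
print(stimulus_information(2))  # coin toss: 1.0 bit
print(stimulus_information(6))  # die roll: ~2.58 bits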

11
Event Likelihood
  • Likelihood of events and information, HS
  • Rare events convey more information (more
    newsworthy!)
  • Likely events convey less information
  • HS = log2(1/Pi), where Pi is the probability of occurrence of event i
  • If N events are equally likely, then Pi = 1/N and
  • HS = log2 N
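Illustrative aside, not part of the original slides: the single-event formula in Python; the example probabilities below are assumptions for illustration.

import math

def event_information(p: float) -> float:
    # Information conveyed by an event of probability p: HS = log2(1/p) bits
    return math.log2(1.0 / p)

print(event_information(0.5))   # likely event (one coin face): 1.0 bit
print(event_information(1/6))   # one face of a fair die: ~2.58 bits
print(event_information(0.01))  # rare, "newsworthy" event: ~6.64 bits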

12
Weighted Average
  • Average information, Have, conveyed by a group of events
  • For N equal-probability events: Have = log2 N
  • For n events with unequal probabilities: Have = Σ Pi log2(1/Pi), a weighted average based on probability
13
Weighted Average
  • Average information, Have, conveyed by a group of events (cont.) (e.g., warning lights for a subway operator)
  • When event probabilities are unequal, Have decreases
  • Example: 4 events with equal probabilities (N = 4): Have = log2 4 = 2 bits
  • Example: 4 events with unequal probabilities, P = .5, .25, .125, .125: Have = 1.75 bits
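Illustrative aside, not part of the original slides: the weighted-average calculation in Python, using the probabilities given on this slide.

import math

def average_information(probs) -> float:
    # Have = sum of Pi * log2(1/Pi): a probability-weighted average, in bits
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(average_information([0.25, 0.25, 0.25, 0.25]))   # equal probabilities: 2.0 bits
print(average_information([0.5, 0.25, 0.125, 0.125]))  # unequal probabilities: 1.75 bits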

14
Context
  • Sequential Constraints and Context
  • Transient expectancies (probabilities)
  • Particular stimulus might be more or less
    informative given the context
  • Football: a touchdown signal on a close play is much more informative than on a clear score
  • Weather: It is June. There is snow outside.
  • Formulas based on contingent probabilities, P(i|X)

15
Information Theory-Summary
  • Three variables influence amount of information
    in a series of events
  • 1) Number of possible events, N
  • 2) Probability of each event
  • 3) Context

16
Information Redundancy
  • Occurs when events are not equally probable or have sequential constraints
  • Average amount of information, Have, is reduced compared to the theoretical maximum based on equal probabilities, Hmax
  • Redundancy = percentage loss of information

17
English Language
  • English language is highly redundant because:
  • All letters are not equally probable (e vs. x)
  • Sequential constraints (ed, th, etc.)--certain letters go together
  • R = 1 - (Have/Hmax)
  • Have is approx. 1.5 bits for letters of the alphabet in English text; Hmax = log2(26) = 4.7 bits
  • Hence redundancy is .68, or 68%
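Illustrative aside, not part of the original slides: the redundancy calculation in Python, using the approximate values quoted above.

import math

def redundancy(h_ave: float, h_max: float) -> float:
    # R = 1 - (Have / Hmax): the proportional loss of information
    return 1.0 - (h_ave / h_max)

h_max = math.log2(26)  # ~4.7 bits if all 26 letters were equally likely
h_ave = 1.5            # approximate value for English text (from the slide)
print(redundancy(h_ave, h_max))  # ~0.68, i.e., English is about 68% redundant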

18
  • Suggests that
  • mny of letrs r not ncesary for cmprehsin

19
Information Transmission
  • Party game

"The best lack all conviction, while the worst are full of passionate intensity" (William Butler Yeats, "The Second Coming")
20
Information Transmission
  • Information theory helps us think of the human operator as an information channel, a conduit for information
  • A secretary taking dictation, a soldier interpreting Morse code, or you taking a garbled phone message
  • Various types of information (H) along the
    channel

21
Ideal Channel
  • Ideal human information channel

(Diagram: stimulus information HS flows through the operator to response information HR; HT is the information transmitted)
  • No information lost
  • HS = HT = HR
  • HS = stimulus information
  • HR = response information
  • HT = information transmitted by operator

22
Realistic Channels
HL = information lost; Noise = non-relevant information
  • More realistic human information channels
(Diagram 1: noise enters the channel and some stimulus information is lost, so HT < HS)
(Diagram 2: all stimulus information is lost and the response reflects only noise, so HT = 0)
23
Computing HT
(Channel diagram with noise and information loss, HT < HS, as on the previous slide)
  • For a group of events, HS and HR are determined using the equation for Have
  • HT is determined by HT = HS + HR - HSR
  • Information dispersion: HSR represents the dispersion of stimulus-response relationships (does a particular stimulus always elicit the same response?)

24
Venn Diagram
25
Computing HSR
  • Determining HSR (and hence, HT): compute the average of the values within the matrix

HSR = log2(1/.25) = 2 bits; HSR = log2(1/.125) = 3 bits
26
Computing HT
HSR = log2(4) = 2 bits; HT = HS + HR - HSR = 2 + 2 - 2 = 2 bits
HSR = log2(8) = 3 bits; HT = HS + HR - HSR = 2 + 2 - 3 = 1 bit
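Illustrative aside, not part of the original slides: computing HT from a joint stimulus-response probability matrix in Python. The two 4x4 matrices below are assumed stand-ins for the matrices pictured on slides 25-26 (a perfectly consistent mapping and a partly dispersed one), not the original figures.

import math

def entropy(probs) -> float:
    # H = sum of p * log2(1/p) over nonzero probabilities, in bits
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def transmitted_information(joint) -> float:
    # HT = HS + HR - HSR, from a joint stimulus-response probability matrix
    h_s = entropy(sum(row) for row in joint)         # stimulus (row) marginals
    h_r = entropy(sum(col) for col in zip(*joint))   # response (column) marginals
    h_sr = entropy(p for row in joint for p in row)  # joint (dispersion) entropy
    return h_s + h_r - h_sr

# Perfect mapping: 4 occupied cells of .25 -> HSR = 2 bits, HT = 2 + 2 - 2 = 2 bits
perfect = [[0.25, 0, 0, 0],
           [0, 0.25, 0, 0],
           [0, 0, 0.25, 0],
           [0, 0, 0, 0.25]]

# Dispersed mapping: 8 occupied cells of .125 -> HSR = 3 bits, HT = 2 + 2 - 3 = 1 bit
dispersed = [[0.125, 0.125, 0, 0],
             [0, 0.125, 0.125, 0],
             [0, 0, 0.125, 0.125],
             [0.125, 0, 0, 0.125]]

print(transmitted_information(perfect))    # 2.0
print(transmitted_information(dispersed))  # 1.0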
27
Transmission Rate
  • Can talk of an information transmission rate (bits/s) if we know how long the subject took
  • e.g., with cellphone keypad alpha, an operator transmits 3 bits/second for text messaging; with keypad beta, an operator transmits 4 bits/second
  • Measure of bandwidth
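Illustrative aside, not part of the original slides: the rate calculation itself; the counts and timing below are made up for illustration.

bits_transmitted = 45.0   # e.g., total HT summed over all responses in a trial
elapsed_seconds = 15.0
print(bits_transmitted / elapsed_seconds, "bits/second")  # 3.0, the kind of bandwidth figure quoted on this slide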

28
Limitations
  • Limitations of Information Theory
  • HT reflects only the consistency of mappings between stimuli and responses, not the accuracy or appropriateness of the stimulus-response mappings
  • Example: imagine that an operator always responded "no" when they detected a target (and always responded "yes" when they did not)
  • In practice, not a huge limitation
  • HT also does not take into account the size or
    magnitude of an error

29
Summary
  • Information Theory
  • Reducing uncertainty
  • Defining the theory
  • How much information: three factors
  • Redundancy
  • Information transmission
  • Limitations