Making Decisions: Modeling Perceptual Decisions

PSY 5018H: Math Models of Human Behavior, Prof. Paul Schrater, Spring 2005
1
Making Decisions: Modeling perceptual decisions
2
Outline
  • Graphical models for belief: example completed
  • Discrete, continuous and mixed belief
    distributions
  • Vision as an inference problem
  • Modeling perceptual beliefs: priors and
    likelihoods
  • Modeling perceptual decisions: actions,
    outcomes, and utility spaces for perceptual
    decisions

3
Graphical Models: Modeling complex inferences
Nodes store conditional probability tables.
Arrows represent conditioning (diagram: node A with arrows to nodes B and C).
This model represents the decomposition P(A,B,C) = P(B|A) P(C|A) P(A).
What would the diagram for P(B|A,C) P(A|C) P(C) look like?
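More generally, a directed graphical model of this kind encodes the joint distribution as a product of one conditional per node given its parents:

```latex
P(X_1, \dots, X_n) \;=\; \prod_{i=1}^{n} P\!\left(X_i \mid \mathrm{parents}(X_i)\right)
```

For the diagram above (A → B, A → C), node A has no parents and B, C each have parent A, which gives P(A) P(B|A) P(C|A).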
4
Sprinkler Problem
You see wet grass. Did it rain?
5
Sprinkler Problem (cont'd)
  • 4 variables: C = cloudy, R = rain, S = sprinkler,
    W = wet grass
  • What's the probability of rain? Need P(R|W)
  • Given P(C) = [0.5, 0.5], P(S|C), P(R|C), P(W|R,S)
  • Do the math (a numerical sketch follows below)
  • Get the joint
  • P(R,W,S,C) = P(W|R,S) P(R|C) P(S|C) P(C)
  • Marginalize
  • P(R,W) = Σ_S Σ_C P(R,W,S,C)
  • Divide
  • P(R|W) = P(R,W)/P(W) = P(R,W) / Σ_R P(R,W)
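A minimal numerical sketch of these three steps; the slide only gives P(C) = [0.5, 0.5], so the remaining conditional probability tables below are illustrative values, not values from the lecture:

```python
import itertools

# Illustrative CPTs for the classic sprinkler network; only P(C) = [0.5, 0.5]
# comes from the slide, the other numbers are assumed example values.
P_C = {1: 0.5, 0: 0.5}                                     # P(C)
P_S_given_C = {1: {1: 0.1, 0: 0.9}, 0: {1: 0.5, 0: 0.5}}   # P(S | C): outer key C, inner key S
P_R_given_C = {1: {1: 0.8, 0: 0.2}, 0: {1: 0.2, 0: 0.8}}   # P(R | C)
P_W1_given_RS = {(1, 1): 0.99, (1, 0): 0.9, (0, 1): 0.9, (0, 0): 0.0}  # P(W=1 | R, S)

def joint(r, w, s, c):
    """P(R,W,S,C) = P(W|R,S) P(R|C) P(S|C) P(C)."""
    p_w = P_W1_given_RS[(r, s)] if w == 1 else 1.0 - P_W1_given_RS[(r, s)]
    return p_w * P_R_given_C[c][r] * P_S_given_C[c][s] * P_C[c]

# Marginalize out S and C to get P(R, W=1), then divide by P(W=1).
P_RW1 = {r: sum(joint(r, 1, s, c) for s, c in itertools.product((0, 1), repeat=2))
         for r in (0, 1)}
print("P(rain | wet grass) =", P_RW1[1] / (P_RW1[0] + P_RW1[1]))
```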

6
Continuous data, discrete beliefs
7
Continuous data, Multi-class Beliefs
8
[Figure: graphical model over the variables width, position, z, x, and y]
9
The problem of perception
[Diagram: image measurements (edges, motion, color) feed scene inference, which feeds decisions/actions: classify, identify, learn/update, plan, control action]
10
Vision as Statistical Inference
Rigid Rotation or Shape Change?
What Moves, Square or Light Source?
(Kersten et al., 1996; Weiss & Adelson, 2000)
11
Modeling Perceptual Beliefs
  • Observations O in the form of image measurements:
  • Cone and rod outputs
  • Luminance edges
  • Texture
  • Shading gradients
  • Etc.
  • World states s include:
  • Intrinsic attributes of objects (reflectance,
    shape, material)
  • Relations between objects (position, orientation,
    configuration)

Belief equation for perception (a reconstruction follows below)
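The equation itself appeared as an image on the original slide; the standard Bayesian form of the belief equation for perception is:

```latex
p(s \mid O) \;=\; \frac{p(O \mid s)\, p(s)}{p(O)} \;\propto\; p(O \mid s)\, p(s)
```

That is, the posterior belief over world states s combines the likelihood of the image measurements O with the prior over states.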
12
Image data O
13
Possible world states s consistent with O
Many 3D shapes can map to the same 2D image
14
Forward models for perception: built-in knowledge of image formation
Images are produced by physical processes that
can be modeled via a rendering equation
Modeling rendering probabilistically
Likelihood: p(I | scene)
E.g., for no rendering noise: p(I | scene) = δ(I − f(scene))
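If rendering noise is allowed, a common relaxation of the delta-function likelihood is a Gaussian around the rendered prediction; a minimal sketch, where `render` is a stand-in for the forward model f(scene):

```python
import numpy as np

def rendering_likelihood(image, scene, render, sigma=0.05):
    """p(I | scene) under isotropic Gaussian rendering noise.

    `render` stands in for the forward model f(scene) that produces a
    predicted image; as sigma -> 0 this sharpens toward the slide's
    noiseless case p(I | scene) = delta(I - f(scene)).
    """
    residual = np.asarray(image, dtype=float) - np.asarray(render(scene), dtype=float)
    n = residual.size
    sq_err = float(np.sum(residual ** 2))
    return np.exp(-sq_err / (2.0 * sigma ** 2)) / (2.0 * np.pi * sigma ** 2) ** (n / 2.0)

# Toy usage: a "renderer" that simply returns the scene as the predicted image.
print(rendering_likelihood(image=[0.1, 0.2], scene=[0.1, 0.25], render=lambda s: s))
```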
15
Prior further narrows selection
Select most probable
Priors weight likelihood selections
[Figure: candidate world states s1, s2, s3 weighted by their priors p(s1), p(s2), p(s3); p(s1) is biggest. Adapted from a figure by Sinha & Adelson]
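A minimal sketch of this selection: the posterior is the likelihood weighted by the prior, and the most probable candidate is chosen (all numbers illustrative):

```python
import numpy as np

# Illustrative likelihoods p(I | s_k) and priors p(s_k) over three candidate scenes.
likelihood = np.array([0.30, 0.30, 0.30])   # all three scenes render the image equally well
prior      = np.array([0.60, 0.25, 0.15])   # but s1 is a priori most plausible

posterior = likelihood * prior
posterior /= posterior.sum()
best = int(np.argmax(posterior))            # the prior breaks the tie: scene s1 is selected
print(posterior, "-> choose scene", best + 1)
```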
16
Graphical Models for Perceptual Inference
[Diagram: scene nodes for object description (Q), lighting (L), and viewing (V) variables carry the priors; arrows from them to image measurement nodes O1, O2, …, ON carry the likelihoods]
All of these nodes can be subdivided into individual variables for a more compact representation of the probability distribution.
17
There is an ideal world
  • Matching environment to assumptions
  • Prior on light source direction

18
Prior on light source motion
19
Combining likelihoods over different observations
Multiple data sources, sometimes
contradictory. What data is relevant? How to
combine?
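One standard way to combine them, assuming the observations are conditionally independent given the world state s, is to multiply the likelihoods:

```latex
p(O_1, O_2, \dots, O_n \mid s) \;=\; \prod_{i=1}^{n} p(O_i \mid s)
\quad\Longrightarrow\quad
p(s \mid O_1, \dots, O_n) \;\propto\; p(s) \prod_{i=1}^{n} p(O_i \mid s)
```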
20
Modeling Perceptual Decisions
  • One possible strategy for human perception is to
    try to get the right answer. How do we model
    this with decision theory?
  • Penalize responses with an error: utility
    function
  • Given sensory data O = {o1, o2, …, on} and
    some world property S perception is trying to
    infer, compute the decision

[Equations shown as images on the slide: the value of a response, the best-value (chosen) response, and the posterior they use; a reconstruction follows below]
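The equations themselves were images on the original slide; a standard expected-utility reading of those captions is:

```latex
V(a \mid O) \;=\; \sum_{s} U(s, a)\, P(s \mid O),
\qquad
a^{*} \;=\; \arg\max_{a} V(a \mid O),
\qquad
\text{where } P(s \mid O) \propto P(O \mid s)\, P(s).
```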
21
Rigid Rotation or Shape Change?
Actions: a = 1 choose rigid, a = 0 choose not rigid

Utility U(s, a):
           a = 1    a = 0
  s = 1      1        0
  s = 0      0        1

Rigidity judgment: Let s be a random variable
representing rigidity; s = 1 it is rigid, s = 0
not rigid.
Let O1, O2, …, On be descriptions of the
ellipse for frames 1, 2, …, n. Need
likelihood P(O1, O2, …, On | s). Need
prior P(s).
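A minimal numeric sketch of the resulting decision; the per-frame likelihoods and the prior below are illustrative assumptions, and the frames are assumed conditionally independent given s:

```python
import numpy as np

# Illustrative per-frame likelihoods P(O_i | s) for four observed ellipse frames;
# the actual likelihood model is not specified on the slide.
p_O_given_rigid    = np.array([0.7, 0.6, 0.8, 0.65])   # P(O_i | s = 1)
p_O_given_nonrigid = np.array([0.4, 0.5, 0.3, 0.45])   # P(O_i | s = 0)
prior_rigid = 0.5                                      # assumed prior P(s = 1)

# Frames assumed conditionally independent given s, so the likelihoods multiply.
like_rigid    = p_O_given_rigid.prod()
like_nonrigid = p_O_given_nonrigid.prod()

post_rigid = like_rigid * prior_rigid / (
    like_rigid * prior_rigid + like_nonrigid * (1 - prior_rigid))

# With the 0/1 utility above, the expected utility of a = 1 is P(s = 1 | O)
# and of a = 0 is P(s = 0 | O), so the best response is the MAP choice.
action = 1 if post_rigid > 0.5 else 0
print(f"P(rigid | O) = {post_rigid:.3f} -> choose a = {action}")
```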
22
You must choose, but Choose Wisely
  • Given the previous formulation, can we minimize the
    number of errors we make?
  • Given
  • responses ai, categories si, current category
    s, data o
  • To minimize error
  • Decide ai if P(si | o) > P(sk | o) for
    all k ≠ i
  • P(o | si) P(si) > P(o | sk) P(sk)
  • P(o | si) / P(o | sk) > P(sk) / P(si)
  • P(o | si) / P(o | sk) > T
  • Optimal classifications always involve hard
    boundaries
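A minimal sketch of the two-category version of this rule as a likelihood-ratio test (the numbers in the usage line are illustrative):

```python
def decide(p_o_given_s1, p_o_given_s2, prior_s1=0.5):
    """Two-category minimum-error rule as a likelihood-ratio test.

    Decide category 1 when p(o | s1) / p(o | s2) exceeds the threshold
    T = P(s2) / P(s1); this is equivalent to choosing the larger posterior.
    """
    threshold = (1.0 - prior_s1) / prior_s1
    return 1 if p_o_given_s1 / p_o_given_s2 > threshold else 2

# Usage: equal priors, observation twice as likely under category 1 -> decide 1.
print(decide(p_o_given_s1=0.6, p_o_given_s2=0.3))
```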