Title: Several strategies for simple cells to learn orientation and direction selectivity
1 Several strategies for simple cells to learn orientation and direction selectivity
- Michael Eisele, Kenneth D. Miller
- Columbia University
2 Orientation and Direction Selectivity
(figure: ON and OFF subregions of a simple-cell receptive field)
3 orientation-selective?
4 Selected models
- A simple Hebbian learning rule produces OS (Miller 94), but not DS (Wimbauer et al 97), for unstructured input.
- Nonlinear Hebbian learning rules produce DS, but only for structured input (Feidler et al 97, Blais et al 00).
- More general principles (sparse coding, ICA, blind source separation) can explain the occurrence of OS (Olshausen & Field 96; Bell & Sejnowski 97) and DS (van Hateren & Ruderman 98), if applied to input from natural scenes.
5 Some OS and DS develops early (in kittens at the time of eye opening; Albus & Wolf 84). In ferrets at P30-32, correlations decay over a few 100 ms and several mm of cortex (Fiser et al 04).
7 Goal
- Find a rule that robustly produces DS, using only unstructured input.
- Identify the underlying principle.
Blind source separation
8 Blind source separation (BSS)
(diagram: sources → mixing → sensors → unmixing → unmixed sources)
9 Blind source separation of random, spontaneous activity
(diagram: sources → random mixing → sensors → unmixing)
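The mixing/unmixing setup on these slides can be sketched numerically. A minimal NumPy example, assuming two Laplace-distributed (long-tailed) sources and a random, invertible mixing matrix; all variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent sources with long-tailed amplitudes, standing in
# for random, spontaneous activity.
n_samples = 1000
s = rng.laplace(size=(2, n_samples))

# Random mixing: each sensor records a random linear combination of sources.
A = rng.normal(size=(2, 2))
x = A @ s

# Unmixing: if the mixing matrix were known, W = A^(-1) would recover the
# sources exactly; BSS must instead estimate W from the statistics of x alone.
W = np.linalg.inv(A)
y = W @ x

print(np.allclose(y, s))  # True
```

The point of the demo is that the sensors alone carry enough information to undo the mixing; BSS algorithms estimate `W` without ever seeing `A`.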
10 Motivation for blind source mixing (BSM)
11 Combining BSM and Hebbian learning
12 Combined learning rule
Linear Hebbian term plus a cubic term:
Δw = η(xy + ε·x·y³) − λw
where w: weight, Δw: weight change, η: learning rate, x: input, y: output, λ: multiplicative constraint.
ε > 0: blind source separation; ε < 0: blind source mixing.
Based on a bottom-up approach to blind source separation; see Independent Component Analysis, Hyvärinen, Karhunen & Oja 2001.
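One step of this update can be sketched in NumPy. The particular choice of λ below (projecting out growth along w, so the weight norm is preserved to first order) is an illustrative assumption; the slide specifies only that the constraint is multiplicative:

```python
import numpy as np

def combined_step(w, x, y, eta=0.01, eps=-0.25):
    """One update of  Δw = η(xy + ε·x·y³) − λw.

    eps > 0 biases toward blind source separation,
    eps < 0 toward blind source mixing, eps = 0 is plain linear Hebb.
    """
    hebb = eta * (x * y + eps * x * y**3)
    # Multiplicative constraint: choose λ so the update is orthogonal to w
    # (one common choice; an assumption, not taken from the slide).
    lam = np.dot(w, hebb) / np.dot(w, w)
    return w + hebb - lam * w

rng = np.random.default_rng(0)
w = rng.uniform(0.4, 0.6, size=8)    # feedforward weights
x = rng.laplace(size=8)              # long-tailed input amplitudes
y = float(w @ x)                     # linear neuron model from the talk
w_new = combined_step(w, x, y)
```

With this λ, the dot product w·Δw vanishes, so repeated updates change the direction of w without runaway growth along it.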
13 Important factors
- spatial correlations: Mexican hat
- distribution of input amplitudes: long tails
- upper weight limits: none
- temporal input filters: diverse (4-week-old kittens; Cai et al 97)
14 Simplifications
- single-neuron learning
- rate-coded
- only feedforward input
- arbor function
- linear neuron model
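Under these simplifications the model reduces to a linear, rate-coded unit whose feedforward weights are confined by an arbor function. A sketch with a Gaussian arbor; the shape, width, and input layout are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Retinotopic positions of the feedforward inputs (illustrative).
n_inputs = 21
pos = np.linspace(-1.0, 1.0, n_inputs)

# Arbor function: an envelope limiting how strong a connection from each
# input position can be (Gaussian here; the talk does not specify the shape).
arbor = np.exp(-pos**2 / (2 * 0.3**2))

# Initial weights: random, gated by the arbor function.
w = rng.uniform(0.0, 1.0, n_inputs) * arbor

# Linear, rate-coded neuron: the output is a weighted sum of input rates.
x = rng.normal(size=n_inputs)
y = w @ x
```

The arbor keeps connections anatomically local, so learning can only reshape the receptive field within this envelope.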
15 A few analytical results
- Whitened input ⇒ BSM can perfectly mix sources.
- Gradient principle ⇒ convergence.
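"Whitened input" means input whose covariance is the identity. A sketch of ZCA whitening that produces such input from correlated signals; the mixing matrix and sample count are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated inputs: independent signals passed through a fixed mixing matrix.
raw = rng.normal(size=(3, 20_000))
M = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.0, 0.3, 1.0]])
x = M @ raw

# ZCA whitening: rotate into the eigenbasis of the covariance,
# rescale each axis to unit variance, rotate back.
C = np.cov(x)
evals, evecs = np.linalg.eigh(C)
W_white = evecs @ np.diag(evals ** -0.5) @ evecs.T
z = W_white @ x

print(np.allclose(np.cov(z), np.eye(3)))  # True: identity covariance
```

Because the whitening matrix is built from the same sample covariance it is applied to, the covariance of `z` is the identity up to floating-point error.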
16 Dependence on initial conditions
(figure: outcomes for ε = −0.25; rotation, ON ↔ OFF)
17 Robustness against parameter changes
Δw = η(xy + ε·x·y³) − λw
(panels: ε = 0 (Hebb), ε = −0.15, ε = −0.2, ε = −0.5)
OS and DS develop robustly under BSM + Hebb.
18 Limitations
- special initial conditions
- input: drifting gratings
- large negative ε: BSM dominates
- input amplitudes: subgaussian distribution
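The subgaussian vs. long-tailed (supergaussian) distinction can be checked via excess kurtosis; a small sketch, with the distributions chosen purely for illustration:

```python
import numpy as np

def excess_kurtosis(a):
    """Fourth standardized moment minus 3 (0 for a Gaussian)."""
    a = a - a.mean()
    return (a**4).mean() / (a**2).mean()**2 - 3.0

rng = np.random.default_rng(0)
n = 200_000
k_uniform = excess_kurtosis(rng.uniform(-1, 1, n))   # subgaussian:  ≈ -1.2
k_gauss   = excess_kurtosis(rng.normal(size=n))      # Gaussian:     ≈  0
k_laplace = excess_kurtosis(rng.laplace(size=n))     # long-tailed:  ≈ +3
```

Negative excess kurtosis (subgaussian) is the regime where, per the slide, the rule fails; the long-tailed inputs listed under "important factors" have positive excess kurtosis.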
19 Comparison of response distributions
(histogram: number of responses vs. response amplitude)
20 Other strategies
- BSS with structured input
- BSM with subgaussian input
- Hebb with hard upper w-limit
- Hebb with soft upper w-limit
- hybrid with unstructured input
- hybrid with structured input
(hybrid: BSS and Hebb with upper weight limit)
Is there any rule that produces OS and DS for both structured and unstructured input?
21 Linear Hebbian rule + upper weight limit (Miller 94)
22 Conclusions
- Blind source mixing (BSM) is designed to produce an output that responds evenly to many sources.
- BSM and Hebbian learning can be combined into a simple synaptic learning rule.
- This rule robustly produces OS and DS while the input is unstructured.
(diagram: BSS → OS, DS (known); BSM + Hebb → OS, DS (new))
23 Speculation
(diagram: external world → neuron ← internal network)
- Learn correlations that are produced externally: BSS.
- Unlearn correlations that are produced internally: BSM.
Unlearning of higher-order correlations; compare Crick & Mitchison 83, unlearning of any-order correlations.
24 Supported by the Swartz Foundation and the Human Frontiers Science Program.