Several strategies for simple cells to learn orientation and direction selectivity - PowerPoint PPT Presentation

Transcript and Presenter's Notes



1
Several strategies for simple cells to learn
orientation and direction selectivity
  • Michael Eisele, Kenneth D. Miller
  • Columbia University

2
Orientation and Direction Selectivity
OFF
ON
3
orientation-selective?
4
Selected models
  • A simple Hebbian learning rule produces OS (Miller
    94), but not DS (Wimbauer et al 97), for
    unstructured input.
  • Nonlinear Hebbian learning rules produce DS, but
    only for structured input (Feidler et al 97;
    Blais et al 00).
  • More general principles (sparse coding, ICA,
    blind source separation) can explain the occurrence of
    OS (Olshausen & Field 96; Bell & Sejnowski 97)
    and DS (van Hateren & Ruderman 98), if applied to
    input from natural scenes.

5
Some OS and DS develops early (kittens at the time of
eye opening; Albus & Wolf 84).
Ferret P30-32: correlations decay over a few hundred
ms and several mm of cortex (Fiser et al 04)
7
Goal
  • Find rule that robustly produces DS, using only
    unstructured input.
  • Identify underlying principle.

Blind source separation
8
Blind source separation (BSS)
sources
unmixed sources
sensors
mixing
unmixing
9
Blind source separation of random, spontaneous
activity
sources
sensors
random mixing
unmixing
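The pipeline on the two slides above (sources → random mixing → sensors → unmixing) can be sketched numerically. The Laplace-distributed sources and the 2×2 mixing matrix below are illustrative assumptions, not the model's actual input:

```python
import numpy as np

rng = np.random.default_rng(0)

# two independent, non-Gaussian sources (stand-in for random,
# spontaneous activity; the Laplace distribution is an assumption)
s = rng.laplace(size=(2, 1000))

A = rng.normal(size=(2, 2))   # random mixing: each sensor sees a mixture
x = A @ s                     # sensor signals

# if the mixing were known, unmixing would just be its inverse;
# real BSS (e.g. ICA) must estimate W from the sensor signals alone
W = np.linalg.inv(A)
print(np.allclose(W @ x, s))  # True: sources recovered
```

The point of the sketch is the problem structure: BSS has to find W without ever seeing A or s.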
10
Motivation for blind source mixing (BSM)
11
Combining BSM and Hebbian learning
12
Combined learning rule
linear Hebbian
w: weight, Δw: weight change, η: learning
rate, x: input, y: output, λ: multiplicative
constraint
Δw = η(xy + εxy³) - λw
ε > 0: blind source separation; ε < 0: blind source
mixing
based on a bottom-up approach to blind source
separation; see Independent Component Analysis,
Hyvärinen, Karhunen & Oja 2001
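Read as an update rule for a linear neuron, the combined rule on this slide can be sketched as follows. Choosing λ each step so that the summed weight is conserved is one reading of "multiplicative constraint" and is an assumption, as are the initial weights and the Gaussian input:

```python
import numpy as np

def combined_step(w, x, eta=1e-3, eps=-0.2):
    """One step of  Δw = η(x·y + ε·x·y³) - λ·w.

    eps > 0 biases the rule toward blind source separation,
    eps < 0 toward blind source mixing (as on the slide).
    λ implements a multiplicative constraint; here it is chosen
    to conserve the summed weight (an illustrative assumption).
    """
    y = w @ x                        # linear neuron: weighted sum of inputs
    dw = eta * (x * y + eps * x * y**3)
    lam = dw.sum() / w.sum()         # pick λ so that sum(w) stays constant
    return w + dw - lam * w

rng = np.random.default_rng(0)
w = rng.uniform(0.4, 0.6, size=8)    # initial weights (assumption)
total = w.sum()
for _ in range(200):
    w = combined_step(w, rng.normal(size=8))
print(np.isclose(w.sum(), total))    # True: constraint holds
```

With ε = 0 the update reduces to the linear Hebbian term alone, matching the "e 0 (Hebb)" case on the robustness slide.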
13
Important factors
  • spatial correlations: Mexican hat
  • distribution of input amplitudes: long tails
  • upper weight limits: none
  • temporal input filters: diverse

(4-week-old kittens; Cai et al 97)
14
Simplifications
  • single neuron learning
  • rate-coded
  • only feedforward input
  • arbor function
  • linear neuron model

15
  • Whitened input ⇒ BSM can
    perfectly mix sources.
  • Gradient principle ⇒ convergence

A few analytical results
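The "whitened input" premise can be made concrete: whitening transforms the input so its covariance is the identity. A minimal ZCA-whitening sketch (the 4-channel correlated Gaussian input is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(1)

# correlated input channels (illustrative stand-in for afferent activity)
A = rng.normal(size=(4, 4))
X = A @ rng.normal(size=(4, 5000))                # 4 channels, 5000 samples

C = np.cov(X)                                     # input covariance
evals, evecs = np.linalg.eigh(C)
W_zca = evecs @ np.diag(evals ** -0.5) @ evecs.T  # ZCA whitening matrix
Xw = W_zca @ X                                    # whitened input

# covariance of the whitened input is the identity
print(np.allclose(np.cov(Xw), np.eye(4)))         # True
```

ZCA is one of several equivalent-up-to-rotation whitening transforms; any of them gives the identity covariance that the analytical result assumes.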
16
Dependence on initial conditions
ε = -0.25
rotation, ON ↔ OFF
17
Robustness against parameter changes
Δw = η(xy + εxy³) - λw
ε = 0 (Hebb)
ε = -0.15
ε = -0.15
ε = -0.2
ε = -0.5
ε = -0.5
OS and DS develop robustly under BSM + Hebb
18
Limitations
  • special initial conditions
  • input: drifting gratings
  • large negative ε: BSM dominates
  • input amplitudes: subgaussian distribution
19
Comparison of response distributions
number of responses
response amplitude
20
Other strategies
  • BSS with structured input
  • BSM with subgaussian input
  • Hebb with hard upper w-limit
  • Hebb with soft upper w-limit
  • hybrid with unstructured input
  • hybrid with structured input
(hybrid: BSS and Hebb with upper weight limit)
Is there any rule that produces OS and DS for both
structured and unstructured input?
21
Linear Hebbian rule + upper weight limit
Miller 94
22
Conclusions
  • Blind source mixing (BSM) is designed to produce
    an output that responds evenly to many sources.
  • BSM and Hebbian learning can be combined into a
    simple synaptic learning rule.
  • This rule robustly produces OS and DS while the
    input is unstructured.

[Diagram: BSM + Hebb (new) and BSS (known) both lead to OS, DS]
23
Speculation
[Diagram: external world → internal network → neuron]
learn correlations that are produced externally:
BSS
unlearn correlations that are produced
internally: BSM
Unlearning of higher-order correlations. Compare
Crick & Mitchison 83: unlearning of any-order
correlations.
24
Supported by the Swartz Foundation and the Human
Frontier Science Program