1
Backpropagation for Population-Temporal Coded
Spiking Neural Networks
  • July 18 2006 - WCCI/IJCNN 2006
  • Benjamin Schrauwen and Jan Van Campenhout
  • Electronics and Information Systems Department
  • Ghent University
  • www.elis.UGent.be/SNN/

2
Problem
  • Current learning rules for Spiking Neurons have
    convergence problems and only support a single
    coding scheme.

Outline
  • Introduction
  • Population-Temporal Coding
  • Analog Spiking Neuron Approximation
  • ASNA-Prop learning rule
  • Results
  • Conclusions

3
Introduction
  • Spiking Neural Networks
  • Use spikes to communicate
  • Neuron model is approximately a leaky integrator
    with a threshold
  • Computationally more powerful than analog neurons
    (Maass)
  • Intrinsically able to process temporal
    information
  • Biologically more plausible
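
The leaky-integrator-with-threshold neuron model mentioned above can be sketched in a few lines. This is a minimal discrete-time simulation; the time constant, threshold and input values are illustrative, not taken from the paper:

```python
# Minimal discrete-time leaky integrate-and-fire neuron.
# All parameter values are illustrative, not from the paper.
def lif_simulate(input_current, dt=0.001, tau=0.02, threshold=1.0, v_reset=0.0):
    """Return the spike times produced by a list of input-current samples."""
    v = 0.0
    spikes = []
    for step, i_in in enumerate(input_current):
        # Leaky integration: the potential decays towards 0
        # and is driven by the input current.
        v += dt * (-v / tau + i_in)
        if v >= threshold:          # hard threshold: emit a spike ...
            spikes.append(step * dt)
            v = v_reset             # ... and reset the potential
    return spikes
```

With a constant input of 60 the steady-state potential (tau * 60 = 1.2) exceeds the threshold, so the neuron fires periodically; with a weak input it never fires, which is the thresholding behaviour the slide refers to.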

4
Introduction
  • But learning is still a problem. Already
    published:
  • Unsupervised Spike-Timing-Dependent Plasticity
  • Genetic Algorithms, Evolutionary Strategies and
    Simulated Annealing
  • Correlation-based ALOPEX algorithm
  • Supervised, gradient-based SpikeProp learning
    rule:
      • Only supports time-to-first-spike coding
      • Troubled by convergence problems
      • Sensitive to exact weight initialisation
      • Only able to move spikes; has no notion of
        creation and removal of spikes due to
        parameter changes

[Figure: a spike modelled as a pulse of value 1/Δt at time t]
5
Introduction
SpikeProp
Very discontinuous error landscape, resulting in
convergence problems
[Figure: error landscape as a function of the desired spike times]
6
Population-Temporal Coding
  • Many spike coding hypotheses exist: rate,
    population, time-to-first-spike, rank, filter,
    ...
  • This work uses Population-Temporal Coding
  • A combination of both temporal and population
    representation
  • Embodies a large range of different coding
    possibilities

[Figure: population-temporal coding network with spike signals z1, z2, z3, temporal filters h11–h32, weights f11–f32, and outputs o1, o2]
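
One possible reading of the network on this slide can be sketched in code. The interpretation is an assumption about the diagram: each spike signal z_i is convolved with temporal filters h_ij, scaled by weights f_ij, and summed into analog outputs o_j; the exponential filter shape is likewise only an illustrative choice:

```python
import numpy as np

def exp_filter(tau, dt, length):
    """Causal exponential filter kernel (an illustrative choice of h)."""
    t = np.arange(length) * dt
    return np.exp(-t / tau)

def decode(spike_trains, filters, weights):
    """Population-temporal decoding sketch.
    spike_trains: (n_in, T) arrays of 0/1 spike indicators,
    filters: (n_in, n_out, K) temporal kernels h_ij,
    weights: (n_in, n_out) scalar weights f_ij.
    Returns (n_out, T) analog output signals."""
    n_in, T = spike_trains.shape
    _, n_out, _ = filters.shape
    out = np.zeros((n_out, T))
    for i in range(n_in):
        for j in range(n_out):
            # Temporal filtering of the spike train, truncated to T samples.
            filtered = np.convolve(spike_trains[i], filters[i, j])[:T]
            out[j] += weights[i, j] * filtered
    return out
```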
7
Analog Spiking Neuron Approximation
  • To solve the SpikeProp problems due to the hard
    thresholding
  • Approximate SNNs by an analog approximation with
    a 'soft' threshold
  • Allows the calculation of gradients at every time
    step
  • Use this model to construct the learning rule,
    but apply it to the original spiking neuron model
  • The analog approximation does not need to be
    simulated!
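
The soft-threshold idea can be illustrated as follows. The steep sigmoid and its steepness parameter beta are an illustrative choice, not necessarily the approximation used in the paper:

```python
import math

def hard_threshold(v, theta=1.0):
    """Spiking output: 1 if the potential reaches the threshold, else 0.
    Its derivative is zero almost everywhere, so gradients vanish."""
    return 1.0 if v >= theta else 0.0

def soft_threshold(v, theta=1.0, beta=10.0):
    """Analog approximation: a steep sigmoid around the threshold.
    Differentiable everywhere, so a gradient exists at every time
    step; beta controls how closely it matches the hard threshold."""
    return 1.0 / (1.0 + math.exp(-beta * (v - theta)))

def soft_threshold_grad(v, theta=1.0, beta=10.0):
    """Derivative of the soft threshold w.r.t. the potential."""
    s = soft_threshold(v, theta, beta)
    return beta * s * (1.0 - s)
```

Raising beta makes the sigmoid approach the hard threshold, while keeping a nonzero gradient near the threshold that a backpropagation rule can use.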

8
Analog Spiking Neuron Approximation
9
ASNA-Prop learning rule
  • The analog spiking neuron approximation is
    similar to an Output-Feedback, Locally Recurrent,
    Globally Feed-forward neural network
  • Learning rules for OF-LRNNs were presented by
    Campolucci and Piazza
  • Used these ideas, but constructed new rules for
    the ASNA
  • ASNA-Prop is derived from the analog
    approximation but applied to the spiking model!
  • See the publication for the actual math...

10
ASNA-Prop learning rule
  • ASNA-Prop has a notion of spike removal and
    creation
  • Supports multiple coding schemes
  • Smooth convergence
  • Performance greatly improved by using Resilient
    PROPagation (RProp):
      • A heuristic first-order backpropagation
        speedup
      • Was also tried for SpikeProp, but did not
        work due to the discontinuous error landscape
  • Not sensitive to weight initialisation
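
Resilient propagation adapts a per-weight step size from the sign of successive gradients; below is a minimal sketch of one common variant (iRprop-, which is not necessarily the exact variant used here):

```python
def rprop_step(grad, prev_grad, step, eta_plus=1.2, eta_minus=0.5,
               step_max=50.0, step_min=1e-6):
    """One iRprop- update for a single weight.
    Returns (weight_delta, new_step, grad_to_remember)."""
    if grad * prev_grad > 0:        # same sign twice: accelerate
        step = min(step * eta_plus, step_max)
    elif grad * prev_grad < 0:      # sign flip: we overshot, back off
        step = max(step * eta_minus, step_min)
        grad = 0.0                  # skip this update and the next sign test
    # Move against the gradient by the adapted step size;
    # only the sign of the gradient is used, not its magnitude.
    if grad > 0:
        delta = -step
    elif grad < 0:
        delta = step
    else:
        delta = 0.0
    return delta, step, grad
```

Because only the sign of the gradient matters, the step sizes are insensitive to the scale of the error surface, which is one reason such heuristics speed up plain backpropagation.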

11
Results
  • SpikeProp's time-to-first-spike coding only
    allows timeless functions y = f(x); implicit
    temporal processing is not used.
  • PTR coding allows temporal processing: input and
    output are functions of time
  • First test: non-temporal XOR
  • Rate-based input coding (length 0.1 s; a 1 is
    coded as 500 Hz, a 0 by no signal)
  • Output: PTR with one output and Gaussian temporal
    filtering
  • 3-4-1 architecture
  • Converges in 80 epochs, enhanced SpikeProp in 120
    epochs (but the latter has 16 times more
    connections!)
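
The rate-based input coding can be sketched as follows (regular, evenly spaced spikes are an assumption; the slide does not specify whether the trains were regular or Poisson):

```python
def rate_encode(bit, duration=0.1, rate=500.0):
    """Encode a binary input as a list of spike times:
    a 1 becomes a regular spike train at `rate` Hz over
    `duration` seconds, a 0 becomes an empty spike train."""
    if bit == 0:
        return []                       # 0 is coded by no signal
    period = 1.0 / rate                 # inter-spike interval
    n_spikes = int(duration * rate)     # 0.1 s at 500 Hz -> 50 spikes
    return [k * period for k in range(n_spikes)]
```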

12
Results
[Figure: convergence results over 20 runs]
13
Results
  • Temporal XOR
  • 40 XORs per example, length 50 ms

14
Results
  • Transcoding: ISI encoding as input, PTR as output
  • Spiking neurons can speak multiple languages

15
Conclusions
  • A new learning rule that eliminates all the
    problems of SpikeProp
  • Supports a broad range of coding schemes
  • Performance is much better than SpikeProp's
  • Future work:
      • Apply to much larger benchmark problems
      • Implement rules for other coding schemes
        (ttfs, ISI, ...)
      • Train more parameters to improve performance
      • The influence of alpha needs to be further
        investigated
  • www.elis.UGent.be/SNN/