Transcript and Presenter's Notes

Title: Neural Networks Chapter 9


1
Neural Networks Chapter 9
  • Joost N. Kok
  • Universiteit Leiden

2
Unsupervised Competitive Learning
  • Competitive learning
  • Winner-take-all units
  • Cluster/Categorize input data
  • Feature mapping

3
Unsupervised Competitive Learning
4
Unsupervised Competitive Learning
5
Simple Competitive Learning
  • Winner
  • Lateral inhibition

6
Simple Competitive Learning
  • Update weights for winning neuron

7
Simple Competitive Learning
  • Update rule for all neurons
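A minimal sketch of the rule named on slides 6-7, assuming a learning rate eta, distance-based winner selection, and initialization from input samples (none of which are stated on the slides). For all neurons the update reduces to moving only the winner toward the input, Δw_winner = η(ξ − w_winner):

```python
import numpy as np

def competitive_learning(X, n_units, eta=0.1, epochs=20, seed=0):
    """Simple competitive learning: only the winning unit moves toward each input."""
    rng = np.random.default_rng(seed)
    # Initialize weights to samples from the input (one of the dead-unit remedies on slide 9)
    W = X[rng.choice(len(X), size=n_units, replace=False)].astype(float)
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            winner = np.argmin(np.linalg.norm(W - x, axis=1))  # unit closest to the input wins
            W[winner] += eta * (x - W[winner])                 # move only the winner toward the input
    return W
```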

8
Graph Bipartitioning
  • Patterns: edges (dipole stimuli)
  • Two output units

9
Simple Competitive Learning
  • Dead Unit Problem: solutions
  • Initialize weights to samples from the input
  • Leaky learning: also update the weights of the
    losers, but with a smaller learning rate η
    (see the sketch after this list)
  • Arrange neurons in a geometrical way: also update
    the neighbors
  • Turn on input patterns gradually
  • Conscience mechanism
  • Add noise to input patterns
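A minimal sketch of the leaky-learning remedy from the list above; the two learning rates (eta_win, eta_lose) are illustrative assumptions, the only constraint from the slide being that the losers' rate is smaller:

```python
import numpy as np

def leaky_competitive_step(W, x, eta_win=0.1, eta_lose=0.001):
    """One leaky-learning step: the winner moves with the full learning rate,
    but losing units also drift toward the input with a much smaller rate,
    so no unit can stay 'dead' forever."""
    winner = np.argmin(np.linalg.norm(W - x, axis=1))
    for i in range(len(W)):
        rate = eta_win if i == winner else eta_lose
        W[i] += rate * (x - W[i])
    return W
```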

10
Vector Quantization
  • Classes are represented by prototype vectors
  • Voronoi tessellation

11
Learning Vector Quantization
  • Labelled sample data
  • Update rule depends on current classification
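The slide leaves the rule implicit; a common reading is the LVQ1 rule, where the winning prototype is attracted if its class label agrees with the sample and repelled otherwise. A minimal sketch (function name and rate eta are assumptions):

```python
import numpy as np

def lvq1_step(prototypes, proto_labels, x, y, eta=0.05):
    """One LVQ1 step: move the winning prototype toward the sample if the
    labels match, away from it if they do not."""
    winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
    sign = 1.0 if proto_labels[winner] == y else -1.0
    prototypes[winner] += sign * eta * (x - prototypes[winner])
    return prototypes
```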

12
Adaptive Resonance Theory
  • Stability-Plasticity Dilemma
  • Supply of neurons, only use them if needed
  • Notion of sufficiently similar

13
Adaptive Resonance Theory
  • Start with all weights 1
  • Enable all output units
  • Find winner among enabled units
  • Test match
  • Update weights
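As a hedged illustration of the steps listed above, here is an ART-1-style sketch for binary inputs. The vigilance threshold rho, the choice-function constant 0.5, and the AND-based weight update follow the usual ART-1 formulation and are assumptions beyond what the slide states:

```python
import numpy as np

def art1_present(W, x, rho=0.7):
    """Present one binary pattern x (with at least one active bit) to an
    ART-1-like layer.  W: binary weight matrix (output units x inputs),
    initially all ones."""
    enabled = np.ones(len(W), dtype=bool)        # enable all output units
    while enabled.any():
        # Find the winner among the still-enabled units
        scores = (W * x).sum(axis=1) / (0.5 + W.sum(axis=1))
        scores[~enabled] = -np.inf
        winner = int(np.argmax(scores))
        # Test match: is the stored category sufficiently similar to the input?
        if (W[winner] * x).sum() / x.sum() >= rho:
            W[winner] = W[winner] * x            # resonance: update weights (componentwise AND)
            return winner
        enabled[winner] = False                  # mismatch: disable this unit and search again
    return None                                  # no enabled unit matched
```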

14
Feature Mapping
  • Geometrical arrangement of output units
  • Nearby outputs correspond to nearby input
    patterns
  • Feature Map
  • Topology preserving map

15
Self Organizing Map
  • Determine the winner (the neuron whose weight
    vector has the smallest distance to the input
    vector)
  • Move the weight vector w of the winning neuron
    towards the input i

16
Self Organizing Map
  • Impose a topological order onto the competitive
    neurons (e.g., rectangular map)
  • Let neighbors of the winner share the prize
    (The postcode lottery principle)
  • After learning, neurons with similar weights tend
    to cluster on the map
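A minimal sketch combining slides 15-16: find the winner by smallest distance, then let the winner and its map neighbors move toward the input. The Gaussian neighborhood function and its width sigma are assumptions; the slides only say that neighbors "share the prize":

```python
import numpy as np

def som_step(W, grid, x, eta=0.5, sigma=2.0):
    """One SOM update.  W: weight vectors (one per neuron);
    grid: each neuron's (row, col) position on the rectangular map."""
    winner = np.argmin(np.linalg.norm(W - x, axis=1))      # smallest distance to the input
    d_map = np.linalg.norm(grid - grid[winner], axis=1)    # distance on the map, not in input space
    h = np.exp(-d_map**2 / (2 * sigma**2))                 # neighbors share the prize
    W += eta * h[:, None] * (x - W)                        # move winner and neighbors toward the input
    return W
```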

17
Self Organizing Map
18
Self Organizing Map
19
Self Organizing Map
  • Input: uniformly randomly distributed points
  • Output: map of 202 neurons
  • Training
  • Starting with a large learning rate and
    neighborhood size, both are gradually decreased
    to facilitate convergence
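Reusing the som_step sketch from slide 16, the training procedure described here might look as follows. The exponential decay and the initial values are assumptions; the slide only requires that the learning rate and neighborhood size start large and shrink gradually:

```python
import numpy as np

def train_som(X, grid, epochs=50, eta0=0.5, sigma0=5.0, seed=0):
    """Train a SOM: learning rate and neighborhood width both start large
    and are gradually decreased to facilitate convergence."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), size=len(grid), replace=False)].astype(float)
    for epoch in range(epochs):
        decay = np.exp(-3.0 * epoch / epochs)      # smooth shrinking schedule
        eta = eta0 * decay
        sigma = max(sigma0 * decay, 0.5)           # keep a minimal neighborhood
        for x in X[rng.permutation(len(X))]:
            W = som_step(W, grid, x, eta, sigma)
    return W

# Example: map a 10x10 grid of neurons onto uniformly distributed 2-D points
# X = np.random.default_rng(0).random((1000, 2))
# grid = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)
# W = train_som(X, grid)
```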

20
Self Organizing Map
21
Self Organizing Map
22
Self Organizing Map
23
(No Transcript)
24
Self Organizing Map
25
Self Organizing Map
26
Feature Mapping
  • Retinotopic Map
  • Somatosensory Map
  • Tonotopic Map

27
Feature Mapping
28
Feature Mapping
29
Feature Mapping
30
Feature Mapping
31
Kohonen's Algorithm
32
(No Transcript)
33
Travelling Salesman Problem
34
Hybrid Learning Schemes
  • Combine supervised and unsupervised learning
35
Counterpropagation
  • First layer uses standard competitive learning
  • Second (output) layer is trained using delta rule
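A minimal sketch of one counterpropagation training step under the usual forward-only reading: the first layer is updated competitively (unsupervised) and the output weights attached to the winner are trained with the delta rule (supervised). Function name and learning rates are assumptions:

```python
import numpy as np

def counterprop_step(W_hidden, W_out, x, target, eta_in=0.1, eta_out=0.1):
    """One counterpropagation step: competitive first layer, delta-rule output layer."""
    winner = np.argmin(np.linalg.norm(W_hidden - x, axis=1))
    W_hidden[winner] += eta_in * (x - W_hidden[winner])   # standard competitive learning
    y = W_out[:, winner].copy()                           # output when only the winner is active
    W_out[:, winner] += eta_out * (target - y)            # delta rule on the output weights
    return W_hidden, W_out
```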

36
Radial Basis Functions
  • First layer with normalized Gaussian activation
    functions
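A minimal sketch of the forward pass through such a network; the linear output layer and the per-unit width parameter are assumptions beyond what the slide states:

```python
import numpy as np

def rbf_forward(x, centers, widths, W_out):
    """RBF forward pass with a *normalized* Gaussian first layer:
    each Gaussian response is divided by the sum over all units."""
    d2 = np.sum((centers - x) ** 2, axis=1)        # squared distances to the centers
    g = np.exp(-d2 / (2.0 * widths ** 2))          # Gaussian activations
    phi = g / g.sum()                              # normalization across the first layer
    return W_out @ phi                             # linear output layer
```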