Neural Networks Chapter 9 - PowerPoint PPT Presentation

About This Presentation

Title: Neural Networks Chapter 9
Author: Joost N. Kok
Last modified by: mvwezel
Created Date: 1/27/2004
Slides: 39

Transcript and Presenter's Notes

Title: Neural Networks Chapter 9


1
Neural Networks, Chapter 9
  • Universiteit Leiden

2
Unsupervised Competitive Learning
  • Competitive learning
  • Winner-take-all units
  • Cluster/Categorize input data
  • Compression through vector quantization.
  • Feature mapping

3
Unsupervised Competitive Learning
4
Unsupervised Competitive Learning
5
Simple Competitive Learning
  • Winner: the unit with the strongest response
  • In a biological network: lateral inhibition
  • In an ANN: search for the maximum.

6
Simple Competitive Learning
  • Equivalently, if the w's and x's are normalized to
    unit length, the winner is the unit closest to x.

7
Simple Competitive Learning
  • Update weights for the winning neuron only:
    Δw = η (x − w)
  • (The standard competitive learning rule.)
  • Moves w towards x.
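The winner search and the winner-only update on the slides above can be sketched as follows (a hedged sketch: function names, the 2-D toy input, and the learning rate are illustrative, not from the slides):

```python
# Minimal sketch of simple competitive learning.
# The winner is the unit whose weight vector is closest to x;
# only the winner's weights are updated.
import numpy as np

def winner(W, x):
    """Index of the unit whose weight vector is closest to x."""
    return int(np.argmin(np.linalg.norm(W - x, axis=1)))

def update(W, x, eta=0.1):
    """Standard competitive learning rule: delta_w = eta * (x - w),
    applied to the winning unit only. Moves w towards x."""
    i = winner(W, x)
    W[i] += eta * (x - W[i])
    return i

rng = np.random.default_rng(0)
W = rng.random((3, 2))                 # 3 competitive units, 2-D input
x = np.array([1.0, 0.0])
i = winner(W, x)
d_before = np.linalg.norm(W[i] - x)
update(W, x)
d_after = np.linalg.norm(W[i] - x)     # winner is now closer to x
```

With normalized weights and inputs (slide 6), the same winner could instead be found by maximizing the dot product w·x.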

8
Simple Competitive Learning
  • Update rule for all neurons

9
Simple Competitive Learning
  • (Figure 9.2)
  • NB: if weights and inputs are normalized, then
    everything lies on the unit sphere.

10
Simple Competitive Learning
  • Dead Unit Problem, solutions:
  • Initialize weights to samples from the input
  • Leaky learning: also update the weights of the
    losers (but with a smaller learning rate η)
  • Arrange neurons in a geometrical way: also update
    the winner's neighbors
  • Turn on input patterns gradually
  • Conscience mechanism: make it easier for frequent
    losers to win
  • Add noise to input patterns
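Two of the fixes above can be sketched together: leaky learning (losers also move, with a much smaller rate) combined with initializing weights to input samples. Names, data, and rates are my own illustrative choices:

```python
# Hedged sketch of leaky learning: the winner learns at rate eta,
# every loser "leaks" towards x at a much smaller rate, so no unit
# can stay dead forever.
import numpy as np

def leaky_update(W, x, eta=0.1, eta_leak=0.001):
    i = int(np.argmin(np.linalg.norm(W - x, axis=1)))  # winner
    rates = np.full(len(W), eta_leak)                  # losers: tiny step
    rates[i] = eta                                     # winner: full step
    W += rates[:, None] * (x - W)
    return i

X = np.array([[0.0, 0.0], [1.0, 1.0], [0.2, 0.9]])
W = X[:2].copy()            # init weights from input samples (fix above)
i = leaky_update(W, X[2])   # unit 1 wins; unit 0 still drifts slightly
```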

11
Graph Bipartitioning
  • Patterns: edges (dipole stimuli)
  • Edges sharing a node will be close together,
    hence tend to end up in the same cluster.
  • Two output units.

12
Vector Quantization
  • Classes are represented by prototype vectors
  • Used for storage and transmission of speech and
    image data.
  • Voronoi tessellation: each prototype represents
    the region of input space closest to it.
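A compression sketch under these definitions (codebook and data are illustrative): each input vector is replaced by the index of its nearest prototype, i.e. the Voronoi cell it falls in, so only the indices plus the small codebook need to be stored or transmitted.

```python
# Sketch of vector quantization with a fixed codebook of prototypes.
import numpy as np

codebook = np.array([[0.0, 0.0], [1.0, 1.0]])   # prototype vectors

def quantize(X, codebook):
    """Index of the nearest prototype for each row of X."""
    d = np.linalg.norm(X[:, None, :] - codebook[None, :, :], axis=2)
    return np.argmin(d, axis=1)

X = np.array([[0.1, -0.2], [0.9, 1.1]])
codes = quantize(X, codebook)        # transmit these indices instead of X
reconstruction = codebook[codes]     # lossy decode: one prototype per input
```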

13
Vector Quantization
14
Learning Vector Quantization
  • Labelled sample data
  • Multiple prototypes per class
  • The update rule depends on the current
    classification: if the winner's class is
    incorrect, move the prototype away from the
    input vector!
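The rule above (essentially LVQ1) can be sketched as follows; the toy prototypes, labels, and learning rate are illustrative:

```python
# Sketch of an LVQ1 step: find the winning prototype; if its label
# matches the sample's label, move it towards x, otherwise away from x.
import numpy as np

def lvq1_step(P, labels, x, y, eta=0.1):
    i = int(np.argmin(np.linalg.norm(P - x, axis=1)))  # winning prototype
    sign = 1.0 if labels[i] == y else -1.0             # wrong class: repel
    P[i] += sign * eta * (x - P[i])
    return i

P = np.array([[0.0, 0.0], [1.0, 1.0]])   # one prototype per class here
labels = [0, 1]
x, y = np.array([0.1, 0.1]), 1           # winner has the wrong label
i = lvq1_step(P, labels, x, y)           # prototype 0 is pushed away
```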

15
Learning Vector Quantization
16
Feature Mapping
  • Geometrical arrangement of output units
  • Nearby outputs correspond to nearby input
    patterns
  • Feature Map
  • Topology preserving map

17
Self Organizing Map
  • Determine the winner (the neuron of which the
    weight vector has the smallest distance to the
    input vector)
  • Move the weight vector w of the winning neuron
    towards the input i

18
Self Organizing Map
  • Impose a topological order onto the competitive
    neurons (e.g., rectangular map)
  • Let neighbors of the winner share the prize
    (The postcode lottery principle)
  • After learning, neurons with similar weights tend
    to cluster on the map

19
Self Organizing Map
  • Example for two-dimensional input.

20
Self Organizing Map
  • Update neighboring weights.

21
Self Organizing Map
  • Input: uniformly randomly distributed points
  • Output: map of 20² neurons
  • Training: starting with a large learning rate and
    neighborhood size, both are gradually decreased
    to facilitate convergence
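The training loop described above can be sketched like this (a 10×10 map instead of the slide's larger one, and my own decay schedules, purely for illustration):

```python
# Sketch of SOM training on a rectangular map: the winner and its map
# neighbours all move towards the input, weighted by a Gaussian over
# map distance; learning rate and neighbourhood width decay over time.
import numpy as np

rng = np.random.default_rng(1)
side = 10
W = rng.random((side * side, 2))          # weights live in the input space
grid = np.array([(r, c) for r in range(side) for c in range(side)], float)

def som_step(W, x, eta, sigma):
    i = int(np.argmin(np.linalg.norm(W - x, axis=1)))  # winner
    g = np.linalg.norm(grid - grid[i], axis=1)         # distance on the map
    h = np.exp(-(g ** 2) / (2 * sigma ** 2))           # neighbourhood kernel
    W += eta * h[:, None] * (x - W)                    # neighbours share

T = 3000
for t in range(T):
    frac = t / T
    eta = 0.5 * (0.02 / 0.5) ** frac       # decays 0.5 -> 0.02
    sigma = 5.0 * (0.5 / 5.0) ** frac      # decays 5.0 -> 0.5
    som_step(W, rng.random(2), eta, sigma)

# after learning, neighbouring map units have similar weight vectors
adjacent = np.mean([np.linalg.norm(W[r * side + c] - W[r * side + c + 1])
                    for r in range(side) for c in range(side - 1)])
```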

22
Self Organizing Map
23
Self Organizing Map
  • Nonlinear mappings are possible!

24
Self Organizing Map
  • A very nonlinear mapping.

25
(No Transcript)
26
Self Organizing Map
27
Self Organizing Map
28
Feature Mapping
  • Retinotopic Map: the spatial organization of the
    neuronal responses to visual stimuli.
  • Somatosensory Map (the somatosensory system is a
    diverse sensory system comprising the receptors
    and processing centres that produce sensory
    modalities such as touch, temperature,
    proprioception (body position), and nociception
    (pain)).
  • Tonotopic Map (tonotopy, from Greek tonos "tone"
    and topos "place", refers to the spatial
    arrangement of where sounds of different
    frequency are processed in the brain; tones
    close to each other in frequency are represented
    in topologically neighbouring regions of the
    brain).

29
Feature Mapping
30
Feature Mapping
31
Feature Mapping
32
Feature Mapping
33
Kohonen's Algorithm
34
(No Transcript)
35
Travelling Salesman Problem
36
Hybrid Learning Schemes
supervised
unsupervised
37
Counterpropagation
  • First layer uses standard competitive learning
  • Second (output) layer is trained using the delta
    rule

38
Radial Basis Functions
  • First layer with normalized Gaussian activation
    functions
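A sketch of such a first layer (centres would typically come from an unsupervised method like VQ or a SOM; the values here are illustrative):

```python
# Normalized Gaussian (radial basis) activations: each hidden unit
# responds to distance from its centre, and the responses are
# normalized to sum to one across the layer.
import numpy as np

centres = np.array([[0.0, 0.0], [1.0, 1.0]])
sigma = 0.5

def rbf_layer(x, centres, sigma):
    """Gaussian response per centre, normalized to sum to one."""
    g = np.exp(-np.linalg.norm(centres - x, axis=1) ** 2
               / (2 * sigma ** 2))
    return g / g.sum()

a = rbf_layer(np.array([0.1, 0.0]), centres, sigma)
# activations sum to 1; the unit nearest x dominates
```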