Neural Networks Architecture

Transcript and Presenter's Notes



1
Neural Networks Architecture
  • Baktash Babadi
  • IPM, SCS
  • Fall 2004

2
The Neuron Model
3
Architectures (1)
  • Feed Forward Networks
  • The neurons are arranged in separate layers
  • There is no connection between the neurons in the
    same layer
  • The neurons in one layer receive inputs from the
    previous layer
  • The neurons in one layer deliver their outputs
    to the next layer
  • The connections are unidirectional
  • (Hierarchical)
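The layered, one-way flow described above can be sketched in a few lines of NumPy. The layer sizes and the threshold activation are illustrative assumptions, not something given on the slide:

```python
import numpy as np

def step(x):
    # Threshold activation: 1 where the input is positive, else 0
    return (x > 0).astype(int)

# A 4 -> 3 -> 2 feed-forward network: one weight matrix per connection
# block between consecutive layers; sizes are arbitrary, for illustration
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 4))   # input layer  -> hidden layer
W2 = rng.standard_normal((2, 3))   # hidden layer -> output layer

x = np.array([1, 0, 1, 1])         # input layer activity
h = step(W1 @ x)                   # hidden layer: inputs only from the previous layer
y = step(W2 @ h)                   # output layer: inputs only from the hidden layer
```

Each layer reads only from the previous layer and writes only to the next, and no weights connect neurons within a layer, which is exactly the unidirectional, hierarchical structure listed above.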

4
Architectures (2)
  • Recurrent Networks
  • Some connections feed back from a layer to the
    previous layers

5
Architectures (3)
  • Associative networks
  • There is no hierarchical arrangement
  • The connections can be bidirectional

6
Why Feed Forward?
Why Recurrent/Associative?
7
An Example of Associative Networks: The Hopfield
Network
  • John Hopfield (1982)
  • Associative Memory via artificial neural networks
  • Solution for optimization problems
  • Statistical mechanics

8
Neurons in Hopfield Network
  • The neurons are binary units
  • They are either active (1) or passive (0)
  • Alternatively, the states can be taken as 1 and
    -1
  • The network contains N neurons
  • The state of the network is described as a vector
    of 0s and 1s

9
The architecture of Hopfield Network
  • The network is fully interconnected
  • All the neurons are connected to each other
  • The connections are bidirectional and symmetric
  • The setting of weights depends on the application

10
Updating the Hopfield Network
  • The state of the network changes at each time
    step. There are four updating modes:
  • Serial-Random
  • The state of a randomly chosen single neuron will
    be updated at each time step
  • Serial-Sequential
  • The state of a single neuron will be updated at
    each time step, in a fixed sequence
  • Parallel-Synchronous
  • All the neurons will be updated at each time step
    synchronously
  • Parallel-Asynchronous
  • All neurons that are not refractory will be
    updated at the same time

11
The Updating Rule (1)
  • Here we assume that updating is Serial-Random
  • Updating will be continued until a stable state
    is reached.
  • Each neuron receives a weighted sum of the inputs
    from other neurons
  • If the input is positive, the state of the
    neuron will be 1; otherwise it will be 0
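Under these conventions (0/1 states, serial-random order), the updating procedure can be sketched as follows; the weight values are made up for illustration:

```python
import numpy as np

def serial_random_update(W, s, steps=100, rng=None):
    """Serial-random updating: at each time step one randomly chosen
    neuron recomputes its state from the weighted sum of the others."""
    rng = rng or np.random.default_rng(0)
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))             # randomly chosen single neuron
        weighted_sum = W[i] @ s              # inputs from the other neurons
        s[i] = 1 if weighted_sum > 0 else 0  # the threshold rule above
    return s

# Small symmetric weight matrix with no self-connections (illustrative)
W = np.array([[ 0.0, 1.0, -1.0],
              [ 1.0, 0.0,  1.0],
              [-1.0, 1.0,  0.0]])
s0 = np.array([1, 0, 1])
s_final = serial_random_update(W, s0)
```

With symmetric weights this repeated updating settles into a stable state, which is exactly the convergence question taken up on the next slides.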

12
The Updating Rule (2)
13
Convergence of the Hopfield Network (1)
  • Does the network eventually reach a stable state
    (convergence)?
  • To evaluate this, an energy value is associated
    with the network
  • The system has converged when the energy is
    minimized

14
Convergence of the Hopfield Network (2)
  • Why energy?
  • An analogy with spin-glass models of
    ferromagnetism (Ising model)
  • The system is stable if the energy is minimized
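The slides leave the formula to a figure; in the standard Hopfield formulation (writing s_i for the state of neuron i, w_ij for the symmetric weights, and theta_i for an optional threshold, all assumed here rather than read off the slide) the associated energy is

```latex
E = -\frac{1}{2}\sum_{i \neq j} w_{ij}\, s_i\, s_j + \sum_i \theta_i\, s_i
```

which is the same quadratic form as the Ising-model energy, with the weights playing the role of spin couplings.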

15
Convergence of the Hopfield Network (3)
  • Why convergence?

16
Convergence of the Hopfield Network (4)
  • The changes of E with updating

In each case the energy decreases or remains
constant; thus the system tends to stabilize.
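That claim can be checked directly. When a single neuron i changes from s_i to s_i' under the rule above, only terms involving s_i change, and with symmetric weights (w_ij = w_ji) the energy change reduces to

```latex
\Delta E = -\,(s_i' - s_i)\, h_i,
\qquad h_i = \sum_{j \neq i} w_{ij}\, s_j - \theta_i
```

Since the rule sets s_i' = 1 exactly when h_i > 0, the change s_i' - s_i never has the opposite sign to h_i, so Delta E <= 0 at every serial update; because there are only finitely many states, the network must eventually settle.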
17
The Energy Function
  • The energy function is like a terrain over the
    N-dimensional state space

(Figure: energy landscape with two local minima and
a global minimum)
18
Hopfield network as a model for associative memory
  • Associative memory
  • Associates different features with each other
  • Karen ↔ green
  • George ↔ red
  • Paul ↔ blue
  • Recall with partial cues

19
Neural Network Model of associative memory
  • The neurons are arranged in a grid

20
Setting the weights
  • Each pattern can be denoted by a vector of -1s or
    1s
  • If the number of patterns is m, the weights are
    built from all m patterns
  • Hebbian Learning
  • Neurons that fire together, wire together
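For the setup on this slide (patterns as vectors of -1s and 1s), the usual Hebbian prescription is w_ij = (1/N) * sum over the m patterns of x_i * x_j, with no self-connections. A minimal sketch, with patterns and cue invented for illustration:

```python
import numpy as np

def hebbian_weights(patterns):
    # w_ij = (1/N) * sum over patterns of x_i * x_j, zero diagonal
    X = np.asarray(patterns, dtype=float)  # shape (m, N), entries -1 or +1
    _, N = X.shape
    W = (X.T @ X) / N
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, cue, steps=200, rng=None):
    # Serial-random updating with the sign rule (states are -1/+1 here)
    rng = rng or np.random.default_rng(1)
    s = cue.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

patterns = np.array([[ 1,  1, -1, -1,  1, -1],
                     [-1,  1,  1, -1, -1,  1]])
W = hebbian_weights(patterns)   # "fire together, wire together"

cue = patterns[0].copy()
cue[0] = -cue[0]                # partial cue: one corrupted bit
recovered = recall(W, cue)      # should settle back to patterns[0]
```

Starting from the corrupted cue, the updates pull the state back to the nearest stored pattern, which is the recall-with-partial-cues behaviour described on slide 18.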

21
Limitations of Hopfield associative memory
  • 1) The evoked pattern is sometimes not the most
    similar stored pattern to the input
  • 2) Some patterns will be recalled more easily
    than others
  • 3) Spurious states: stable states that are not
    among the original patterns
  • Capacity: about 0.15 N patterns

22
Hopfield network and the brain (1)
  • In a real neuron, synapses are distributed
    along the dendritic tree, and their distance
    changes the effective synaptic weight
  • In the Hopfield network there is no dendritic
    geometry
  • If the synapses are distributed uniformly, the
    geometry is not important

23
Hopfield network and the brain (2)
  • In the brain, the Dale principle holds and the
    connections are not symmetric
  • A Hopfield network with asymmetric weights and
    the Dale principle still works properly

24
Hopfield network and the brain (3)
  • The brain is insensitive to noise and local
    lesions
  • The Hopfield network can tolerate noise in the
    input and partial loss of synapses

25
Hopfield network and the brain (4)
  • In the brain, neurons are not binary devices;
    they generate continuous firing rates
  • A Hopfield network with a sigmoid transfer
    function is even more powerful than the binary
    version

26
Hopfield network and the brain (5)
  • In the brain most neurons are silent or fire at
    low rates, but in the Hopfield network many
    neurons are active
  • In a sparse Hopfield network the capacity is
    even higher

27
Hopfield network and the brain (6)
  • In the Hopfield network updating is serial,
    which is far from biological reality
  • With parallel updating, the associative memories
    can still be recalled

28
Hopfield network and the brain (7)
  • When the Hopfield network is overloaded with
    learned patterns, performance falls abruptly for
    all the stored patterns
  • But in the real brain an overload of memories
    affects only some memories; the rest remain
    intact
  • Catastrophic interference

29
Hopfield network and the brain (8)
  • In the Hopfield network the useful information
    appears only when the system is in a stable
    state
  • The brain does not fall into stable states; it
    remains dynamic

30
Hopfield network and the brain (9)
  • The connectivity in the brain is much sparser
    than in the Hopfield network
  • A diluted Hopfield network still works well