Title: 2L490 Introduction
1 Neural Networks
- Course: 2L490
- Lecturer: Rudolf Mak
- E-mail: r.h.mak_at_tue.nl
- Course notes: H.M.M. ten Eikelder, Neural Networks
- Webpage: Neurale Netwerken (2L490)
2 Today's topics
- BNNs versus ANNs
- computing power and future development
- BNNs
- quick overview
- ANNs
- correspondence (with BNNs)
- neuron model
- learning paradigms
- models
- applications
3 Neural Computing
- Neuroscience
- The objective is to understand the human brain
- Biologically realistic models of neurons
- Biologically realistic connection topologies
- Neural networks
- The objective is to develop computational methods
- Highly simplified artificial neurons
- Connection topologies aimed at computational effectiveness
4 Man versus Machine (hardware)
| Numbers                 | Human brain             | Von Neumann computer (A.D. 2005) |
| elements                | 10^10 - 10^12 neurons   | 10^7 - 10^8 transistors          |
| connections / element   | 10^4                    | 10                               |
| switching frequency     | 10^3 Hz                 | 10^9 - 10^10 Hz                  |
| energy / operation      | 10^-16 Joule            | 10^-6 Joule                      |
| power consumption       | 10 Watt                 | 100 - 500 Watt                   |
| reliability of elements | low                     | reasonable                       |
| reliability of system   | high                    | reasonable                       |
5 Man versus Machine (information processing)
| Features            | Human brain | Von Neumann computer |
| Data representation | analog      | digital              |
| Memory localization | distributed | localized            |
| Control             | distributed | localized            |
| Processing          | parallel    | sequential           |
| Skill acquisition   | learning    | programming          |
6 Brain versus computer
- The following two slides have been taken from a paper by Hans Moravec:
- "When will computer hardware match the human brain?"
7 [Chart from Moravec's paper; no transcript]
8 [Chart from Moravec's paper; no transcript]
9 Types of neurons (Kandel et al.)
- Sensory neurons
- Carry information for the purpose of perception and motor coordination
- Motor neurons
- Carry commands to control muscles and glands
- Interneurons
- Relay or projection
- Long-distance signaling
- Local
- Information processing
10 Biological Neuron
- A neuron has four main regions
- Cell body (soma)
- Dendrites
- Axon
- Presynaptic terminal
- excitatory
- inhibitory
11 Signaling
- All nerve cells signal in the same way, through a combination of electrical and chemical processes
- Input component produces graded local signals
- Trigger component initiates action potential
- Conductile component propagates action potential
- Output component releases neurotransmitter
- All signaling is unidirectional
12 Spike (width 0.2 - 5 ms)
13 Pulse Trains
14 Some animations
For this topic we visit the website Neurobiology Home Page (Blackwell Science). Subtopics:
- Channel gating during the action potential
- Propagation of the action potential
- Neurotransmitter action
15 Summary of Neuron Firing Behavior
- The behavior is binary: a neuron either fires or it does not
- A neuron doesn't fire if the accumulated activity stays below threshold
- If the activity is above threshold, a neuron fires (produces a spike)
- The firing frequency increases with accumulated activity until the maximum firing frequency is reached
- The firing frequency is limited by the refractory period of about 1-10 ms
16 Organization of the Brain
Levels of organization, from coarse to fine (taken from The Computational Brain by Churchland and Sejnowski):
- Central nervous system
- Interregional circuits
- Local circuits
- Neurons
- Dendritic trees
- Neural microcircuits
- Synapses
- Molecules
17 Neural Network
18 ANNs as a Computational Model
We can distinguish between sequential and parallel models.
- Sequential
- Recursive functions (Church)
- Turing machine (Turing)
- Random Access Machine (Von Neumann)
- Parallel
- P(arallel)RAM
- Cellular automata (Von Neumann)
- Artificial Neural Nets (McCulloch/Pitts, Wiener)
19 Advantages of ANNs
- Efficient
- Inherently massively parallel
- Robust
- Can deal with incomplete and/or noisy data
- Fault-tolerant
- Still works when part of the net fails
- User-friendly
- Learning instead of programming
20 Disadvantages of ANNs
- Difficult to design
- There are no clear design rules for arbitrary applications
- Hard or impossible to train
- Difficult to assess internal operation
- It is difficult to find out whether, and if so which, tasks are performed by different parts of the net
- Unpredictable
- It is difficult to estimate future network performance based on current (or training) behavior
21 BNN-ANN Correspondence
- Nodes stand for the neuron body
- Linear combiners model accumulation of synaptic stimuli
- Nonlinear activation functions model firing behavior
- Connections stand for the dendrites and axons
- Synapses are modeled by attaching weights to the connections
- Positive weights for excitatory synapses
- Negative weights for inhibitory synapses
22 Artificial Neuron
[Figure: inputs feed a linear combiner, whose output passes through a transfer function]
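To make this model concrete, here is a minimal sketch in Python with NumPy. The function name, example weights, and the choice of a Heaviside transfer function are illustrative assumptions, not part of the slides.

```python
import numpy as np

def artificial_neuron(x, w, bias, transfer):
    """One artificial neuron: a linear combiner followed by a transfer function."""
    z = np.dot(w, x) + bias   # linear combiner: weighted sum of inputs plus bias
    return transfer(z)        # transfer (activation) function models firing behavior

# Example with the Heaviside step transfer from the next slide (threshold c = 0)
heaviside = lambda z: 1 if z > 0 else 0
y = artificial_neuron(x=np.array([1.0, 0.5]),
                      w=np.array([0.8, -0.4]),
                      bias=-0.1,
                      transfer=heaviside)
print(y)  # 1, since 0.8*1.0 - 0.4*0.5 - 0.1 = 0.5 > 0
```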
23 Discrete asymmetric transfer
Heaviside step function: f(c, x) = (x > c ? 1 : 0)
Transfer functions are also called activation or squashing functions.
24 Discrete symmetric transfer
Sign function: f(x) = (x > 0 ? 1 : -1)
Used with bipolar state encoding.
25 Continuous asymmetric transfer
f(z) = 1 / (1 + e^(-cz))
sigmoid function / logistic function
26 Continuous symmetric transfer
f(z) = (e^(cz) - e^(-cz)) / (e^(cz) + e^(-cz))
hyperbolic tangent (tanh)
27 Piecewise-Linear Transfer
f(c, z) = (z < -c ? -1 : (z > c ? 1 : z / c))
28 Local transfer function
- f(z) = N(0, 1) density: (1 / sqrt(2π)) exp(-z^2 / 2)
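The transfer functions of slides 23-28 are compact enough to collect in one hedged Python/NumPy sketch; the function names and default parameter values are assumptions for illustration.

```python
import numpy as np

def heaviside(x, c=0.0):                 # slide 23: discrete asymmetric
    return np.where(x > c, 1.0, 0.0)

def sign(x):                             # slide 24: discrete symmetric (bipolar)
    return np.where(x > 0, 1.0, -1.0)

def sigmoid(z, c=1.0):                   # slide 25: continuous asymmetric (logistic)
    return 1.0 / (1.0 + np.exp(-c * z))

def tanh_transfer(z, c=1.0):             # slide 26: continuous symmetric
    return np.tanh(c * z)                # equals (e^cz - e^-cz) / (e^cz + e^-cz)

def piecewise_linear(z, c=1.0):          # slide 27: linear in [-c, c], saturates at -1 and 1
    return np.clip(z / c, -1.0, 1.0)

def gaussian(z):                         # slide 28: local (bell-shaped) transfer
    return np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)
```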
29 Probabilistic Neurons
- Neurons are in one of two states: x = 0 or x = 1
- The transfer function P(z) only determines the probability of finding the output node in a certain state
- y = 1 with probability P(z)
- y = 0 with probability 1 - P(z)
- Common choice for P(z) is
- P(z) = 1 / (1 + exp(-z / T))
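A minimal sketch of such a stochastic unit, assuming Python/NumPy; the temperature value, seed, and sample count are arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def probabilistic_neuron(z, T=1.0):
    """Fire (return 1) with probability P(z) = 1 / (1 + exp(-z / T))."""
    p = 1.0 / (1.0 + np.exp(-z / T))
    return 1 if rng.random() < p else 0

# A higher temperature T flattens P(z), making the neuron behave more randomly
samples = [probabilistic_neuron(z=0.5, T=1.0) for _ in range(1000)]
print(sum(samples) / 1000)  # approximately 1 / (1 + e^-0.5) ≈ 0.62
```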
30 Specific neuron models
- McCulloch-Pitts neuron
- Discrete (0 - 1) inputs
- Heaviside activation function
- Only weights 1 (excitatory) and -1 (inhibitory)
- Adaline (Widrow & Hoff)
- Continuous inputs
- Identity (no) activation function
- Continuous weights
- x0 = 1, w0 is bias
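The two models differ only in input domain, weights, and activation, as this hedged Python sketch (with illustrative names and values) shows:

```python
import numpy as np

def mcculloch_pitts(x, w, threshold):
    """McCulloch-Pitts neuron: binary (0/1) inputs, weights restricted to
    +1 (excitatory) or -1 (inhibitory), Heaviside activation."""
    return 1 if np.dot(w, x) > threshold else 0

def adaline(x, w):
    """Adaline: continuous inputs and weights, identity (no) activation.
    By convention x[0] = 1, so w[0] acts as the bias."""
    return float(np.dot(w, x))

print(mcculloch_pitts(np.array([1, 0, 1]), np.array([1, -1, 1]), threshold=1))  # 1
print(adaline(np.array([1.0, 2.0, -1.0]), np.array([0.5, 0.25, 1.0])))          # 0.0
```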
31 Artificial Neural Networks
- Layered net with
- n input nodes
- m output nodes
- zero or more hidden layers (one shown)
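As a hedged illustration of such a layered net, the sketch below runs a forward pass through one hidden layer in Python/NumPy; the layer sizes, sigmoid transfer, and random weights are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, h, m = 3, 4, 2                               # input, hidden, and output layer sizes
W1, b1 = rng.normal(size=(h, n)), np.zeros(h)   # input -> hidden weights and biases
W2, b2 = rng.normal(size=(m, h)), np.zeros(m)   # hidden -> output weights and biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Forward pass: each layer is a linear combiner plus a transfer function."""
    hidden = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ hidden + b2)

print(forward(np.array([0.5, -1.0, 2.0])))  # m = 2 output activations in (0, 1)
```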
32 ANN Models
- Feedforward networks (FANN)
- Single-layer perceptrons (SLP, SLFF) (Rosenblatt)
- Multi-layer perceptrons (MLP, MLFF) (Rumelhart, ...)
- Radial basis function networks (RBFN)
- Functional link nets (FLN)
- (Neo-)Cognitron (Fukushima)
33 ANN Models (continued)
- Recurrent networks (RNN)
- Hopfield networks (Hopfield, Amari)
- Boltzmann machines (Hinton, Sejnowski)
- Bidirectional associative memory (Kosko)
- Competitive learning networks (CLN)
- Simple competitive learning networks
- Self-organizing feature maps (Kohonen)
- Adaptive resonance theory (Grossberg)
34 Hebb's Postulate of Learning
- Biological formulation
- "When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
- Mathematical formulation
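The formula itself did not survive the transcript; a standard form of the Hebbian weight update (as given, e.g., in Haykin), with the learning rate η as an assumed parameter, is:

```latex
\Delta w_{ij}(n) = \eta \, y_i(n) \, x_j(n)
```

where x_j is the presynaptic (input) signal and y_i the postsynaptic (output) activity of the neuron whose incoming weight w_ij is adjusted.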
35 Hebb's Postulate revisited
- Stent (1973), and Changeux and Danchin (1976), have expanded Hebb's rule so that it also models inhibitory synapses
- If two neurons on either side of a synapse are activated simultaneously (synchronously), then the strength of that synapse is selectively increased
- If two neurons on either side of a synapse are activated asynchronously, then that synapse is selectively weakened or eliminated
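A hedged sketch of this extended rule for bipolar (±1) activities, where synchronous activity strengthens a weight and asynchronous activity weakens it; the learning-rate value is an assumption:

```python
def hebb_update(w, x_pre, y_post, eta=0.1):
    """Extended Hebb rule for bipolar activities in {-1, +1}:
    synchronous activity (x_pre == y_post) increases w,
    asynchronous activity decreases it."""
    return w + eta * x_pre * y_post

w = 0.0
w = hebb_update(w, x_pre=+1, y_post=+1)  # synchronous: w -> 0.1
w = hebb_update(w, x_pre=+1, y_post=-1)  # asynchronous: w -> 0.0
print(w)
```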
36 Learning Methods
- Supervised learning
- Reinforcement learning
- Corrective learning
- Unsupervised learning
- Competitive learning
- Self-organizing learning
- Off-line versus adaptive learning
37 Learning Tasks
- Association
- Classification
- Clustering
- Pattern recognition
- Function approximation
- Control
- Adaptive filtering
- Data compression
- Prediction
38 Application areas (just a few)
- Finance, Banking, Insurance
- Loan approval, stock prediction, claim prediction, fraud detection
- Business, Marketing
- Sales prediction, customer profiling, data mining
- Medicine
- Diagnosis and treatment
- Industry
- Quality control
- Machine/plant control
- Telecommunication
- Adaptive filtering (equalizing)
- Speech recognition and synthesis
39 NN for setting target corn yields
40 (Optical) Character Recognition
41 Applications
42 RoboCup Four-Legged League
43 Brief history
- Early stages
- 1943 McCulloch-Pitts neuron as computing element
- 1948 Wiener: cybernetics
- 1949 Hebb learning rule
- 1958 Rosenblatt perceptron
- 1960 Widrow-Hoff least mean square algorithm
- Recession
- 1969 Minsky-Papert: limitations of the perceptron model
- Revival
- 1982 Hopfield recurrent network model
- 1982 Kohonen self-organizing maps
- 1986 Rumelhart et al. backpropagation
44 Literature
- The authoritative text on neural science is
- Principles of Neural Science, fourth edition, eds. E.R. Kandel, J.H. Schwartz, T.M. Jessell, McGraw-Hill, 2000.
- The authoritative text on neural networks is
- Neural Networks: A Comprehensive Foundation, second edition, Simon Haykin, Prentice-Hall, 1999.