Title: Pattern Recognition: Statistical and Neural
Slide 1: Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman
Nanjing University of Science and Technology
Lecture 22, Oct 28, 2005
Slide 2: Lecture 22 Topics
1. Review of the Backpropagation Algorithm
2. Weight Update Rules 1 and 2 for Logistic and Tanh Activation Functions
3. Output Structures for Neural Net Classifiers: single, multiple, and coded output nodes
4. Words of Wisdom
5. Overall Design and Testing Methodology
Slide 3: Backpropagation Algorithm for Training a Feedforward Neural Network
Slide 4: Input Pattern Sample x_k
Slide 5: Calculate Outputs of the First Layer
Slide 6: Calculate Outputs of the Second Layer
Slide 7: Calculate Outputs of the Last Layer
Slide 8: Check Performance
Single-sample error, and the error over all Ns samples:

E_TOTAL(p) = (1/2) * Σ_{i=0}^{Ns−1} [ d(p−i) − f( w^T(p−i) · x(p−i) ) ]²

This can be computed recursively:

E_TOTAL(p+1) = E_TOTAL(p) + E_{p+1}(p+1) − E_{p−Ns}(p−Ns)
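The recursion above avoids re-summing the whole window at every step: add the newest sample's error, drop the one that just left the Ns-sample window. A minimal sketch of that bookkeeping (all names here are our own illustration, not the lecture's notation):

```python
from collections import deque

def make_running_error(Ns):
    """Track E_TOTAL(p) = 1/2 * sum of the squared errors of the last Ns samples."""
    window = deque(maxlen=Ns)
    state = {"total": 0.0}

    def update(d, y):
        # Single-sample error E_p = 1/2 * (d - y)^2
        e = 0.5 * (d - y) ** 2
        if len(window) == window.maxlen:
            state["total"] -= window[0]   # subtract E_{p-Ns}, leaving the window
        window.append(e)                  # deque drops the oldest entry itself
        state["total"] += e               # add E_{p+1}
        return state["total"]

    return update
```

With Ns = 2, feeding three identical errors of 0.5 keeps the running total at 1.0 once the window is full, matching the recursion on the slide.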
Slide 9: Change Weights in the Last Layer Using Rule 1
Slide 10: Change Weights in the Previous Layer Using Rule 2
Slide 11: Change Weights in the Previous Layer Using Modified Rule 2
Slide 12: Input Pattern Sample x_{k+1}
Continue iterating until:
Slide 13: Repeat the process until performance is satisfactory or the maximum number of iterations is reached.
If performance is not satisfactory at the maximum number of iterations, the algorithm stops and NO design is obtained.
If performance is satisfactory, the current weights and structure provide the required design.
Slide 14: Freeze Weights to Get an Acceptable Neural Net Design
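The loop on slides 3 through 14 can be sketched end to end for a one-hidden-layer net with logistic activations. This is our own minimal illustration (all names, the learning rate, and the stopping tolerance are assumptions, not the lecture's): forward pass, performance check, Rule 1 at the output layer, Rule 2 at the hidden layer, and the stop/freeze logic of slides 13 and 14.

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def train(samples, n_in, n_hid, lr=1.0, max_iter=20000, tol=0.01):
    """samples: list of (input_tuple, desired_output). Returns (W1, W2) or None."""
    random.seed(0)
    W1 = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
    W2 = [random.uniform(-1, 1) for _ in range(n_hid + 1)]
    for _ in range(max_iter):
        total = 0.0
        for x, d in samples:
            xb = list(x) + [1.0]                     # input plus bias term
            h = [sigmoid(sum(w * xi for w, xi in zip(row, xb))) for row in W1]
            hb = h + [1.0]
            y = sigmoid(sum(w * hi for w, hi in zip(W2, hb)))
            total += 0.5 * (d - y) ** 2              # check performance
            # Rule 1 (output layer, logistic): delta = (d - y) * y * (1 - y)
            dout = (d - y) * y * (1 - y)
            # Rule 2 (hidden layer): propagate delta back through W2
            dhid = [dout * W2[j] * h[j] * (1 - h[j]) for j in range(n_hid)]
            for j in range(n_hid + 1):
                W2[j] += lr * dout * hb[j]
            for j in range(n_hid):
                for i in range(n_in + 1):
                    W1[j][i] += lr * dhid[j] * xb[i]
        if total < tol:
            return W1, W2                            # performance satisfied: freeze
    return None                                      # max iterations: NO design
```

Trained on the AND function, the frozen net separates (1,1) from the other three inputs; returning None when the iteration cap is hit mirrors the "NO design is obtained" case on slide 13.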
Slide 15: General Rule 1 for Weight Update
[derivation and resulting update equation shown on slide]
Slide 16: General Rule 2 for Weight Update, Layer (L−1)
[derivation shown on slide] Therefore the weight correction is as follows:
Slide 17: where the weight correction (General Rule 2) is Δw^{(L−1)} [equation shown on slide]
Slide 18: Specific Rules for Given Activation Functions
1. Rule 1 for Logistic Activation Function
2. Rule 2 for Logistic Activation Function
3. Rule 1 for Tanh Activation Function
4. Rule 2 for Tanh Activation Function
Slide 19: Rule 1 for Logistic Activation Function
Lth-Layer Weight Update Equation [shown on slide]
Slide 20: Rule 2 for Logistic Activation Function
(L−1)th-Layer Weight Correction Equation [shown on slide]
Slide 21: Rule 1 for Tanh Activation Function
Lth-Layer Weight Update Equation [shown on slide]
Slide 22: Rule 2 for Tanh Activation Function
(L−1)th-Layer Weight Correction Equation [shown on slide]
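The logistic and tanh versions of Rules 1 and 2 differ only in the activation-derivative factor, and both activations let that derivative be written in terms of the output y itself: f′(v) = y(1 − y) for the logistic, f′(v) = 1 − y² for tanh. A short sketch (function names are ours) checks both identities against a numerical derivative:

```python
import math

def logistic(v):
    return 1.0 / (1.0 + math.exp(-v))

def dlogistic_from_output(y):
    # f'(v) = y(1 - y) when y = logistic(v)
    return y * (1.0 - y)

def dtanh_from_output(y):
    # f'(v) = 1 - y^2 when y = tanh(v)
    return 1.0 - y * y
```

This is why backpropagation with these activations never needs the pre-activation v again once the forward pass has produced y.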
Slide 23: Selection of Output Structure for Classifier Design
(a) a single output node; (b) N output nodes for N classes; (c) log2(N) binary-coded output nodes
Slide 24: (a) Single Output Node
Example: four classes with one output node
Slide 25: (a) Single Output Node
K-class case with one output neuron; t_i is selected as the center of R_i.
Slide 26: (b) Output Node for Each Class
Example: four classes with four output nodes.
Possible decision rules:
1. Select class C_j if y_j is the largest.
2. Select class C_j if (y1, y2, y3, y4) is closest to the target vector for class C_j.
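The two decision rules above can be sketched directly; names are our own, and the targets are the usual one-component-per-class vectors:

```python
def decide_max(y):
    """Rule 1: pick the class whose output value is largest."""
    return max(range(len(y)), key=lambda j: y[j])

def decide_nearest(y, targets):
    """Rule 2: pick the class whose target vector is closest (Euclidean) to y."""
    def dist2(t):
        return sum((yi - ti) ** 2 for yi, ti in zip(y, t))
    return min(range(len(targets)), key=lambda j: dist2(targets[j]))
```

With one-per-class (e.g. one-hot) targets the two rules usually agree, as in the four-class example on the slide; they can differ for other target vectors.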
Slide 27: (b) Output Node for Each Class
Slide 28: (c) Binary-Coded log2(NC) Output Nodes
Example: four classes with two output nodes.
Slide 29: (c) Binary-Coded log2(NC) Output Nodes
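A sketch of the coding scheme (our own helper names): NC classes need only ceil(log2 NC) output nodes, with each class assigned a distinct bit pattern as its target vector; for four classes the two-node targets are 00, 01, 10, 11.

```python
import math

def coded_targets(n_classes):
    """Binary target vectors for n_classes on ceil(log2 n_classes) output nodes."""
    n_bits = max(1, math.ceil(math.log2(n_classes)))
    return [[(c >> b) & 1 for b in reversed(range(n_bits))]
            for c in range(n_classes)]

def decode(y):
    """Threshold each output at 1/2 and read the bits as a class index."""
    bits = [1 if yi >= 0.5 else 0 for yi in y]
    return sum(bit << i for i, bit in enumerate(reversed(bits)))
```

The trade-off relative to structure (b): far fewer output nodes, but a single badly thresholded bit maps the sample to a wrong class rather than merely lowering one class's score.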
Slide 30: Words of Wisdom
It is better to break a big problem into several subproblems than to try to find a single large neural net that performs the entire classification process.
Example: Design a neural net to classify letters from different fonts into individual letter classes. Assume there are 26 classes, represented by the set S = {a, b, c, d, e, f, g, h, i, j, k, l, m, n, o, p, q, r, s, t, u, v, w, x, y, z}.
Slide 31: Solution: Design a neural net (Neural Net 1) to separate the classes into groups A1, A2, A3, and A4, then design four neural networks to break these groups into single letters.
Slide 32: … on Training Set [figure shown on slide]
Slide 33: Motivation for Momentum Correction
Slide 34: Momentum Correction for Backpropagation
Weight update equation [shown on slide]
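The standard momentum-corrected update blends the current gradient step with a fraction of the previous step, smoothing the weight trajectory; a minimal sketch, assuming the usual form w(p+1) = w(p) + Δw(p) with Δw(p) = (gradient step) + α·Δw(p−1), where α and all names are our own:

```python
def momentum_update(w, grad_step, prev_step, alpha=0.9):
    """One momentum-corrected update on a weight vector (plain Python lists).

    grad_step: the plain backpropagation correction for this iteration.
    prev_step: the total step Delta-w applied at the previous iteration.
    Returns (new_weights, step) so `step` can be fed back in next time.
    """
    step = [g + alpha * s for g, s in zip(grad_step, prev_step)]
    new_w = [wi + si for wi, si in zip(w, step)]
    return new_w, step
```

With α = 0 this reduces to the plain update of slides 9 through 11; larger α carries more of the previous direction forward, which damps oscillation across narrow error valleys.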
Slide 35: Summary of Lecture 22
1. Reviewed the Backpropagation Algorithm
2. Presented Weight Update Rules 1 and 2 for Logistic and Tanh Activation Functions
3. Gave Output Structures for Neural Net Classifiers: single, multiple, and coded output nodes
4. Spoke some Words of Wisdom
5. Presented an Overall Design and Testing Methodology
Slide 36: End of Lecture 22