Title: Artificial Neural Networks ECE.09.454/ECE.09.560, Fall 2008
1 Artificial Neural Networks ECE.09.454/ECE.09.560
Fall 2008
Lecture 1: September 8, 2008
- Shreekanth Mandayam
- ECE Department
- Rowan University
- http://engineering.rowan.edu/shreek/fall08/ann/
2 Plan
- What is artificial intelligence?
- Course introduction
- Historical development: the neuron model
- The artificial neural network paradigm
- What is knowledge? What is learning?
- The Perceptron
- Widrow-Hoff Learning Rule
- The Future?
3 Artificial Intelligence
4 Course Introduction
- Why should we take this course?
- PR, Applications
- What are we studying in this course?
- Course objectives/deliverables
- How are we conducting this course?
- Course logistics
- http://engineering.rowan.edu/shreek/fall08/ann/
5 Course Objectives
At the conclusion of this course the student will be able to:
- Identify and describe engineering paradigms for knowledge and learning
- Identify, describe and design artificial neural network architectures for simple cognitive tasks
6 Biological Origins
7 Biological Origins
8 History/People
- 1940s, Turing: General problem solver, Turing test
- 1940s, Shannon: Information theory
- 1943, McCulloch and Pitts: Mathematical model of neural processes
- 1949, Hebb: Learning model
- 1958, Rosenblatt: The Perceptron
- 1960, Widrow: LMS training algorithm
- 1969, Minsky and Papert: Perceptron deficiencies
- 1985, Rumelhart: Feedforward MLP, backpropagation
- 1988, Broomhead and Lowe: Radial basis function neural networks
- 1990s: VLSI implementations
- 1997: IEEE 1451
9 Neural Network Paradigm
(Diagram) Stage 1, Network Training: examples are presented to the artificial neural network, which stores the resulting knowledge. Stage 2, Network Testing: new data is presented to the trained artificial neural network.
10 ANN Model
(Diagram) The artificial neural network embodies knowledge as a complex nonlinear function f that maps an input vector x to an output vector y: f(x) = y.
11 Popular I/O Mappings
12 The Perceptron
(Diagram) Inputs x1, x2, ..., xm are multiplied by the synaptic weights wk1, wk2, ..., wkm and summed to give the linear combiner output uk; adding the bias bk gives the induced field vk = uk + bk. An activation/squashing function j(.) then produces the output yk = j(vk).
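The signal flow described above can be sketched in Python. This is a minimal illustration in the slide's notation; the tanh squashing function and the numeric values are assumptions for the example, not taken from the deck:

```python
import math

def perceptron_output(x, w, b):
    """Single-neuron forward pass in the slide's notation:
    u_k = sum_j w_kj * x_j        (linear combiner output)
    v_k = u_k + b_k               (induced field)
    y_k = phi(v_k)                (activation/squashing function)
    """
    u = sum(w_j * x_j for w_j, x_j in zip(w, x))  # linear combiner output u_k
    v = u + b                                     # induced field v_k
    return math.tanh(v)  # tanh chosen here as an example squashing function

# Illustrative 3-input neuron (values are arbitrary)
x = [1.0, -0.5, 0.25]   # inputs x_1 .. x_m
w = [0.4, 0.6, -0.2]    # synaptic weights w_k1 .. w_km
b = 0.1                 # bias b_k
y = perceptron_output(x, w, b)
```

Any bounded squashing function could replace tanh here; the choice of activation is part of the network design.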
13 Learning
(Diagram) Mathematical model of the learning process: starting from initial weights w0 at iteration 0, the ANN maps the input x to an output y(0); the weights are updated to w1, giving y(1) at iteration 1, and so on, until at iteration n the weights wn produce an output y(n) that approaches the desired output d.
15 Error-Correction Learning
(Diagram) At time step n, inputs x1(n), ..., xm(n) are weighted by the synaptic weights wk1(n), ..., wkm(n) and summed with the bias bk to form the induced field vk(n); the activation/squashing function j(.) gives the output yk(n). The error signal ek(n) = dk(n) - yk(n), the difference between the desired output and the actual output, drives the weight update.
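A single error-correction step can be sketched as follows. The sgn activation and the learning-rate value are illustrative assumptions; the update rule itself follows the error signal defined above:

```python
def error_correction_step(x, w, b, d, eta):
    """One error-correction update in the slide's notation:
    e_k(n)    = d_k(n) - y_k(n)                  (error signal)
    w_kj(n+1) = w_kj(n) + eta * e_k(n) * x_j(n)  (weight update)
    """
    v = sum(w_j * x_j for w_j, x_j in zip(w, x)) + b  # induced field v_k(n)
    y = 1.0 if v >= 0 else -1.0                       # sgn used as phi(.) here
    e = d - y                                         # error signal e_k(n)
    w_new = [w_j + eta * e * x_j for w_j, x_j in zip(w, x)]
    b_new = b + eta * e  # bias updated as a weight on a constant +1 input
    return w_new, b_new, e

# When the output already matches the desired response, e = 0 and the
# weights are left unchanged.
w2, b2, e = error_correction_step([1.0, 1.0], [1.0, 1.0], 0.0, 1.0, 0.1)
```

Note that the correction is proportional both to the error and to the input that contributed to it; inputs that were zero leave their weights untouched.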
16 Learning Tasks
- Pattern Association
- Pattern Recognition (Classification)
- Function Approximation
- Filtering
17 Perceptron Training: Widrow-Hoff Rule (LMS Algorithm)
- Initialize: w(0) = 0, n = 0
- Compute output: y(n) = sgn[wT(n) x(n)]
- Update weights: w(n+1) = w(n) + h[d(n) - y(n)] x(n)
- Increment: n = n + 1, and repeat
Matlab Demo
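The steps above can be sketched as a training loop in Python. This is not the deck's MATLAB demo; the AND toy problem, the learning rate, and the stopping rule are assumptions made for illustration:

```python
def sgn(v):
    """Signum activation: +1 for v >= 0, else -1."""
    return 1.0 if v >= 0 else -1.0

def train_perceptron(samples, eta=0.1, max_epochs=100):
    """Widrow-Hoff style perceptron training, following the slide:
    w(0)   = 0
    y(n)   = sgn[w^T(n) x(n)]
    w(n+1) = w(n) + eta * [d(n) - y(n)] * x(n)
    """
    dim = len(samples[0][0])
    w = [0.0] * dim  # w(0) = 0; the bias is folded in as input x[0] = 1
    for _ in range(max_epochs):
        errors = 0
        for x, d in samples:
            y = sgn(sum(w_j * x_j for w_j, x_j in zip(w, x)))
            if y != d:  # misclassified: apply the Widrow-Hoff correction
                w = [w_j + eta * (d - y) * x_j for w_j, x_j in zip(w, x)]
                errors += 1
        if errors == 0:  # every sample classified correctly: stop
            break
    return w

# Toy problem: logical AND with +/-1 labels and a constant bias input x[0] = 1
data = [([1, -1, -1], -1), ([1, -1, 1], -1), ([1, 1, -1], -1), ([1, 1, 1], 1)]
w = train_perceptron(data)
```

Because the AND function is linearly separable, the perceptron convergence theorem guarantees this loop terminates with all samples classified correctly.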
18 The Age of Spiritual Machines: When Computers Exceed Human Intelligence, by Ray Kurzweil
(Penguin paperback, ISBN 0-14-028202-5)
19 Summary