Artificial Neural Networks ECE.09.454/ECE.09.560, Fall 2006
1
Artificial Neural Networks ECE.09.454/ECE.09.560
Fall 2006
Lecture 1, September 18, 2006
  • Shreekanth Mandayam
  • ECE Department
  • Rowan University
  • http://engineering.rowan.edu/shreek/fall06/ann/

2
Japan's humanoid robots: Better than people
Dec 20th 2005, TOKYO. From The Economist print edition.
Why the Japanese want their robots to act more like humans
3
Why the Japanese want their robots to act more
like humans
  • HER name is MARIE, and her impressive set of
    skills comes in handy in a nursing home.
  • MARIE can walk around under her own power. She
    can distinguish among similar-looking objects,
    such as different bottles of medicine, and has a
    delicate enough touch to work with frail
    patients.
  • MARIE can interpret a range of facial expressions
    and gestures, and respond in ways that suggest
    compassion. Although her language skills are not
    ideal, she can recognise speech and respond
    clearly.
  • Above all, she is inexpensive.
  • Unfortunately for MARIE, however, she has one
    glaring trait that makes it hard for Japanese
    patients to accept her…

4
Why the Japanese want their robots to act more
like humans
  • …she is a flesh-and-blood human being from
    the Philippines. If only she were a robot
    instead.

5
Harveian Oration
  • "In celebration of cerebration"
  • by Professor Colin Blakemore
  • presented at the Royal College of Physicians, London, UK, on Oct 18, 2005
  • www.thelancet.com, Vol 366, Dec 10, 2005

6
Plan
  • What is artificial intelligence?
  • Course introduction
  • Historical development: the neuron model
  • The artificial neural network paradigm
  • What is knowledge? What is learning?
  • The Perceptron
  • Widrow-Hoff Learning Rule
  • The Future…?

7
Artificial Intelligence
8
Course Introduction
  • Why should we take this course?
  • Pattern recognition (PR), applications
  • What are we studying in this course?
  • Course objectives/deliverables
  • How are we conducting this course?
  • Course logistics
  • http://engineering.rowan.edu/shreek/fall06/ann/

9
Course Objectives
  • At the conclusion of this course, the student will
    be able to:
  • Identify and describe engineering paradigms for
    knowledge and learning
  • Identify, describe and design artificial neural
    network architectures for simple cognitive tasks

10
Biological Origins
11
Biological Origins
12
History/People
1940s  Turing: General problem solver, Turing test
1940s  Shannon: Information theory
1943   McCulloch and Pitts: Mathematical model of neural processes
1949   Hebb: Learning model
1958   Rosenblatt: The Perceptron
1960   Widrow: LMS training algorithm
1969   Minsky and Papert: Perceptron deficiency
1985   Rumelhart: Feedforward MLP, backpropagation
1988   Broomhead and Lowe: Radial basis function neural nets
1990s  VLSI implementations
1997   IEEE 1451 (smart transducer interface standard)
13
Neural Network Paradigm
[Diagram: Stage 1, Network Training: examples are presented to the Artificial Neural Network, which stores the acquired knowledge. Stage 2, Network Testing: new data are presented to the trained Artificial Neural Network.]
14
ANN Model
[Diagram: the Artificial Neural Network maps an input vector x to an output vector y through a complex nonlinear function f, i.e. f(x) = y; the network's knowledge is stored in this mapping.]
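In symbols, and assuming an m-dimensional input and a p-dimensional output (the dimensions are not stated on the slide):

    \mathbf{y} = f(\mathbf{x}), \qquad f : \mathbb{R}^{m} \to \mathbb{R}^{p}

where the knowledge acquired during training is held in the parameters of f, i.e. the synaptic weights.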
15
Popular I/O Mappings
16
The Perceptron
[Diagram: the perceptron. Inputs x1, x2, …, xm are multiplied by synaptic weights wk1, wk2, …, wkm and summed to give uk; adding the bias bk yields the induced field vk, which is passed through the activation (squashing) function φ(·) to produce the output yk.]
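Restating the diagram in its own notation, neuron k computes:

    u_k = \sum_{j=1}^{m} w_{kj} x_j, \qquad
    v_k = u_k + b_k, \qquad
    y_k = \varphi(v_k)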
17
Learning
Mathematical Model of the Learning Process
[Diagram: Initialize at iteration (0): the ANN with weights w0 maps input x to output y(0). At iteration (1) the updated weights w1 give y(1), and so on; at iteration (n) the weights wn give y(n) ≈ d, the desired output.]
19
Error-Correction Learning
[Diagram: error-correction learning. At time step n, inputs x1(n), x2(n), …, xm(n) are weighted by the synaptic weights wk1(n), wk2(n), …, wkm(n) and summed with the bias bk to form the induced field vk(n); the activation (squashing) function φ(·) gives the output yk(n). Subtracting yk(n) from the desired output dk(n) produces the error signal ek(n), which drives the weight update.]
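Written as equations, the error signal and the resulting weight adjustment are (η is the learning-rate parameter; it is not labelled on the slide):

    e_k(n) = d_k(n) - y_k(n), \qquad
    \Delta w_{kj}(n) = \eta \, e_k(n) \, x_j(n), \qquad
    w_{kj}(n+1) = w_{kj}(n) + \Delta w_{kj}(n)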
20
Learning Tasks
  • Pattern Association
  • Pattern Recognition
  • Function Approximation
  • Filtering

Classification
21
Perceptron Training: Widrow-Hoff Rule (LMS Algorithm)
w(0) = 0
n = 0
y(n) = sgn[wᵀ(n) x(n)]
w(n+1) = w(n) + η[d(n) - y(n)] x(n)
n = n + 1
Matlab Demo
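The Matlab demo itself is not included in this transcript. As an illustrative sketch only of the update rule above (the function name, the learning rate eta = 0.1, the epoch count, and the AND example below are my choices, not from the slides), the same training loop in Python:

    import numpy as np

    def train_perceptron(X, d, eta=0.1, epochs=50):
        """Perceptron training with the Widrow-Hoff style update from the slide.

        X : (N, m) array of input vectors
        d : (N,) array of desired outputs in {-1, +1}
        """
        N, m = X.shape
        w = np.zeros(m + 1)                    # weights; w[0] acts as the bias
        Xb = np.hstack([np.ones((N, 1)), X])   # prepend a constant 1 for the bias term
        for _ in range(epochs):
            for x, target in zip(Xb, d):
                y = np.sign(w @ x)                     # y(n) = sgn[w'(n) x(n)]
                w = w + eta * (target - y) * x         # w(n+1) = w(n) + eta [d(n) - y(n)] x(n)
        return w

    # Example: learn the logical AND of two inputs (targets coded as -1/+1)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    d = np.array([-1, -1, -1, 1])
    w = train_perceptron(X, d)
    print(np.sign(np.hstack([np.ones((4, 1)), X]) @ w))   # should print: [-1. -1. -1.  1.]

Because AND is linearly separable, the weights converge after a few epochs and the trained perceptron reproduces the desired outputs.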
22
The Age of Spiritual Machines: When Computers Exceed Human Intelligence, by Ray Kurzweil
Penguin paperback, ISBN 0-14-028202-5
23
Summary