Neural Networks And Its Applications - PowerPoint PPT Presentation

Transcript and Presenter's Notes



1
Neural Networks And Its Applications
  • By
  • Dr. Surya Chitra

2
OUTLINE
  • Introduction & Software
  • Basic Neural Network & Processing
  • Software Exercise Problem/Project
  • Complementary Technologies
  • Genetic Algorithms
  • Fuzzy Logic
  • Examples of Applications
  • Manufacturing
  • R&D
  • Sales & Marketing
  • Financial

3
Introduction
What is a Neural Network?
  • A computing system made up of a number of highly
    interconnected processing elements, which
    processes information by its dynamic state
    response to external inputs
  • Dr. Robert Hecht-Nielsen

A parallel information processing system based on
the human nervous system, consisting of a large
number of neurons that operate in parallel.
4
Biological Neuron & Its Function
Information Processed in Neuron Cell Body and
Transferred to Next Neuron via Synaptic Terminal
5
Processing in Biological Neuron
Neurotransmitters Carry Information to the Next
Neuron, Where It Is Further Processed in the
Cell Body
6
Artificial Neuron & Its Function
Dendrites
Neuron
Axon
Outputs
Inputs
Processing Element
7
Processing Steps Inside a Neuron: Electronic
Implementation
  • Summed
  • Inputs
  • Sum
  • Min
  • Max
  • Mean
  • OR/AND

Add Bias Weight
  • Transform
  • Sigmoid
  • Hyperbola
  • Sine
  • Linear

Inputs
Outputs
Processing Element
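The processing steps listed above (combine the weighted inputs, add a bias weight, then transform) can be sketched as a single processing element. This is an illustrative sketch, not code from the presentation; the function name and parameters are made up, and only two of the slide's transform choices are shown.

```python
import math

def processing_element(inputs, weights, bias=0.0, combine="sum", transform="sigmoid"):
    """One artificial neuron: combine weighted inputs, add a bias weight, transform."""
    weighted = [w * x for w, x in zip(weights, inputs)]
    # Combination choices from the slide: Sum, Min, Max, Mean
    if combine == "sum":
        s = sum(weighted)
    elif combine == "min":
        s = min(weighted)
    elif combine == "max":
        s = max(weighted)
    elif combine == "mean":
        s = sum(weighted) / len(weighted)
    else:
        raise ValueError(f"unknown combine: {combine}")
    s += bias  # bias weight is added before the transform
    # Transform choices from the slide (two shown): Sigmoid, Linear
    if transform == "sigmoid":
        return 1.0 / (1.0 + math.exp(-s))
    if transform == "linear":
        return s
    raise ValueError(f"unknown transform: {transform}")

# Example: two inputs through a sigmoid neuron with a small bias
y = processing_element([0.5, 0.8], [0.4, -0.2], bias=0.1)
```

A sigmoid transform keeps the output in (0, 1), which is why it is the default choice here.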
8
Sigmoid Transfer Function
Transfer Function = 1 / (1 + e^(-sum))
9
Basic Neural Network Its Elements
Clustering of Neurons
Bias Neurons
Output Neurons
Hidden Neurons
Input Neurons
10
Back-Propagation Network: Forward Output Flow
  • Random Set of Weights Generated
  • Send Inputs to Neurons
  • Each Neuron Computes Its Output
  • Calculate Weighted Sum
  • I_j = Σ_i W_i,j-1 · X_i,j-1 + B_j
  • Transform the Weighted Sum
  • X_j = f(I_j) = 1 / (1 + e^-(I_j + T))
  • Repeat for all the Neurons
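The forward-pass steps above can be sketched as follows. This is a minimal illustration: the layer sizes, the random weight ranges, and the threshold T are assumptions, while the two formulas in the loop match the slide.

```python
import math
import random

def forward_pass(inputs, layers, T=0.0):
    """Propagate inputs through a list of layers.

    layers: list of (W, B) pairs, where W[j][i] weights input i into
    neuron j of that layer, and B[j] is neuron j's bias weight.
    """
    x = list(inputs)
    for W, B in layers:
        # I_j = sum_i W_i,j-1 * X_i,j-1 + B_j
        sums = [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(W, B)]
        # X_j = f(I_j) = 1 / (1 + e^-(I_j + T))
        x = [1.0 / (1.0 + math.exp(-(s + T))) for s in sums]
    return x

# Step 1 of the slide: generate a random set of weights (2 -> 3 -> 1 network)
random.seed(0)
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
W2 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(1)]
net = [(W1, [0.0] * 3), (W2, [0.0])]
out = forward_pass([0.5, -0.3], net)
```

Each layer repeats the same weighted-sum-and-transform computation for all of its neurons, which is the "Repeat for all the Neurons" step.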

11
Back-Propagation Network: Backward Error
Propagation
  • Errors are Propagated Backwards
  • Update the Network Weights
  • Gradient Descent Algorithm
  • ΔW_ji(n) = η · δ_j · X_i
  • W_ji(n+1) = W_ji(n) + ΔW_ji(n)
  • Add Momentum for Convergence
  • ΔW_ji(n) = η · δ_j · X_i + α · ΔW_ji(n-1)

Where n = Iteration Number, η = Learning
Rate, and α = Rate of Momentum (0 to 1)
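The weight-update rule with momentum can be sketched for a single weight. The function name and the example values of η and α are illustrative; the two update equations follow the slide.

```python
def update_weight(w, delta_j, x_i, prev_dw, eta=0.1, alpha=0.9):
    """One gradient-descent step with momentum:
    dW_ji(n)   = eta * delta_j * x_i + alpha * dW_ji(n-1)
    W_ji(n+1)  = W_ji(n) + dW_ji(n)
    Returns the new weight and the step taken (kept for the next iteration).
    """
    dw = eta * delta_j * x_i + alpha * prev_dw
    return w + dw, dw

# Two iterations: the momentum term carries part of the previous step forward
w, dw = update_weight(0.5, delta_j=0.2, x_i=1.0, prev_dw=0.0)  # step 0.02
w, dw = update_weight(w, delta_j=0.2, x_i=1.0, prev_dw=dw)     # step 0.038
```

With α between 0 and 1, each step reuses a fraction of the previous step, which helps convergence by smoothing the descent direction.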
12
Back-Propagation Network: Backward Error
Propagation
  • Gradient Descent Algorithm
  • Minimization of Mean Squared Errors
  • Shape of Error Surface
  • Complex
  • Multidimensional
  • Bowl-Shaped
  • Hills and Valleys
  • Training by Iterations
  • Finding the Global Minimum is Challenging
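The quantity being minimized above is the mean squared error over the training examples; a minimal illustration (the target and output values below are made up):

```python
def mean_squared_error(targets, outputs):
    """The error that gradient descent drives downhill, iteration by iteration."""
    return sum((t - o) ** 2 for t, o in zip(targets, outputs)) / len(targets)

# Three training examples: targets vs. current network outputs
err = mean_squared_error([1.0, 0.0, 1.0], [0.8, 0.2, 0.9])
```

Plotted over all the weights at once, this error forms the complex, multidimensional surface the slide describes, with local valleys that can trap the descent away from the global minimum.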

13
Simple Transfer Functions
14
Recurrent Neural Network
15
Time Delay Neural Network
16
Training - Supervised
  • Both Inputs & Outputs are Provided
  • Designer Can Manipulate
  • Number of Layers
  • Neurons per Layer
  • Connections Between Layers
  • The Summation & Transform Functions
  • Initial Weights
  • Rules of Training
  • Back Propagation
  • Adaptive Feedback Algorithm

17
Training - Unsupervised
  • Only Inputs are Provided
  • System has to Figure Out
  • Self Organization
  • Adaptation to Input Changes/Patterns
  • Grouping of Neurons to Fields
  • Topological Order
  • Based on Mammalian Brain
  • Rules of Training
  • Adaptive Feedback Algorithm (Kohonen)

Topology: mapping one space to another without
changing the geometric configuration
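A minimal sketch of a Kohonen-style self-organizing update, in the spirit of the adaptive algorithm named above. All specifics here are assumptions: a 1-D map of neurons, a fixed neighborhood radius, and a fixed learning rate.

```python
import math

def kohonen_step(weights, x, lr=0.5, radius=1):
    """Unsupervised update: find the best-matching neuron for input x,
    then pull it and its topological neighbors toward x."""
    # Winner = neuron whose weight vector is closest to the input pattern
    dists = [math.dist(w, x) for w in weights]
    winner = dists.index(min(dists))
    for j, w in enumerate(weights):
        if abs(j - winner) <= radius:  # neighborhood on a 1-D map
            weights[j] = [wi + lr * (xi - wi) for wi, xi in zip(w, x)]
    return winner

# Two neurons; the one nearest the input moves toward it (radius=0: winner only)
ws = [[0.0, 0.0], [1.0, 1.0]]
winner = kohonen_step(ws, [0.9, 0.9], radius=0)
```

Only inputs are provided: the grouping of neurons emerges from repeated updates, and because neighbors move together, nearby neurons come to represent nearby inputs, preserving topological order.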
18
Traditional Computing Vs. NN Technology
CHARACTERISTICS    TRADITIONAL COMPUTING            ARTIFICIAL NEURAL NETWORKS
PROCESSING STYLE   Sequential                       Parallel
FUNCTIONS          Logic via Rules, Concepts,       Mapping via Images, Pictures,
                   Calculations                     and Controls
LEARNING METHOD    By Rules                         By Example
APPLICATIONS       Accounting, Word Processing,     Sensor Processing, Speech Recognition,
                   Communications, Computing        Pattern Recognition, Text Recognition
19
Traditional Computing Vs. NN Technology
CHARACTERISTICS    TRADITIONAL COMPUTING            ARTIFICIAL NEURAL NETWORKS
PROCESSORS         VLSI (Traditional)               ANN and Other Technologies
APPROACH           One Rule at a Time, Sequential   Multiple Processing, Simultaneous
CONNECTIONS        Externally Programmable          Dynamically Self-Programmable
LEARNING           Algorithmic                      Continuously Adaptable
FAULT TOLERANCE    None                             Significant, via Neurons
PROGRAMMING        Rule-Based                       Self-Learning
ABILITY TO TEST    Need Big Processors              Require Multiple Custom-Built Chips
20
HISTORY OF NEURAL NETWORKS
TIME PERIOD   NEURAL NETWORK ACTIVITY
Early 1950s   IBM attempts to simulate the human thought process fail; traditional computing progresses rapidly
1956          Dartmouth Research Project on AI
1959          Stanford: Bernard Widrow's ADALINE/MADALINE, the first NN applied to a real-world problem
1960s         PERCEPTRON, by Cornell neuro-biologist Rosenblatt
1982          Hopfield (CalTech) models the brain for devices; Japanese 5th Generation Computing
1985          NN Conference by IEEE, spurred by the Japanese threat
1989          US Defense sponsors several projects
Today         Several commercial applications; still processing limitations in chips (digital, analog, optical)