1
Neural Networks And Its Applications
  • By
  • Dr. Surya Chitra

2
OUTLINE
  • Introduction & Software
  • Basic Neural Network Processing
  • Software Exercise & Problem/Project
  • Complementary Technologies
  • Genetic Algorithms
  • Fuzzy Logic
  • Examples of Applications
  • Manufacturing
  • R&D
  • Sales & Marketing
  • Financial

3
Details of Neural Networks
  • ANNs - Class of parallel processing architectures
  • Loosely based on Biology
  • Limited understanding of the Brain
  • Unknowns - build hypothesis & verify model
  • NN computing is about machines, not Brains

4
Processing Steps Inside a Neuron: Electronic Implementation
[Diagram: a processing element. The summed inputs are combined (sum, min, max, mean, or OR/AND), a bias weight is added, and the result is passed through a transform (sigmoid, hyperbola, sine, or linear) to produce the outputs.]
5
Major Components
  • Weighting Factors
  • Give each input the importance it needs
  • Weights are adaptive coefficients
  • Summation Function
  • Weighted sum of all inputs (see the sketch below), or
  • They can be combined in different forms
  • Min, max, or several normalizing functions
  • Activation - may vary w.r.t. time
  • Simple algorithms do not use it
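A minimal Python sketch of the summation function; the function name, parameters, and defaults are illustrative assumptions, not from the presentation:

# Hypothetical sketch of a processing element's summation step.
def combine(inputs, weights, bias=0.0, mode="sum"):
    weighted = [x * w for x, w in zip(inputs, weights)]
    if mode == "min":                        # alternative combining forms
        return min(weighted)
    if mode == "max":
        return max(weighted)
    if mode == "mean":
        return sum(weighted) / len(weighted)
    return sum(weighted) + bias              # default: weighted sum plus bias weight

combine([0.5, 1.0], [0.2, -0.4])  # -> -0.3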

6
Major Components
  • Transfer Function (TF)
  • Summed input is compared with a threshold
  • More than the threshold - a signal is generated
  • Sometimes the threshold is (+) or (-)
  • Some algorithms add noise
  • Use uniform/varying random noise
  • Some algorithms vary the gain of TF

7
Sigmoid Transfer Function
Transfer Function = 1 / (1 + e^(-sum))
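A small Python sketch of this transfer function (the function name is mine):

import math

def sigmoid(total):
    # Transfer Function = 1 / (1 + e^(-sum))
    return 1.0 / (1.0 + math.exp(-total))

sigmoid(0.0)  # 0.5, the midpoint
sigmoid(4.0)  # ~0.98, large summed inputs saturate toward 1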
8
Major Components
  • Scaling and Limiting (see the sketch below)
  • Scaling
  • Multiplies by a scale factor & adds an offset
  • Limiting
  • Checks for lower & upper bounds
  • Output Function
  • Most algorithms do not have one
  • Some inhibit output based on competition
  • Inhibited elements do not participate in learning
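A hypothetical sketch of the scaling-and-limiting step; the parameter names and defaults are assumptions:

# Multiply by a scale factor & add an offset, then clamp to the bounds.
def scale_and_limit(value, scale=1.0, offset=0.0, lower=0.0, upper=1.0):
    scaled = value * scale + offset          # scaling
    return max(lower, min(upper, scaled))    # limiting to lower & upper bounds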

9
Major Components
  • Error Function
  • Difference between desired & current outputs (sketched below)
  • Raw or absolute error
  • Square/cube the error
  • Transformed or scaled
  • Learning Function
  • Used to modify the connection weights
  • Sometimes referred to as Adaptation
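A minimal sketch of the error-function variants above; the function names are mine:

# Each variant measures the gap between desired & current output.
def raw_error(desired, current):
    return desired - current

def absolute_error(desired, current):
    return abs(desired - current)

def squared_error(desired, current):
    return (desired - current) ** 2          # squaring penalizes large errors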

10
Teaching the Network
  • Supervised Learning
  • Most ANNs trained this way
  • Compare actual with desired outputs & minimize errors
  • Complete when a desired performance level is
    reached
  • Training data should be large & varied
  • Cross-validate the model
  • Compare with data the network was not trained on
  • Sometimes referred to as the test data set
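A minimal sketch of holding out a test set for cross-validation; the 80/20 split ratio is an assumed convention, not from the slides:

# `samples` is assumed to be a list of (inputs, desired_output) pairs.
def split_data(samples, train_fraction=0.8):
    cut = int(len(samples) * train_fraction)
    return samples[:cut], samples[cut:]      # training set, held-out test set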

11
Teaching the Network
  • Unsupervised Learning
  • Promise of the future for ANNs
  • Very few in use
  • No external sources to adjust weights
  • Look for regularities or trends
  • Co-operation among clusters
  • Inhibitory effect
  • Competition between clusters
  • Need more research in this area

12
Learning Rates
  • Slower rate
  • More adequate training
  • Lot more time
  • Faster rate
  • Inadequate to make fine distinctions
  • Less time
  • Software uses Learning Constants
  • Between 0 and 1

13
Learning Laws
  • Hebb's Rule
  • Hopfield Law
  • Delta Rule (Least Mean Square Rule)
  • The Gradient Descent Rule
  • Kohonen's Learning Law
  • Compete to learn
  • Winner survives

14
Learning Laws
  • Hebb's Rule
  • If a neuron receives an output from another
    neuron, and if both are highly active (both have
    same sign), the weight between the neurons should
    be strengthened.
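A minimal sketch of this update; the function name and the learning constant are assumptions:

# Hebb's rule: the product is positive when both activations have the same
# sign, so the connection between two highly active neurons is strengthened.
def hebb_update(weight, pre_activity, post_activity, rate=0.1):
    return weight + rate * pre_activity * post_activity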

15
Learning Laws
  • Hopfield Law
  • If the desired output and the input are both
    active or both inactive, increment the connection
    weight by the learning rate, otherwise decrement
    the weight by the learning rate.

Similar to Hebb's rule, but it specifies the magnitude of the change.
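A minimal sketch, assuming boolean activity flags and an illustrative learning rate:

# Hopfield law: the learning rate sets the magnitude of the weight change.
def hopfield_update(weight, input_active, desired_active, rate=0.1):
    if input_active == desired_active:       # both active or both inactive
        return weight + rate                 # increment by the learning rate
    return weight - rate                     # otherwise decrement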
16
Learning Laws
  • The Delta Rule
  • Continuously modifying the strengths of the
    input connections to reduce the difference (the
    delta) between the desired output value and the
    actual output of a processing element. Derivative
    of the transfer function is used.

A variation of Hebb's rule. The most commonly used.
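A sketch of the delta rule for a single weight, assuming the sigmoid transfer function from slide 7; the variable names are illustrative:

import math

def delta_update(weight, x, summed_input, desired, rate=0.1):
    out = 1.0 / (1.0 + math.exp(-summed_input))   # current output (sigmoid)
    delta = desired - out                          # the "delta" error
    slope = out * (1.0 - out)                      # derivative of the sigmoid
    return weight + rate * delta * slope * x       # reduce the difference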
17
Learning Laws
  • The Gradient Descent Rule
  • The derivative of the transfer function is still
    used to modify the delta error, and an additional
    proportional constant tied to the learning rate
    is used to adjust the weights.

Extension of Delta rule. Very commonly used.
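A sketch of this update; reading the "additional proportional constant" as a momentum term is my assumption, not something the slide states:

# Gradient descent with a momentum-style proportional term (an assumption).
def gradient_descent_update(weight, prev_change, gradient,
                            rate=0.1, momentum=0.9):
    change = rate * gradient + momentum * prev_change
    return weight + change, change       # new weight, change to carry forward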
18
Learning Laws
  • Kohonen's Learning Law
  • The processing elements compete for the
    opportunity to learn, or update their weights.
    The processing element with the largest output
    wins and has the capability to inhibit its
    competitors and excite its neighbors.

Inspired by learning in biological systems.
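A minimal winner-take-all sketch; in this simplification only the winner learns (inhibiting the losers by leaving them unchanged), while a fuller version would also excite and update the winner's neighbors:

# Kohonen-style competition: the element with the largest output wins and
# moves its weights toward the input; the losers are left unchanged.
def kohonen_update(weight_rows, inputs, rate=0.1):
    outputs = [sum(w * x for w, x in zip(row, inputs)) for row in weight_rows]
    winner = outputs.index(max(outputs))             # largest output wins
    weight_rows[winner] = [w + rate * (x - w)
                           for w, x in zip(weight_rows[winner], inputs)]
    return winner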
19
Processing Element
20
Network Selector
21
Basic Application Categories
  • Prediction
  • Classification
  • Data Association
  • Data Conceptualization
  • Data Filtering

22
Market Forecasting Example
  • OBJECTIVE
  • Estimate product demand to plan for capacity
    expansion, capital spending, product development,
    and plant staffing. (Plant was located in Brazil)

23
Forecasting Variables
  • MACROECONOMIC INDICES
  • Inflation
  • Ind. Consumer Goods Production
  • Ind. Intermediate Goods Production
  • Industrial Food Production
  • Energy Usage in the Food Industry
  • Income in Metropolitan Area
  • Consumer Buying Power

24
Forecasting Using Neural Networks
  • Analysis
  • DATA from 1985 to 1989
  • Monthly Product Sales Volume
  • 7 Economic Indices for Each Month
  • Time Series Model with up to 3 lags
  • Non-linear NN Model
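A sketch of how lagged training samples for such a model might be built; the data shapes are assumptions, since the actual data set is not shown:

# Predict each month's sales from the previous 3 months of sales plus that
# month's 7 economic indices (shapes assumed for illustration).
def make_lagged_samples(monthly_sales, monthly_indices, n_lags=3):
    samples = []
    for t in range(n_lags, len(monthly_sales)):
        features = monthly_sales[t - n_lags:t] + monthly_indices[t]
        samples.append((features, monthly_sales[t]))  # (inputs, target)
    return samples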