Neural Networks: Multilayer Perceptron (MLP) - Oscar Herrera Alcántara - PowerPoint PPT Presentation

Transcript and Presenter's Notes


1
Neural Networks: Multilayer Perceptron (MLP)
Oscar Herrera Alcántara
heoscar@yahoo.com
2
Outline
  • Neuron
  • Artificial neural networks
  • Activation functions
  • Perceptrons
  • Multilayer perceptrons
  • Backpropagation
  • Generalization

3
Neuron
  • A neuron is a cell in the brain
  • collection, processing, and dissemination of
    electrical signals
  • neurons of > 20 types, synapses,
    1 ms-10 ms cycle time
  • the brain's information processing relies on
    networks of such neurons

4
Biological Motivation
  • dendrites: nerve fibers carrying electrical
    signals to the cell
  • cell body: computes a non-linear function of its
    inputs
  • axon: single long fiber that carries the
    electrical signal from the cell body to other
    neurons
  • synapse: the point of contact between the axon of
    one cell and the dendrite of another, regulating
    a chemical connection whose strength affects the
    input to the cell

5
Artificial neural networks
  • A mathematical model of the neuron is the
    McCulloch-Pitts unit
  • Neural networks consist of nodes (units)
    connected by directed links

[Diagram: model of neuron i. Inputs x1, x2, x3, …, xm and a fixed input 1 (with
bias weight b) enter through synaptic weights wi1, …, wim; the summing junction
S produces the induced local field (activation potential) v, which the
activation function φ(v) maps to the output y]
  • A bias weight wi0 is connected to a fixed input
    xi0 = 1

6
Activation functions
  1. Step function or Threshold function
  2. Sigmoid function
  3. Hyperbolic tangent function
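
These three functions can be sketched in Python as follows (the sigmoid slope parameter a is an assumed addition, not from the slides):

```python
import math

def step(v):
    # 1. Step (threshold) function: fires when the induced local field is non-negative
    return 1.0 if v >= 0 else 0.0

def sigmoid(v, a=1.0):
    # 2. Sigmoid (logistic) function; a is an assumed slope parameter
    return 1.0 / (1.0 + math.exp(-a * v))

def tanh(v):
    # 3. Hyperbolic tangent, with outputs in (-1, 1)
    return math.tanh(v)

print(step(0.5), sigmoid(0.0), tanh(0.0))  # -> 1.0 0.5 0.0
```

Unlike the step function, the sigmoid and hyperbolic tangent are differentiable everywhere, which back-propagation relies on.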

7
Perceptron learning
  • Learn by adjusting weights to reduce error on
    training set
  • Error correction learning rule
  • Perform optimization search by gradient descent
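
The error-correction rule adjusts each weight in proportion to the error e(n) = d(n) - y(n) and the corresponding input: w ← w + η·e·x. A minimal sketch (the choice η = 1 and the AND training set are illustrative assumptions):

```python
def train_perceptron(samples, eta=1, epochs=20):
    # Error-correction learning rule: w <- w + eta * (d - y) * x
    w = [0, 0]   # synaptic weights
    b = 0        # bias weight
    for _ in range(epochs):
        for x, d in samples:
            v = b + sum(wi * xi for wi, xi in zip(w, x))  # induced local field
            y = 1 if v >= 0 else 0                        # step activation
            err = d - y                                   # desired minus actual output
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
            b += eta * err
    return w, b

# Learn logical AND, which is linearly separable
and_set = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_set)
print([1 if b + w[0]*x0 + w[1]*x1 >= 0 else 0 for (x0, x1), _ in and_set])  # -> [0, 0, 0, 1]
```

With a step activation this update is the classic perceptron rule; with a differentiable activation the same search becomes gradient descent on the squared error.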

8
Implementing logic functions
  • A single McCulloch-Pitts unit can implement the
    basic Boolean functions AND, OR, and NOT, and
    networks of such units can implement any Boolean
    function
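
As an illustration, the three basic gates can be realized with hand-chosen weights and thresholds (the specific values below are conventional choices, not taken from the slides):

```python
def unit(weights, bias, inputs):
    # McCulloch-Pitts unit: threshold on the weighted sum plus bias
    v = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if v >= 0 else 0

AND = lambda x1, x2: unit([1, 1], -1.5, [x1, x2])   # fires only when both inputs are 1
OR  = lambda x1, x2: unit([1, 1], -0.5, [x1, x2])   # fires when at least one input is 1
NOT = lambda x1:     unit([-1], 0.5, [x1])          # inverts its single input

print([AND(1, 1), AND(1, 0), OR(0, 1), NOT(1)])  # -> [1, 0, 1, 0]
```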

9
Expressiveness of perceptrons
  • A perceptron
  • can represent AND, OR, NOT
  • can represent a linear separator (function) in
    input space

10
Multilayer Perceptron (MLP) Architecture
Bias
11
Solve XOR problem using MLPs
  • A two-layer network with two nodes in the hidden
    layer
  • The hidden layer maps the points from a
    non-linearly separable space to a linearly
    separable space
  • The output layer finds a decision line
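
A minimal sketch of this construction with hand-chosen weights (the particular hidden units, x1 OR x2 and x1 AND x2, are one standard choice, not necessarily the one on the slide):

```python
def step_unit(weights, bias, inputs):
    # Threshold unit: fires when the induced local field is non-negative
    v = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if v >= 0 else 0

def xor(x1, x2):
    # Hidden layer: maps the four input points into a linearly separable space
    h1 = step_unit([1, 1], -0.5, [x1, x2])     # x1 OR x2
    h2 = step_unit([1, 1], -1.5, [x1, x2])     # x1 AND x2
    # Output layer: a single decision line over (h1, h2)
    return step_unit([1, -1], -0.5, [h1, h2])  # h1 AND NOT h2

print([xor(0, 0), xor(0, 1), xor(1, 0), xor(1, 1)])  # -> [0, 1, 1, 0]
```

In (h1, h2) space the four XOR points collapse to three, and a single line separates the positive point (1, 0) from (0, 0) and (1, 1).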


12
Back-propagation Algorithm
1. Initialization: weights are initialized with
random values whose mean is zero.
2. Presentation of training examples.
3. Forward computation.
4. Backward computation: compute the local gradient
δj(n) for neuron j of hidden layer l and for
neuron j of the output layer L.
5. Iteration: repeat steps 2 to 4 until E <
desired error; α, the momentum parameter, and
η, the learning-rate parameter, are adjusted.
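
The five steps can be sketched for a 2-2-1 network (the XOR training set, the values of η and α, and the fixed epoch budget are illustrative assumptions):

```python
import math, random

random.seed(0)
sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # assumed: XOR
eta, alpha = 0.5, 0.9    # learning rate and momentum (assumed values)

# 1. Initialization: random weights with zero mean (index 2 is the bias weight)
w_h = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-0.5, 0.5) for _ in range(3)]
dw_h = [[0.0] * 3 for _ in range(2)]   # previous updates, for the momentum term
dw_o = [0.0] * 3

def forward(x1, x2):
    x = [x1, x2, 1.0]
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_h]
    y = sigmoid(sum(w * hi for w, hi in zip(w_o, h + [1.0])))
    return x, h, y

def error():
    return sum((d - forward(x1, x2)[2]) ** 2 for (x1, x2), d in data)

e0 = error()
for _ in range(2000):                      # 5. Iteration over steps 2-4
    for (x1, x2), d in data:               # 2. Presentation of training examples
        x, h, y = forward(x1, x2)          # 3. Forward computation
        # 4. Backward computation of the local gradients
        delta_o = (d - y) * y * (1 - y)    # neuron of the output layer L
        delta_h = [delta_o * w_o[j] * h[j] * (1 - h[j])
                   for j in range(2)]      # neurons of the hidden layer l
        # Weight update with momentum: dw(n) = alpha * dw(n-1) + eta * delta * input
        for k, hi in enumerate(h + [1.0]):
            dw_o[k] = alpha * dw_o[k] + eta * delta_o * hi
            w_o[k] += dw_o[k]
        for j in range(2):
            for k, xi in enumerate(x):
                dw_h[j][k] = alpha * dw_h[j][k] + eta * delta_h[j] * xi
                w_h[j][k] += dw_h[j][k]

print(round(e0, 3), round(error(), 3))     # squared error before and after training
```

Here a fixed number of epochs stands in for the "repeat until E < desired error" stopping test of step 5.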
13
MLP Training
[Diagram: a layered network with neurons indexed i, j, k; signals flow left to
right in the forward pass, and error terms flow right to left in the backward
pass]
  • Forward Pass
  • Fix wji(n)
  • Compute yj(n)
  • Backward Pass
  • Calculate δj(n)
  • Update weights wji(n+1)
14
Generalization
  • The data set is divided into two parts
  • Training data (80%)
  • The MLP is trained with the training data
  • Test data (20%)
  • The MLP is tested with the test data
  • Generalization
  • The MLP is used with inputs that have never been
    presented, in order to predict the outputs
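
The 80/20 split described above can be sketched as follows (the shuffling, the seed, and the 100-sample data set are illustrative assumptions):

```python
import random

def train_test_split(data, train_frac=0.8, seed=0):
    # Shuffle, then cut: the first 80% trains the MLP, the remaining 20% tests it
    rows = list(data)
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * train_frac)
    return rows[:cut], rows[cut:]

samples = list(range(100))           # 100 stand-in samples
train, test = train_test_split(samples)
print(len(train), len(test))         # -> 80 20
```

Keeping the test data out of training is what makes the test error a fair estimate of generalization.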