1
Simple Perceptrons
  • Or one-layer feed-forward networks

2
Perceptrons or Layered Feed-Forward Networks
3
Equation governing the computation of a simple perceptron:

    O_i = g(h_i) = g\left( \sum_j w_{ij} \xi_j \right)

where \xi_j (ksi) are the components of the input pattern
and g is the activation function, usually nonlinear, e.g. a
step function or a sigmoid.
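A minimal NumPy sketch of this computation (the names step, W, and xi are illustrative, not from the slides):

    import numpy as np

    def step(h):
        # Step activation: +1 for non-negative net input, -1 otherwise
        return np.where(h >= 0, 1, -1)

    def perceptron_output(W, xi):
        # h_i = sum_j W[i, j] * xi[j];  O_i = g(h_i)
        return step(W @ xi)

    W = np.array([[0.5, -0.3],
                  [0.2,  0.8]])       # 2 output units, 2 inputs
    xi = np.array([1.0, -1.0])        # input pattern (the ksi vector)
    print(perceptron_output(W, xi))   # [ 1 -1]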
4
Threshold or no threshold?

With threshold:

    O_i = g\left( \sum_j w_{ij} \xi_j - \theta_i \right)

Without threshold: the threshold is simulated with a
connection to an input terminal permanently tied to -1,
i.e. set \xi_0 = -1 and w_{i0} = \theta_i, so that

    O_i = g\left( \sum_{j=0} w_{ij} \xi_j \right)
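A hedged sketch of this equivalence (theta and the augmented vectors are illustrative names):

    import numpy as np

    def step(h):
        return np.where(h >= 0, 1, -1)

    w = np.array([0.5, -0.3])
    theta = 0.2                        # threshold of the unit
    xi = np.array([1.0, 1.0])

    # With an explicit threshold: O = g(w . xi - theta)
    out_with_threshold = step(w @ xi - theta)

    # Threshold absorbed into the weights: append an input clamped
    # to -1 whose weight equals theta, then use the threshold-free form
    w_aug = np.append(w, theta)
    xi_aug = np.append(xi, -1.0)
    out_without_threshold = step(w_aug @ xi_aug)

    assert out_with_threshold == out_without_threshold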
5
The General Association (Matching) Task

is to ask that the actual output pattern equal the target
pattern:

    O_i^\mu = \zeta_i^\mu  for every output i and pattern \mu
6
Threshold Units
  • Start with the simplest threshold unit, practical for
    1-level perceptrons
  • Also assume the targets have plus/minus 1 values
    and no values in between those extremes, that is,
    \zeta_i^\mu = \pm 1
  • Then all that matters is that, for each input
    pattern, the net input (weighted sum) h_i^\mu to each
    output unit has the same sign as the target \zeta_i^\mu,
    i.e. \zeta_i^\mu h_i^\mu > 0
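This sign condition can be checked directly; a small sketch in the same notation (the function name is illustrative):

    import numpy as np

    def targets_satisfied(W, patterns, targets):
        # Require sign(h_i^mu) == zeta_i^mu for every pattern mu and
        # output i, i.e. zeta_i^mu * h_i^mu > 0
        return all(np.all(zeta * (W @ xi) > 0)
                   for xi, zeta in zip(patterns, targets))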

7
A Notational Simplification
  • To simplify notation, note that the output units
    are independent
  • In a multilayer nn, however, the hidden
    (non-output) layers arent independent
  • So lets consider only one output at a time
  • Drop the i subscripts

Weights and each input pattern live in the same
space. Advantage can geometrically represent
these two vectors together.
8
New Form for the General Association Task (geometric
interpretation)
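A hedged reconstruction of the two forms on this slide, following the standard treatment in the blue book (Hertz, Krogh and Palmer):

With \pm 1 targets, the condition that h^\mu and \zeta^\mu agree in sign reads

    \zeta^\mu h^\mu = \zeta^\mu \, \mathbf{w} \cdot \boldsymbol{\xi}^\mu > 0  \quad \text{for all } \mu

Another form: define \mathbf{x}^\mu = \zeta^\mu \boldsymbol{\xi}^\mu; the task then becomes

    \mathbf{w} \cdot \mathbf{x}^\mu > 0

i.e., geometrically, the weight vector must lie on the positive side of
the hyperplane perpendicular to every \mathbf{x}^\mu.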
9
A simple learning algorithm
  • Also called the Perceptron Rule
  • Go through the input patterns one by one
  • For each pattern, go through the output units one
    by one, asking whether the output is the desired one
  • If so, leave the weights into that unit alone
  • Else, in the spirit of Hebb, add to each connection
    something proportional to the product of the input
    and the desired output (see the sketch after this
    list)
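A sketch of one sweep of this rule over several output units (eta and the array shapes are assumptions, not from the slides):

    import numpy as np

    def perceptron_rule_sweep(W, patterns, targets, eta=1.0):
        # W is a float array of shape (outputs, inputs);
        # patterns and targets are lists of +/-1 vectors
        for xi, zeta in zip(patterns, targets):
            O = np.where(W @ xi >= 0, 1, -1)       # current outputs
            for i in range(W.shape[0]):
                if O[i] != zeta[i]:                # wrong output unit
                    W[i] += eta * zeta[i] * xi     # Hebb-like correction
        return W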

10
Simplified Simple Learning Algorithm (for the
one-neuron case)
  • Start with w = 0 (not necessary)
  • Cycle through the learning patterns
  • For each pattern \xi (ksi):
  • If the output O \neq the desired output \zeta (zeta),
    add the product of the desired output and the input
    to w (i.e., w \leftarrow w + \zeta \xi)
  • Keep cycling through the patterns until done
  • Convergence is guaranteed provided the two
    classes of input points are linearly separable
  • The perceptron convergence theorem guarantees this;
    a runnable sketch follows
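A runnable sketch of the one-neuron algorithm (the AND-style toy data and the max_epochs cap are assumptions for illustration):

    import numpy as np

    def train_perceptron(patterns, targets, max_epochs=1000):
        # Assumes targets in {-1, +1}; converges only if the two
        # classes are linearly separable (perceptron convergence theorem)
        w = np.zeros(len(patterns[0]))            # start with w = 0
        for _ in range(max_epochs):
            errors = 0
            for xi, zeta in zip(patterns, targets):
                O = 1 if w @ xi >= 0 else -1
                if O != zeta:
                    w = w + zeta * xi             # w <- w + zeta * xi
                    errors += 1
            if errors == 0:                       # all patterns correct
                return w
        raise RuntimeError("no convergence; data may not be separable")

    # Toy data: AND of two +/-1 inputs, with a bias input tied to -1
    patterns = [np.array([a, b, -1.0]) for a in (-1, 1) for b in (-1, 1)]
    targets = [1 if p[0] > 0 and p[1] > 0 else -1 for p in patterns]
    print(train_perceptron(patterns, targets))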

11
Weight Update Formula: "Hebbian", from the blue book;
too complicated
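The rule this likely refers to, in the blue book's (Hertz, Krogh and Palmer) notation, is

    \Delta w_{ij} = \eta \, \Theta(-\zeta_i^\mu h_i^\mu) \, \zeta_i^\mu \xi_j^\mu

where \Theta is the Heaviside step function, so the Hebb-like term
\zeta_i^\mu \xi_j^\mu is applied only when the output is wrong
(\zeta_i^\mu h_i^\mu < 0); this reduces to the simple rule on the
previous slides.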