Computational Properties of Perceptron Networks

1
Computational Properties of Perceptron Networks
  • CS/PY 399 Lab Presentation #3
  • January 25, 2001
  • Mount Union College

2
Review
  • Problem: choose a set of weight and threshold
    values that produce a certain output for specific
    inputs
  • Example:
        x1   x2   y
         0    0   0
         0    1   0
         1    0   1
         1    1   0

3
Inequalities for this Problem
  • for each input pair, sum = x1w1 + x2w2
  • since the xi's are either 0 or 1, the sums can be
    simplified to:
        x1   x2   sum
         0    0   0
         0    1   w2
         1    0   w1
         1    1   w1 + w2

4
Inequalities for this Problem
  • output is 0 if sum < θ, or 1 if sum > θ
  • we obtain 4 inequalities, one for each possible
    input pair:
        x1   x2   y    inequality
         0    0   0    0 < θ
         0    1   0    w2 < θ
         1    0   1    w1 > θ
         1    1   0    w1 + w2 < θ

5
Choosing Weights and θ Based on these Inequalities
  • 0 < θ means that θ can be any positive value;
    arbitrarily choose θ = 4.5
  • w2 < θ, so pick a weight smaller than 4.5 (say
    1.2)
  • w1 > θ, so let's choose w1 = 6.0
  • w1 + w2 < θ? Oops, our values don't work!
    (6.0 + 1.2 = 7.2, which is not less than 4.5.)
    This means we'll have to adjust our values

6
Choosing Weights and θ Based on these Inequalities
  • we know that w1 must be larger than θ, which must
    be positive, yet the sum of w1 and w2 must be
    LESS THAN θ
  • the only way this can happen is if w2 is NEGATIVE
  • does w2 = -1.0 work?
  • how about w2 = -2.0?
  • Still guesswork, but with some guidance (a quick
    check in code is sketched below)
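
The two guesses above are easy to test mechanically. Here is a
minimal sketch (Python; the function and values are our own
illustration, not part of the lab) that checks each candidate w2
against the target truth table from the Review slide:

    # Minimal sketch: test candidate weights against the target table
    # (output 1 only for x1 = 1, x2 = 0), using w1 = 6.0 and theta = 4.5
    # from the previous slide.
    def perceptron(x1, x2, w1, w2, theta):
        """Threshold unit: 1 if the weighted sum exceeds theta, else 0."""
        return 1 if x1 * w1 + x2 * w2 > theta else 0

    target = {(0, 0): 0, (0, 1): 0, (1, 0): 1, (1, 1): 0}
    w1, theta = 6.0, 4.5
    for w2 in (-1.0, -2.0):
        ok = all(perceptron(x1, x2, w1, w2, theta) == y
                 for (x1, x2), y in target.items())
        print("w2 =", w2, "->", "works" if ok else "fails")

Running it shows that w2 = -1.0 fails (the (1, 1) case gives
6.0 - 1.0 = 5.0 > 4.5) while w2 = -2.0 works.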

7
A more systematic approach
  • try this example from Lab 2
  • Example:
        x1   x2   y
         0    0   0
         0    1   1
         1    0   0
         1    1   0
  • First, 0 < θ, so pick θ = 7
  • Next, w2 > θ, say w2 = 10

8
A more systematic approach
  • Now consider w1 + w2 < θ, i.e., w1 + 10 < 7
  • Solving this for w1, we find that any value of
    w1 < -3 will work
  • also, w1 < θ, i.e., w1 < 7
  • this constraint is already satisfied by any value
    of w1 less than -3
  • Try these weights and threshold to see if they
    work (a check is sketched below)
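
A quick check of these choices (again an illustrative sketch;
w1 = -4.0 is one arbitrary value below -3):

    # Verify theta = 7, w2 = 10, and a sample w1 < -3 against the table.
    w1, w2, theta = -4.0, 10.0, 7.0
    for x1, x2, y in [(0, 0, 0), (0, 1, 1), (1, 0, 0), (1, 1, 0)]:
        out = 1 if x1 * w1 + x2 * w2 > theta else 0
        print(x1, x2, "->", out, "expected", y)

All four rows match the target outputs.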

9
An Example
  • Can you find a set of weights and a threshold
    value to compute this output?
  • Example:
        x1   x2   y
         0    0   1
         0    1   0
         1    0   1
         1    1   1

10
Back to Last Week's Lab
  • We found that a single perceptron could not
    compute the XOR function
  • Solution: set up one perceptron to detect whether
    x1 = 1 and x2 = 0
  • set up another perceptron for x1 = 0 and x2 = 1
  • process the outputs of these two perceptrons and
    produce an output of 1 if either produces a 1
    output value (one possible set of weights is
    sketched below)
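
One concrete network that realizes this scheme is shown below; the
specific weights and thresholds are our own illustrative choices, not
values from the lab:

    # Minimal sketch of the two-perceptron XOR scheme described above.
    def unit(inputs, weights, theta):
        """Threshold unit: 1 if the weighted sum exceeds theta, else 0."""
        total = sum(x * w for x, w in zip(inputs, weights))
        return 1 if total > theta else 0

    def xor_net(x1, x2):
        a = unit((x1, x2), (1.0, -1.0), 0.5)   # detects x1 = 1 and x2 = 0
        b = unit((x1, x2), (-1.0, 1.0), 0.5)   # detects x1 = 0 and x2 = 1
        return unit((a, b), (1.0, 1.0), 0.5)   # outputs 1 if either fired

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "->", xor_net(x1, x2))   # prints XOR truth table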

11
A Nightmare!
  • Even for this simple example, choosing the
    weights that cause a network to compute the
    desired output takes skill and lots of patience
  • Much more difficult than programming a
    conventional computer, where each function is a
    one-line test (see below):
  • OR function: if x1 + x2 >= 1, output 1; otherwise
    output 0
  • XOR function: if x1 + x2 = 1, output 1; otherwise
    output 0
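
For contrast, a conventional program states both functions in a line
each (a sketch, our own illustration, assuming binary inputs):

    def OR(x1, x2):
        return 1 if x1 + x2 >= 1 else 0    # true when either input is 1

    def XOR(x1, x2):
        return 1 if x1 + x2 == 1 else 0    # true when exactly one input is 1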

12
There must be a better way.
  • These labs were designed to show that manually
    adjusting weights is tedious and difficult
  • This is not what happens in nature
  • "What weight should I choose for this
    connection?"
  • Formal training methods exist that allow networks
    to learn by updating weights automatically
    (explored next week)

13
Expanding to More Inputs
  • artificial neurons may have many more than two
    input connections
  • calculation performed is the same: multiply each
    input by the weight of the connection, and find
    the sum of all of these products
  • notation can become unwieldy
  • sum = x1w1 + x2w2 + x3w3 + ... + x100w100

14
Some Mathematical Notation
  • Most references (e.g., the Plunkett & Elman text)
    use mathematical notation
  • Sums of large numbers of terms are represented
    with Sigma (Σ) notation
  • the previous sum is denoted as
        100
         Σ   xk wk
        k = 1
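
In code, Sigma notation corresponds directly to a sum over an index
range (a sketch; the x and w values are arbitrary placeholders just to
have something to sum):

    # Sigma notation as a loop: sum of x[k] * w[k] for k = 1 .. 100.
    # Python lists are 0-indexed, so k runs over range(100) here.
    x = [0.5] * 100
    w = [2.0] * 100
    total = sum(x[k] * w[k] for k in range(100))
    print(total)   # 100 terms of 0.5 * 2.0 -> 100.0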

15
Summation Notation Basics
  • Terms are described once, generally
  • Index variable shows the range of possible values
  • Example:
         5
         Σ   k / (k - 1)  =  3/2 + 4/3 + 5/4
        k = 3

16
Summation Notation Example
  • Write the following sum using Sigma notation:
  • 3x0 + 4x1 + 5x2 + 6x3 + 7x4 + 8x5 + 9x6 + 10x7
  • Answer:
         7
         Σ   (k + 3) xk
        k = 0

17
Vector Notation
  • The most compact way to specify values for inputs
    and weights when we have many connections
  • The ORDER in which values are specified is
    important
  • Example: if w1 = 3.5, w2 = 1.74, and w3 = 18.2,
    we say that the weight vector w = (3.5, 1.74,
    18.2)

18
Vector Operations
  • Vector Addition: adding two vectors means adding
    the values from the same position in each vector
  • result is a new vector
  • Example: (9.2, 0, 17) + (1, 2, 3) = (10.2, 2, 20)
  • Vector Subtraction: subtract corresponding
    values
  • (9.2, 0, 17) - (1, 2, 3) = (8.2, -2, 14)
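
Both operations are one-liners in code (a sketch using plain Python
tuples; zip pairs up corresponding positions):

    # Elementwise vector addition and subtraction on plain tuples.
    a, b = (9.2, 0, 17), (1, 2, 3)
    add = tuple(x + y for x, y in zip(a, b))   # (10.2, 2, 20)
    sub = tuple(x - y for x, y in zip(a, b))   # (8.2, -2, 14)
    print(add, sub)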

19
Vector Operations
  • Dot Product: the mathematical name for what a
    perceptron does
  • x · m = x1m1 + x2m2 + x3m3 + ... + xlastmlast
  • Result of a Dot Product is a single number
  • Example: (9.2, 0, 17) · (1, 2, 3) = 9.2 + 0 +
    51 = 60.2
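
In the same sketch style, the dot product is multiply-then-sum, which
is exactly the weighted sum a perceptron computes over its inputs:

    # Dot product: multiply corresponding entries, then sum.
    x, m = (9.2, 0, 17), (1, 2, 3)
    dot = sum(xi * mi for xi, mi in zip(x, m))
    print(dot)   # 9.2 + 0 + 51 = 60.2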

20
Computational Properties of Perceptron Networks
  • CS/PY 399 Lab Presentation #3
  • January 25, 2001
  • Mount Union College