Linear Discriminant Functions, The Perceptron Model, The Gradient Descent Procedures

Transcript and Presenter's Notes
1
Chapter 5
Linear Discriminant Functions, The Perceptron Model, The Gradient Descent Procedures
2
Linear Discriminant Functions
  • Objective: design discriminant functions that are linear in x and define decision boundaries (a minimal sketch of such a function follows this list)
  • How: by formulating the problem as one of minimizing a criterion function, based on the perceptron model
  • Perceptron what? You'll see!
  • What criterion function? A cost function to be minimized, such as the training error
  • Is it difficult? Yes
  • Why? Because small training error need not mean small test error!
  • So what do we do? Use gradient descent optimization approaches. Not guaranteed, but the de facto standard in engineering optimization problems
  • Why should I care? Because it will be on the exam
  • ...that you will take over and over again after you graduate!!!
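A linear discriminant has the form g(x) = wᵀx + w0, and the decision boundary is the hyperplane g(x) = 0. A minimal sketch of the two-category decision rule in Python (the weight values here are hypothetical, chosen only for illustration):

    import numpy as np

    def g(x, w, w0):
        """Linear discriminant: g(x) = w^T x + w0."""
        return np.dot(w, x) + w0

    # Hypothetical weights defining the hyperplane x1 + 2*x2 - 1 = 0
    w, w0 = np.array([1.0, 2.0]), -1.0

    x = np.array([0.5, 0.8])
    label = 1 if g(x, w, w0) > 0 else 2   # decide class 1 if g(x) > 0, else class 2
    print(label)                          # -> 1, since 0.5 + 1.6 - 1 = 1.1 > 0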

3
The Perceptron Model
4
Perceptron Decision Boundary
5
Multicategory Case
6
The Solution Vector & The Solution Region
Augmented Vectors
7
The Gradient Descent
8
Homework 5
  • Implement the Parzen window density estimation using the Gaussian window function in 1 dimension (a minimal sketch follows this list). Test it on a number of distributions. You can generate random numbers from different distributions using the data generation commands in the statistics toolbox. Then modify your algorithm for 2 dimensions (modify Vn accordingly).
  • Implement Algorithms 1 and 2 in your textbook for PNN.
  • Computer exercises 1 & 2 from Chapter 4.
  • Reading Assignment: Chapter 4, pp. 172-187 and 195-197 (which you have already read, of course, for last week's class), and Chapter 5, pp. 215-227.
  • Yes, there will be (yet another) quiz on Wednesday!!!
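A minimal sketch of the 1-D Parzen estimate with a Gaussian window, in Python rather than the MATLAB statistics toolbox mentioned above (the window width h and the test distribution are illustrative choices, not part of the assignment):

    import numpy as np

    def parzen_gauss_1d(x, samples, h):
        """Parzen estimate p(x) = (1/n) * sum_i Gaussian(x; center x_i, width h)."""
        n = len(samples)
        k = np.exp(-0.5 * ((x - samples[:, None]) / h) ** 2) / (h * np.sqrt(2 * np.pi))
        return k.sum(axis=0) / n

    rng = np.random.default_rng(0)
    samples = rng.normal(0.0, 1.0, size=200)   # n = 200 draws from N(0, 1)
    xs = np.linspace(-4, 4, 101)
    p = parzen_gauss_1d(xs, samples, h=0.5)    # should roughly track the N(0, 1) density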

9
Linear Discriminant Functions
  • Objective: design discriminant functions that are linear in x and define decision boundaries
  • How: by formulating the problem as one of minimizing a criterion function, based on the perceptron model
  • Perceptron what? You'll see!
  • What criterion function? A cost function to be minimized, such as the training error
  • Is it difficult? But of course!
  • Why? Because small training error need not mean small test error!
  • So what do we do? Use gradient descent optimization approaches. Not guaranteed, but the de facto standard in engineering optimization problems
  • Why should I care? Because it will be on the exam

10
The Perceptron Model
11
Perceptron Decision Boundary
12
Multicategory Case
13
The Solution Vector & The Solution Region
Augmented Vectors
14
The Gradient Descent
  • So how do we find the appropriate solution region / vector that satisfies aᵀyi > 0?
  • We define a criterion function (again), J(a), and minimize it such that a is the solution vector
  • This reduces the problem of a massive search into a problem of minimizing a scalar function
  • How do we minimize J(a)? you ask
  • Start at some arbitrary point a1, and compute the corresponding J(a1)
  • Compute the gradient (what else?) of J(a1): ∇J(a1)
  • Obtain the next point a2 by moving in the direction of the negative gradient, -∇J(a1), by some amount η (the learning rate).

15
Gradient Descent, De-Mystified!
[Figure: J(a) plotted against a; starting from a1, each step moves by -η∇J(a1), -η∇J(a2), ... through a2, a3, ... toward the minimum, with step sizes η1, η2, ...]

    initialize a, threshold θ, learning rate η(k), k ← 0
    do k ← k + 1
       a ← a - η(k)·∇J(a)
    until |η(k)·∇J(a)| < θ
    return a
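The same loop in Python, as a minimal sketch (the quadratic criterion J and its gradient are illustrative stand-ins, chosen only so the code runs end to end):

    import numpy as np

    def gradient_descent(grad_J, a, eta=0.1, theta=1e-6, max_iter=10_000):
        """a <- a - eta * grad J(a), stopping once the step |eta * grad J(a)| < theta."""
        for _ in range(max_iter):
            step = eta * grad_J(a)
            a = a - step
            if np.linalg.norm(step) < theta:
                break
        return a

    # Illustrative criterion J(a) = ||a - [1, 2]||^2, so grad J(a) = 2*(a - [1, 2])
    grad_J = lambda a: 2.0 * (a - np.array([1.0, 2.0]))
    print(gradient_descent(grad_J, a=np.zeros(2)))   # converges to ~[1, 2]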
16
Some Issues to Consider
  • How to choose the learning rate η?
  • What should be the criterion function?
  • Local / global minimum ?
  • When to terminate ?
  • There are many variants of gradient descent that address these issues: Newton's descent, the momentum term, etc.

17
Newton's Descent
[Figure: red = simple gradient descent; black = Newton's descent]
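Newton's descent replaces the fixed learning rate with the inverse Hessian, a ← a - H⁻¹∇J(a), which reaches the minimum of a quadratic criterion in a single step. A minimal sketch (the quadratic J here is an illustrative stand-in):

    import numpy as np

    def newton_step(a, grad, hess):
        """Newton update: a <- a - H(a)^{-1} * grad J(a)."""
        return a - np.linalg.solve(hess(a), grad(a))

    # Illustrative quadratic J(a) = 0.5 * a^T Q a - b^T a, minimized at Q^{-1} b
    Q = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    grad = lambda a: Q @ a - b
    hess = lambda a: Q

    a = newton_step(np.zeros(2), grad, hess)   # lands on the minimum in one step
    print(np.allclose(Q @ a, b))               # True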
18
The Criterion Function
  • What should be the criterion function?
  • The obvious choice is the number of misclassified training samples, but this is a piecewise-constant (discontinuous) function, hence not differentiable.
  • A better choice: the perceptron criterion function
        Jp(a) = Σ over y in Y of (-aᵀy), where Y is the set of samples misclassified by a(k)
  • Geometrically, this is the summation of distances from the misclassified samples to the decision boundary
  • Then, ∇Jp(a) = Σ over y in Y of (-y) (a minimal sketch in code follows this slide)
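A minimal sketch of Jp and its gradient in Python, assuming the samples are augmented vectors y = [1, x] with class-2 samples already negated, so that a solution vector gives aᵀy > 0 for every y (the data values are hypothetical):

    import numpy as np

    def perceptron_criterion(a, Y):
        """Return Jp(a) = sum over misclassified y of (-a^T y), and its gradient."""
        mis = Y[Y @ a <= 0]                 # rows with a^T y <= 0 are misclassified
        return -(mis @ a).sum(), -mis.sum(axis=0)

    Y = np.array([[1.0, 2.0], [1.0, 0.5], [-1.0, 1.0]])   # hypothetical samples
    J, grad = perceptron_criterion(a=np.array([0.0, -1.0]), Y=Y)
    print(J, grad)                          # 3.5, [-1.0, -3.5]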
19
The Criterion Functions
[Figure: four candidate criterion functions compared]
  • Number of patterns misclassified: bad (piecewise constant)
  • Perceptron criterion function: good!
  • Total squared error (TSE): better
  • TSE with margin: best, but can be computationally expensive
20
The Batch Perceptron Algorithm
  • Training data is cycled through until the error falls below a threshold
The batch perceptron algorithm for finding a solution vector: the next weight vector is obtained by adding some multiple of the sum of the misclassified samples to the present weight vector, i.e. a(k+1) = a(k) + η(k) · Σ over y in Y(k) of y (a minimal sketch in code follows).
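A minimal sketch of the batch update in Python, with a fixed learning rate and the same hypothetical augmented, class-2-negated samples as above:

    import numpy as np

    def batch_perceptron(Y, eta=1.0, max_epochs=1000):
        """Batch perceptron: add eta * (sum of misclassified samples) each pass."""
        a = np.zeros(Y.shape[1])
        for _ in range(max_epochs):
            mis = Y[Y @ a <= 0]            # samples not yet on the positive side
            if len(mis) == 0:              # a^T y > 0 for all y: a is a solution vector
                return a
            a = a + eta * mis.sum(axis=0)
        return a

    Y = np.array([[1.0, 2.0], [1.0, 0.5], [-1.0, 1.0]])
    a = batch_perceptron(Y)
    print((Y @ a > 0).all())               # True for this linearly separable set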
21
The Batch Perceptron Algorithm
[Figure: the error surface, starting from a(1) = 0 and descending to the bottom of the error surface]
22
Single / Multi Layer
[Figure: single-layer vs. two-layer networks]
23
The Multilayer Perceptron
24
Applications
  • Ultrasonic Weld Inspection

25
Weld Inspection
26
Feature Extraction & Selection
  • Discrete Wavelet Transform (a minimal sketch follows)
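For instance, with the PyWavelets package (the wavelet 'db4', the decomposition level, and the synthetic echo below are assumptions for illustration; the slides do not specify them), the DWT coefficients of an ultrasonic signal can serve as the raw feature vector:

    import numpy as np
    import pywt  # PyWavelets

    # Hypothetical 1 MHz ultrasonic echo sampled at 10 MHz (cf. the next slide)
    fs, ft, n = 10e6, 1e6, 512
    t = np.arange(n) / fs
    signal = np.exp(-((t - 2.5e-5) ** 2) / 2e-11) * np.sin(2 * np.pi * ft * t)

    # Multilevel DWT; the concatenated coefficients form the feature vector
    coeffs = pywt.wavedec(signal, 'db4', level=4)
    features = np.concatenate(coeffs)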

27
DWT of a UT Signal
[Figure: DWT of an ultrasonic signal; transducer frequency ft = 1 MHz, sampling frequency fs = 10 MHz]
28
Gas Sensing
[Figure: piezoelectric crystal gas sensor - bare piezoelectric crystal; central part of the crystal coated first with gold, then with polymer material; electrodes on front and back; crystal holder]
29
Gas Sensing
30
Gas Sensing
31
Gas Sensing
32
Homework
Implement computer exercises 2, 3, and 4