Transcript and Presenter's Notes

Title: Advanced Signal Processing


1
Advanced Signal Processing
  • Mohamad KHALIL
  • Lebanese University

2
Outline
  • Probabilities
  • Classification rules
  • Application in the case of:
    • Change of mean
    • Change of variance
  • Multidimensional case:
    • Equal covariance matrix
    • General case

3
Classification
  • Let w1 be class 1 and w2 be class 2
  • P(x|w1): distribution of x under w1
  • P(x|w2): distribution of x under w2
  • P(w1): a priori probability of w1
  • P(w2): a priori probability of w2
  • P(w1|x): a posteriori probability that x belongs to w1
  • P(w2|x): a posteriori probability that x belongs to w2

4
Classification
  • x is classified in w1 if P(w1|x) > P(w2|x)
  • x is classified in w2 if P(w2|x) > P(w1|x)
  • Bayes' rule: P(A|B) = P(B|A) · P(A) / P(B), so the posteriors follow from the likelihoods and the priors (a minimal sketch follows)
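
A minimal Python sketch of this rule (function names and inputs are illustrative, not from the slides): given the likelihoods P(x|wi) and priors P(wi), it forms the posteriors with Bayes' rule and picks the larger one.

  # Bayes' rule: P(w|x) = P(x|w) * P(w) / P(x), with P(x) the normalizer.
  def posteriors(px_w1, px_w2, p_w1, p_w2):
      evidence = px_w1 * p_w1 + px_w2 * p_w2   # P(x), total probability
      return px_w1 * p_w1 / evidence, px_w2 * p_w2 / evidence

  def classify(px_w1, px_w2, p_w1, p_w2):
      p1, p2 = posteriors(px_w1, px_w2, p_w1, p_w2)
      return "w1" if p1 > p2 else "w2"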

5
Classification rules
  • x is classified in w1 if P(w1|x) > P(w2|x), i.e. if P(x|w1) P(w1) > P(x|w2) P(w2)
  • x is classified in w2 if P(w2|x) > P(w1|x), i.e. if P(x|w2) P(w2) > P(x|w1) P(w1)

6
Likelihood ratio
  • Likelihood ratio function: L(x) = P(x|w1) / P(x|w2)
  • P(x|w1): distribution of x under w1
  • P(x|w2): distribution of x under w2
  • These distributions may be known or unknown; classification depends on their properties (a sketch of the resulting test follows)
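
A short sketch of the resulting test, assuming known densities (names are illustrative): comparing L(x) against the prior ratio P(w2)/P(w1) is equivalent to comparing the posteriors from the previous slides.

  # Likelihood-ratio test: decide w1 when L(x) = P(x|w1)/P(x|w2)
  # exceeds the prior ratio P(w2)/P(w1).
  def likelihood_ratio_decide(px_w1, px_w2, p_w1, p_w2):
      return "w1" if px_w1 / px_w2 > p_w2 / p_w1 else "w2"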

7
Gaussian case, Change in mean
  • Univariate density
  • A density that is analytically tractable
  • Continuous density
  • Many processes are asymptotically Gaussian
  • Handwritten characters and speech sounds can be seen as ideal or prototype patterns corrupted by a random process (central limit theorem)
  • p(x) = 1 / (√(2π) σ) · exp(−(x − μ)² / (2σ²))
  • where
  • μ = mean (or expected value) of x
  • σ² = expected squared deviation, or variance
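
The density above can be evaluated directly; a minimal sketch (here sigma2 is the variance σ²):

  import math

  def gaussian_pdf(x, mu, sigma2):
      # Univariate normal density N(mu, sigma2); sigma2 is the variance.
      return math.exp(-(x - mu) ** 2 / (2.0 * sigma2)) \
             / math.sqrt(2.0 * math.pi * sigma2)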

8
Gaussian case, Change in mean
W1: N(μ1, σ²), W2: N(μ2, σ²)
9
Exercise
  • Select the optimal decision between ω1 and ω2, where
  • P(x | ω1) = N(2, 0.5) (normal distribution)
  • P(x | ω2) = N(1.5, 0.2)
  • P(ω1) = 2/3
  • P(ω2) = 1/3 (a worked sketch follows)
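
A worked sketch of this exercise, assuming N(m, v) denotes mean m and variance v (the slide does not say whether 0.5 and 0.2 are variances or standard deviations):

  import math

  def norm_pdf(x, mu, var):
      # N(mu, var) density; var is assumed to be the variance.
      return math.exp(-(x - mu) ** 2 / (2.0 * var)) \
             / math.sqrt(2.0 * math.pi * var)

  # Decide ω1 when P(x|ω1) P(ω1) > P(x|ω2) P(ω2).
  def decide(x):
      g1 = norm_pdf(x, 2.0, 0.5) * (2.0 / 3.0)   # P(x|ω1) P(ω1)
      g2 = norm_pdf(x, 1.5, 0.2) * (1.0 / 3.0)   # P(x|ω2) P(ω2)
      return "w1" if g1 > g2 else "w2"

  for x in (1.0, 1.5, 1.8, 2.2):
      print(x, decide(x))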

10
Interpretation
11
Multidimensional case: w1 ~ N(μ1, Σ1), w2 ~ N(μ2, Σ2)
  • General equation: p(x | wi) = 1 / ((2π)^(d/2) |Σi|^(1/2)) · exp(−(1/2)(x − μi)ᵀ Σi⁻¹ (x − μi))

12
Equal covariance matrices: Σ1 = Σ2 = I
  • The discriminant simplifies
  • Classification is in terms of the distance from the new input x to each class mean; the separating boundary is linear (a sketch follows)
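
A minimal sketch of this case, assuming equal priors (an assumption, not stated on the slide): with Σ1 = Σ2 = I the Bayes rule reduces to assigning x to the nearest class mean.

  import numpy as np

  # With identity covariances (and equal priors), the quadratic terms
  # cancel and the rule is: pick the class whose mean is closest to x.
  def nearest_mean(x, mu1, mu2):
      x = np.asarray(x)
      d1 = np.linalg.norm(x - np.asarray(mu1))   # Euclidean distance to mean of w1
      d2 = np.linalg.norm(x - np.asarray(mu2))
      return "w1" if d1 < d2 else "w2"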

13
(No Transcript)
14
Equal covariance matrices: Σ1 = Σ2 = Σ
The decision boundary between the classes is linear
15
(Figure: two Gaussian classes with means m1 and m2; the Bayes discriminant is linear.)
16
General case: Σ1 ≠ Σ2
  • Classification is based on the Mahalanobis distance, d²(x, μi) = (x − μi)ᵀ Σi⁻¹ (x − μi): the distance between x and each mean, weighted by Σi. The decision surface may be quadratic or elliptic (a sketch follows).
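
A sketch for the general case, again assuming equal priors (an assumption): each class is scored by its Mahalanobis distance plus a log-determinant term, which gives a quadratic boundary.

  import numpy as np

  def mahalanobis2(x, mu, cov):
      # Squared Mahalanobis distance (x - mu)^T cov^{-1} (x - mu).
      d = np.asarray(x) - np.asarray(mu)
      return float(d @ np.linalg.solve(np.asarray(cov), d))

  # Quadratic discriminant: decide w1 when g1(x) > g2(x),
  # with gi(x) = -0.5 * mahalanobis2 - 0.5 * log|cov_i|.
  def quadratic_decide(x, mu1, cov1, mu2, cov2):
      g1 = -0.5 * mahalanobis2(x, mu1, cov1) \
           - 0.5 * np.log(np.linalg.det(np.asarray(cov1)))
      g2 = -0.5 * mahalanobis2(x, mu2, cov2) \
           - 0.5 * np.log(np.linalg.det(np.asarray(cov2)))
      return "w1" if g1 > g2 else "w2"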

17
(No Transcript)
18
(No Transcript)
19
Bayesian Decision Theory Continuous Features
  • Generalization of the preceding ideas
  • Use of more than one feature
  • Use of more than two states of nature
  • Allowing actions, and not only deciding on the state of nature
  • Introducing a loss function, which is more general than the probability of error

20
  • Allowing actions other than classification primarily allows the possibility of rejection
  • Refusing to make a decision in close or difficult cases!
  • The loss function states how costly each action is

21
  • Let {ω1, ω2, …, ωM} be the set of M states of nature (categories)
  • Let {α1, α2, …, αa} be the set of possible actions
  • Let λ(αi | ωj) be the loss incurred for taking action αi when the state of nature is ωj

22
  • Overall risk: R = sum of all R(αi | x), for i = 1, …, a
  • Minimizing R ⇔ minimizing R(αi | x) for i = 1, …, a
  • Conditional risk: R(αi | x) = Σj λ(αi | ωj) P(ωj | x), summed over j = 1, …, M (a sketch follows)
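
A minimal sketch of risk minimization (the array layout is an assumption): lam[i, j] holds λ(αi | ωj) and post[j] holds P(ωj | x).

  import numpy as np

  def best_action(lam, post):
      # Conditional risks R(a_i|x) = sum_j lam[i, j] * post[j],
      # one per action; the Bayes decision is the minimum-risk action.
      risks = np.asarray(lam) @ np.asarray(post)
      return int(np.argmin(risks))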
23
  • Select the action αi for which R(αi | x) is minimum
  • R is then minimized, and R in this case is called the Bayes risk: the best performance that can be achieved!

24
  • Two-category classification
  • α1: deciding ω1
  • α2: deciding ω2
  • λij = λ(αi | ωj)
  • loss incurred for deciding ωi when the true state of nature is ωj
  • Conditional risks:
  • R(α1 | x) = λ11 P(ω1 | x) + λ12 P(ω2 | x)
  • R(α2 | x) = λ21 P(ω1 | x) + λ22 P(ω2 | x)

25
  • Our rule is the following:
  • if R(α1 | x) < R(α2 | x),
  • action α1 (decide ω1) is taken
  • This results in the equivalent rule:
  • decide ω1 if
  • (λ21 − λ11) P(x | ω1) P(ω1) > (λ12 − λ22) P(x | ω2) P(ω2)
  • and decide ω2 otherwise

26
  • Likelihood ratio
  • The preceding rule is equivalent to the following rule:
  • if L(x) = P(x | ω1) / P(x | ω2) > [(λ12 − λ22) / (λ21 − λ11)] · [P(ω2) / P(ω1)],
  • then take action α1 (decide ω1)
  • Otherwise take action α2 (decide ω2); a sketch follows
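
A sketch of this thresholded test (illustrative names; it assumes λ21 > λ11, so the division is valid):

  # Decide ω1 when P(x|ω1)/P(x|ω2) exceeds the fixed threshold
  # (lam12 - lam22)/(lam21 - lam11) * P(ω2)/P(ω1).
  def decide_with_losses(px_w1, px_w2, p_w1, p_w2,
                         lam11, lam12, lam21, lam22):
      threshold = (lam12 - lam22) / (lam21 - lam11) * (p_w2 / p_w1)
      return "w1" if px_w1 / px_w2 > threshold else "w2"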

27
  • Optimal decision property:
  • If the likelihood ratio exceeds a threshold value that is independent of the input pattern x, we can take optimal actions

28
Exercise
  • Select the optimal decision between ω1 and ω2, where
  • P(x | ω1) = N(2, 0.5) (normal distribution)
  • P(x | ω2) = N(1.5, 0.2)
  • P(ω1) = 2/3
  • P(ω2) = 1/3

29
Classification with rejection
  • x can be classified in ω1, ω2, ω3, ω4, …, ωM
  • x can be rejected, i.e. classified in ω0
  • Let us define λii = λ(αi | ωi) = 0,
  • λij = λ(αi | ωj) = 1 for i ≠ j, and
  • λ0j = λ(α0 | ωj) = Cr, a fixed rejection cost
  • For class 0: R(α0 | x) = Σj Cr · P(ωj | x) = Cr

30
Classification with rejection
  • For class i: R(αi | x) = Σ over j ≠ i of P(ωj | x) = 1 − P(ωi | x), so decide the class ωi with maximum posterior if 1 − P(ωi | x) < Cr, and reject otherwise (a sketch follows)
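
A minimal sketch of this rejection rule under the 0/1 loss with fixed rejection cost Cr (the layout post[i] = P(ω(i+1) | x) is an assumption):

  import numpy as np

  def classify_with_reject(post, cr):
      # R(a_i|x) = 1 - P(w_i|x) for classifying, R(a_0|x) = cr for
      # rejecting, so reject when max_i P(w_i|x) < 1 - cr.
      i = int(np.argmax(post))
      if post[i] < 1.0 - cr:
          return "reject"
      return "w%d" % (i + 1)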

31
Limit of Cr
Rejection occurs when Cr < 1 − max over i of P(ωi | x). Since the maximum posterior satisfies P(ωi | x) ≥ 1/M, the reject option is never used once Cr ≥ (M − 1)/M; a useful rejection cost therefore satisfies 0 < Cr < (M − 1)/M.
32
Case of 2 Gaussian classes