1
Detection
  • Chia-Hsin Cheng

2
Outline
  • Detection Theory
  • Simple Binary Hypothesis Tests
  • Bayes Criterion
  • The MAP Criterion
  • The ML Criterion
  • Neyman-Pearson Criterion
  • M Hypotheses
  • Composite Hypothesis
  • GLRT (Generalized LRT)
  • The General Gaussian Problem
  • Course Information

3
Detection Theory
  • Example: Known Signal in Noise Problem

We are faced with the problem of deciding which
of two possible signals was transmitted: the
detection problem observes r(t) and applies a
decision rule to guess whether s1(t) or s2(t)
was sent.
4
(No Transcript)
5
Classical Detection Theory
  • The source generates outputs of two choices
    (hypotheses), H0 and H1. We do not know which
    hypothesis is true.
  • The transition mechanism can be viewed as a
    device that knows which hypothesis is true (i.e.,
    the channel model or likelihood function).
  • Based on this knowledge, it generates a point in
    the observation space according to some
    probability law.

6
Classical Detection Theory
  • Example: symbol-rate sampling
  • Example: sampling above the symbol rate
    (oversampling)
  • We confine our discussion to problems in which
    the observation space is finite-dimensional: the
    observations consist of a set of N numbers and
    can be represented as a point in an N-dimensional
    space.

7
  • After observing the outcome in the observation
    space, we shall guess which hypothesis is true.
    To accomplish this, we develop a decision
    rule that assigns each point in the observation
    space to one of the hypotheses.

8
Simple Binary Hypothesis Tests
  • We assume that the observation space corresponds
    to a set of N observations r1, r2, ..., rN,
    collected in a vector r.
  • The probabilistic transition mechanism generates
    points in accord with the two known conditional
    probability densities p(R|H1) and p(R|H0). The
    objective is to use this information to develop a
    suitable decision rule.

9
Decision Criteria
  • In the binary hypothesis problem, we know that
    either H0 or H1 is true. Each time the experiment
    is conducted, one of four things can happen:
  • 1. H0 true; say H0 (correct)
  • 2. H0 true; say H1 (false alarm)
  • 3. H1 true; say H1 (detection)
  • 4. H1 true; say H0 (miss)

10
Bayes Criterion
  • Two assumptions:
  • 1. The a priori probabilities P0 and P1 are known.
  • 2. Costs Cij are assigned, where Cij is the cost
    of choosing Hi when Hj is true.
  • We would like to design our decision rule so
    that on the average the cost will be as small as
    possible.
  • Average cost (risk):

    R = C00 P0 Pr(say H0 | H0) + C10 P0 Pr(say H1 | H0)
      + C11 P1 Pr(say H1 | H1) + C01 P1 Pr(say H0 | H1)    (1)

11
Bayes Criterion (cont.)
  • Because the decision rule must say either H1 or
    H0, we can view it as a rule for dividing the
    total observation space Z into two parts, Z0 and
    Z1. Whenever an observation falls in Z0, we say
    H0, and whenever an observation falls in Z1, we
    say H1.
  • Optimal Bayes test: design Z0 and Z1 to minimize
    the risk R.

12
Bayes Criterion (cont.)
  • The risk function of (1) can be written in terms
    of the transition probabilities and the decision
    regions:

    R = C00 P0 ∫_{Z0} p(R|H0) dR + C10 P0 ∫_{Z1} p(R|H0) dR
      + C11 P1 ∫_{Z1} p(R|H1) dR + C01 P1 ∫_{Z0} p(R|H1) dR    (2)

  • We shall assume throughout our work that the cost
    of a wrong decision is higher than the cost of a
    correct decision, i.e.,

    C10 > C00   and   C01 > C11    (3)
13
Bayes Criterion (cont.)
  • To find the Bayes test, we must choose the
    decision regions Z0 and Z1 in such a manner that
    the risk will be minimized. Because we require
    that a decision be made, we must assign each
    point R in the observation space Z to Z0 or Z1.
  • Thus Z = Z0 ∪ Z1, with Z0 and Z1 disjoint.    (4)
  • Rewriting (2) with Z1 = Z − Z0, we have

    R = C00 P0 ∫_{Z0} p(R|H0) dR + C10 P0 ∫_{Z−Z0} p(R|H0) dR
      + C11 P1 ∫_{Z−Z0} p(R|H1) dR + C01 P1 ∫_{Z0} p(R|H1) dR    (5)

  • Observing that ∫_Z p(R|H0) dR = ∫_Z p(R|H1) dR = 1
    and substituting into (5),

    R = C00 P0 ∫_{Z0} p(R|H0) dR + C10 P0 [1 − ∫_{Z0} p(R|H0) dR]
      + C11 P1 [1 − ∫_{Z0} p(R|H1) dR] + C01 P1 ∫_{Z0} p(R|H1) dR    (6)
14
Bayes Criterion (cont.)
  • Then, we have

    R = C10 P0 + C11 P1
      + ∫_{Z0} { [P1 (C01 − C11) p(R|H1)] − [P0 (C10 − C00) p(R|H0)] } dR    (7)

  • The first two terms of (7) represent the fixed
    cost. The assumptions in (3) imply that the two
    terms inside the brackets are positive; all values
    of R for which the second term is larger than the
    first should be included in Z0, because they
    contribute a negative amount to the integral.
  • The decision regions are therefore defined by the
    statement: if

    P1 (C01 − C11) p(R|H1) ≥ P0 (C10 − C00) p(R|H0),

    say H1; otherwise say H0.    (8)
15
Bayes Criterion (cont.)
  • Rearranging (8), we say H1 whenever

    p(R|H1) / p(R|H0)  ≥  P0 (C10 − C00) / [P1 (C01 − C11)]    (9)

  • The left side is the likelihood ratio,

    Λ(R) ≜ p(R|H1) / p(R|H0)    (10)

  • The quantity on the right of (9) is the threshold
    of the test and is denoted by η:

    η = P0 (C10 − C00) / [P1 (C01 − C11)]

16
Bayes Criterion (cont.)
  • The Bayes criterion leads us to a likelihood
    ratio test (LRT):

    Λ(R) ≷ η   (say H1 if Λ(R) > η; otherwise say H0)    (11a)

  • Because the natural logarithm is a monotonic
    function, and both sides of (11a) are positive,
    an equivalent test is the log LRT:

    ln Λ(R) ≷ ln η    (11b)
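
To make the LRT concrete, here is a minimal Python sketch of (11a)/(11b) (not from the slides; the density callables p1 and p0 and the default costs and priors are illustrative assumptions):

```python
import math

def bayes_lrt(R, p1, p0, P0=0.5, P1=0.5,
              C00=0.0, C11=0.0, C10=1.0, C01=1.0):
    """Likelihood ratio test (11a)/(11b): return 1 for H1, 0 for H0.

    p1 and p0 are callables returning the conditional densities
    p(R|H1) and p(R|H0); the threshold eta follows (9).
    """
    eta = (P0 * (C10 - C00)) / (P1 * (C01 - C11))
    # Log-domain form (11b) avoids underflow for very small densities.
    return 1 if math.log(p1(R)) - math.log(p0(R)) > math.log(eta) else 0
```

With equal priors and unit error costs, η = 1 and the rule reduces to the ML test discussed on the following slides.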
17
The MAP Criterion
  • A priori probabilities (before we observe R = r):
    P0 and P1.
  • A posteriori probabilities (after we have
    observed R = r): Pr(H0 | R) and Pr(H1 | R).
  • When C10 = C01 = 1 and C00 = C11 = 0, (9) becomes
    P1 p(R|H1) ≷ P0 p(R|H0); dividing both sides by
    p(R) gives

    Pr(H1 | R) ≷ Pr(H0 | R)

  • MAP (maximum a posteriori probability) criterion:
    choose the hypothesis whose a posteriori
    probability is larger.

18
The ML Criterion
  • The possible likelihoods of r are p(R|H1) and
    p(R|H0).
  • When P0 = P1 = 1/2, C10 = C01 = 1, and
    C00 = C11 = 0, from (9) the threshold is η = 1,
    and the test reduces to

    p(R|H1) ≷ p(R|H0)

  • ML (maximum likelihood) criterion: choose the
    hypothesis under which the observation is most
    likely.
  • MAP criterion = ML criterion (when all the Pi are
    the same).
19
Bayes Criterion Example
  • Example: We assume that under H1 the source
    output is a constant voltage m, and that under
    H0 the source output is zero. Before observation,
    the voltage is corrupted by additive Gaussian
    noise:

    H1: Ri = m + ni,   H0: Ri = ni,   i = 1, ..., N,

    where the ni are independent, zero-mean Gaussian
    samples with variance σ².
  • The conditional densities p(R|H1) and p(R|H0)
    factor into products of Gaussian densities,
    because the noise samples are Gaussian.
20
Bayes Criterion Example (cont.)
21
Bayes Criterion Example (cont.)
  • The likelihood ratio test is

    Λ(R) = ∏_{i=1}^{N} (1/√(2πσ²)) exp[−(Ri − m)²/(2σ²)]
         / ∏_{i=1}^{N} (1/√(2πσ²)) exp[−Ri²/(2σ²)]  ≷  η

  • Thus, the log LRT is

    (m/σ²) Σ_{i=1}^{N} Ri − N m²/(2σ²)  ≷  ln η

  • or, equivalently,

    Σ_{i=1}^{N} Ri  ≷  (σ²/m) ln η + N m/2 ≜ γ
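
A small Python sketch of this sum-and-threshold rule (illustrative only; the parameter values below are assumptions):

```python
import math
import random

def decide_constant_in_noise(R, m, sigma, eta=1.0):
    """Say H1 when sum(R_i) exceeds the log-LRT threshold
    gamma = (sigma^2 / m) * ln(eta) + N * m / 2."""
    gamma = (sigma**2 / m) * math.log(eta) + len(R) * m / 2.0
    return 1 if sum(R) > gamma else 0  # 1 -> say H1, 0 -> say H0

# Example: one observation vector simulated under H1.
m, sigma, N = 1.0, 1.0, 10
R = [m + random.gauss(0.0, sigma) for _ in range(N)]
print(decide_constant_in_noise(R, m, sigma))
```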

22
Bayes Criterion Example (cont.)
  • If C00 = C11 = 0 and C01 = C10 = 1, the risk
    function of (5) reduces to the probability of
    error,

    R = Pr(ε) = P0 ∫_{Z1} p(R|H0) dR + P1 ∫_{Z0} p(R|H1) dR,

    i.e., the Bayes test minimizes the total
    probability of error.
  • When the decision regions are chosen, the values
    of the integrals in (5) are determined. We denote
    the probabilities of false alarm, detection, and
    miss, respectively, as

    PF = ∫_{Z1} p(R|H0) dR,   PD = ∫_{Z1} p(R|H1) dR,
    PM = ∫_{Z0} p(R|H1) dR = 1 − PD.

23
Bayes Criterion Example (cont.)
  • For any choice of decision regions, the risk
    function can be written from (5) as

    R = C00 P0 (1 − PF) + C10 P0 PF
      + C11 P1 PD + C01 P1 (1 − PD)

  • Because P1 = 1 − P0, then

    R = C00 (1 − P1)(1 − PF) + C10 (1 − P1) PF
      + C11 P1 PD + C01 P1 (1 − PD)
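
These probabilities can be checked by simulation; the following Monte Carlo sketch (parameter values are assumptions) estimates PF and PD for the sum-and-threshold test of the example:

```python
import math
import random

def estimate_pf_pd(m=1.0, sigma=1.0, N=10, eta=1.0, trials=100_000):
    """Estimate PF = Pr(say H1 | H0) and PD = Pr(say H1 | H1)
    for the sum-and-threshold test by direct simulation."""
    gamma = (sigma**2 / m) * math.log(eta) + N * m / 2.0
    false_alarms = detections = 0
    for _ in range(trials):
        s0 = sum(random.gauss(0.0, sigma) for _ in range(N))      # H0
        s1 = sum(m + random.gauss(0.0, sigma) for _ in range(N))  # H1
        false_alarms += s0 > gamma
        detections += s1 > gamma
    return false_alarms / trials, detections / trials

pf, pd = estimate_pf_pd()
print(f"PF ~ {pf:.4f}, PD ~ {pd:.4f}")
```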

24
Neyman-Pearson Criterion
  • In many physical situations, it is difficult to
    assign realistic costs or a priori probabilities.
    A simple procedure to bypass this difficulty is
    to work with the conditional probabilities PF and
    PD: constrain PF = α and maximize PD
    (equivalently, minimize PM) subject to that
    constraint, by minimizing the Lagrangian
    F = PM + λ (PF − α).

25
Neyman-Pearson Criterion (cont.)
  • For any positive value of λ, an LRT will minimize
    F. (A negative value of λ gives an LRT with the
    inequalities reversed.)
  • Thus F is minimized by the likelihood ratio test

    Λ(R) ≷ λ

  • To satisfy the constraint, we choose λ so that
    PF = α, i.e.,

    PF = ∫_{Λ(R) > λ} p(R|H0) dR = α    (12)

  • Observe that decreasing λ is equivalent to
    increasing Z1; thus PD increases as λ decreases.
  • Solving (12) for λ gives the threshold.
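
For the constant-voltage example, the sum of the samples is N(0, N σ²) under H0, so the constraint PF = α can be solved in closed form with the Q-function. A standard-library sketch (the parameter values are assumptions):

```python
import math
from statistics import NormalDist

def np_threshold(alpha, sigma, N):
    """Neyman-Pearson threshold gamma on sum(R_i) such that
    PF = Q(gamma / (sqrt(N) * sigma)) = alpha."""
    q_inv_alpha = NormalDist().inv_cdf(1.0 - alpha)  # Q^{-1}(alpha)
    return math.sqrt(N) * sigma * q_inv_alpha

print(np_threshold(alpha=0.01, sigma=1.0, N=10))  # about 7.36
```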
26
Neyman-Pearson Criterion (cont.)
27
Q-function
  • Gaussian (normal) distribution: the pdf of a
    Gaussian or normal random variable is

    p(x) = (1/√(2πσ²)) exp[−(x − m)²/(2σ²)]

  • The cumulative distribution function (CDF) of an
    N(0,1) random variable is

    Φ(x) = ∫_{−∞}^{x} (1/√(2π)) exp(−t²/2) dt

  • erfc-function: erfc(x) = (2/√π) ∫_{x}^{∞} exp(−t²) dt
  • Q-function: the complementary CDF of an N(0,1)
    random variable,

    Q(x) = 1 − Φ(x) = ∫_{x}^{∞} (1/√(2π)) exp(−t²/2) dt
         = (1/2) erfc(x/√2)
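
The last identity gives a one-line implementation with the Python standard library:

```python
import math

def Q(x: float) -> float:
    """Gaussian Q-function via the identity Q(x) = erfc(x / sqrt(2)) / 2."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

print(Q(0.0))  # 0.5
print(Q(3.0))  # about 1.35e-3
```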
28
Q-function (cont.)
29
Q-function (cont.)
30
Neyman-Pearson Criterion (cont.)
31
(No Transcript)
32
Receiver Operating Characteristic (ROC)
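
For the Gaussian example, sweeping the threshold traces the ROC in closed form: PD = Q(Q^{-1}(PF) − d), where d = √N m / σ. A minimal sketch under those assumptions:

```python
from statistics import NormalDist

def roc_points(d, pf_grid=(0.01, 0.05, 0.1, 0.2, 0.5)):
    """ROC of the Gaussian problem: PD = Q(Q^{-1}(PF) - d)."""
    nd = NormalDist()
    q = lambda x: 1.0 - nd.cdf(x)          # Q-function
    q_inv = lambda p: nd.inv_cdf(1.0 - p)  # inverse Q-function
    return [(pf, q(q_inv(pf) - d)) for pf in pf_grid]

for pf, pd in roc_points(d=2.0):
    print(f"PF = {pf:.2f}  PD = {pd:.3f}")
```

As expected, PD approaches PF as d goes to 0 (the chance line), and PD approaches 1 for large d.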
33
Summary
  • Using either a Bayes criterion or a
    Neyman-Pearson criterion, we find that the
    optimum test is a likelihood ratio test. Thus,
    regardless of the dimensionality of the
    observation space, the test consists of comparing
    a scalar variable with a threshold.
  • In many cases, construction of the LRT can be
    simplified if we can identify a sufficient
    statistic.
  • A complete description of the LRT performance was
    obtained by plotting the conditional
    probabilities PD and PF as the threshold was
    varied.

34
M Hypotheses
  • In the simple M-ary test, there are M source
    outputs, each of which corresponds to one of M
    hypotheses. As before, we are forced to make a
    decision.
  • The Bayes criterion assigns a cost to each of the
    alternatives, assumes a set of a priori
    probabilities P0, P1, ..., P_{M−1}, and minimizes
    the risk.
  • The cost Cij is incurred when the i-th hypothesis
    is chosen and the j-th hypothesis is true.

35
M Hypotheses (cont.)
  • The risk function for the M-ary hypothesis
    problem is

    R = Σ_{j=0}^{M−1} Σ_{i=0}^{M−1} Cij Pj ∫_{Zi} p(R|Hj) dR

  • To find the optimum test, we vary the Zi to
    minimize R. We consider the case of M = 3 below.

36
Noting that Z0 = Z − Z1 − Z2, because the regions
are disjoint, we obtain
37
(No Transcript)
38
Bayes criterion
The optimum Bayes test becomes three pairwise
decision rules, (I), (II), and (III), each linear
in the likelihood ratios Λ1(R) = p(R|H1)/p(R|H0)
and Λ2(R) = p(R|H2)/p(R|H0).
  • We see that the decision rules correspond to
    three lines in the (Λ1, Λ2) plane.
  • It is easy to verify that these three lines
    intersect at a common point.

39
(Figure: decision regions in the (Λ1, Λ2) plane,
bounded by the lines (I), (II), and (III))
40
41
(No Transcript)
42
When all the a priori probabilities Pi are equal,
the MAP criterion reduces to the ML criterion.
43
Some Points of M-ary Detection
  • The minimum dimension of the decision space is no
    more than M − 1. The boundaries of the decision
    regions are hyperplanes in the (Λ1, ..., Λ_{M−1})
    space.
  • A particular test of importance is the minimum
    total probability of error test. Here we compute
    the a posteriori probability of each hypothesis,
    Pr(Hi | R), and choose the largest, as in the
    sketch below.
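
A minimal sketch of that rule (the prior list and the density callables are assumed placeholders):

```python
def map_decide(R, priors, densities):
    """Minimum total-probability-of-error M-ary test: choose the i
    maximizing P_i * p(R|H_i), which is proportional to Pr(H_i|R)."""
    scores = [P * p(R) for P, p in zip(priors, densities)]
    return max(range(len(scores)), key=scores.__getitem__)
```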

44
Composite Hypothesis
  • Example: as before, H1: r = m + n and H0: r = n,
    but now the amplitude m is not known exactly.

45
Composite Hypothesis (cont.)
  • If θ is a random variable with a known pdf, and
    the probability densities of r on the two
    hypotheses are p(R|θ, H1) and p(R|θ, H0), then
    the likelihood ratio is

    Λ(R) = ∫ p(R|θ, H1) p(θ|H1) dθ / ∫ p(R|θ, H0) p(θ|H0) dθ    (14)

  • In the example above, let θ = m.
  • This reduces the problem to a simple
    hypothesis-testing problem (because the pdf of θ
    is known).
46
Composite Hypothesis (cont.)
  • Example (continued): We assume that the
    probability density governing m on H1 is a
    zero-mean Gaussian density.
  • Then the likelihood ratio follows from (14) by
    integrating over m.
  • Integrating and taking the logarithm of both
    sides, we obtain the log likelihood ratio test.
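
A numerical sketch of (14) for a composite H1 (callable names and the Riemann-sum grid are assumptions; in the example, theta is the amplitude m):

```python
def composite_lr(R, lik_h1, prior, thetas, lik_h0):
    """Likelihood ratio of (14): the numerator averages the conditional
    likelihood p(R|theta, H1) over the known pdf of theta, approximated
    here by a Riemann sum on the uniform grid `thetas`."""
    dtheta = thetas[1] - thetas[0]
    num = sum(lik_h1(R, th) * prior(th) for th in thetas) * dtheta
    return num / lik_h0(R)
```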

47
GLRT (Generalized LRT)
  • Using ML (maximum likelihood) estimates of the
    value of θ under the two hypotheses (H0, H1),
    the result is called a generalized likelihood
    ratio test:

    Λ_g(R) = max_{θ1} p(R | θ1, H1) / max_{θ0} p(R | θ0, H0)

    where θ1 ranges over all θ in H1 and θ0 ranges
    over all θ in H0. In other words, we make an ML
    estimate of θ1 assuming that H1 is true, evaluate
    p(R | θ̂1, H1), and use this value in the
    numerator. A similar procedure gives the
    denominator.
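
A minimal GLRT sketch; the grid search standing in for the two ML maximizations is an assumption (any maximizer works):

```python
def glrt_statistic(R, loglik1, loglik0, grid1, grid0):
    """Generalized log likelihood ratio: maximize the log-likelihood
    over theta under each hypothesis and take the difference; the
    result is compared with ln(eta)."""
    best1 = max(loglik1(R, th) for th in grid1)  # ML estimate under H1
    best0 = max(loglik0(R, th) for th in grid0)  # ML estimate under H0
    return best1 - best0
```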
48
The General Gaussian Problem
  • Definition: A set of random variables r1, ..., rN
    are defined as jointly Gaussian if all their
    linear combinations are Gaussian random variables.
  • Definition: A vector r is a Gaussian random
    vector when its components r1, ..., rN are
    jointly Gaussian random variables.
  • Definition: A Gaussian random vector r is
    characterized by its mean vector m and covariance
    matrix K, i.e., r ~ N(m, K), with pdf

    p(R) = (2π)^{−N/2} |K|^{−1/2}
           exp[−(1/2)(R − m)ᵀ K^{−1} (R − m)]

49
  • Consider the following binary hypothesis problem:

    H1: r ~ N(m1, K1),   H0: r ~ N(m0, K0)    (15)
50
  • Equal covariance matrices: K1 = K0 = K. The log
    LRT then reduces to a sufficient statistic that
    is linear in R:

    l(R) = (m1 − m0)ᵀ K^{−1} R  ≷  γ    (17)
51
  • We define d as the distance between the means on
    the two hypotheses when the variance is
    normalized to equal one:

    d² = (m1 − m0)ᵀ K^{−1} (m1 − m0)    (18)

  • The performance of this binary detection problem
    depends on d.
52
(Figure: PD versus d)
53
  • Case 1: Independent components with equal
    variance, K = σ² I.
  • Substituting into (18),

    d² = (m1 − m0)ᵀ (m1 − m0) / σ² = ||m1 − m0||² / σ²

  • We see that d corresponds to the distance between
    the two mean-value vectors divided by the
    standard deviation of Ri. A numerical sketch
    follows.
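
A numerical sketch of Case 1, combining (18) with PD = Q(Q^{-1}(PF) − d) (standard library only; the mean vectors, σ, and PF are assumed example values):

```python
import math
from statistics import NormalDist

def case1_performance(m1, m0, sigma, pf):
    """Case 1 (K = sigma^2 I): d = ||m1 - m0|| / sigma and
    PD = Q(Q^{-1}(PF) - d)."""
    nd = NormalDist()
    d = math.dist(m1, m0) / sigma
    pd = 1.0 - nd.cdf(nd.inv_cdf(1.0 - pf) - d)
    return d, pd

print(case1_performance(m1=[1, 1, 1, 1], m0=[0, 0, 0, 0],
                        sigma=1.0, pf=0.01))  # d = 2, PD about 0.37
```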
54
  • Case 2: Independent components with unequal
    variances.
  • Case 3: The general case.
  • Please refer to the textbook, pp. 101-107.

55
Course Information
  • Instructor: Chia-Hsin Cheng
  • Room: 525
  • Tel: 05-2720411 ext. 23240
  • E-mail: vincent@wireless.ee.ccu.edu.tw
  • Textbook:
  • H.L. Van Trees, Detection, Estimation, and
    Modulation Theory, Part I, Wiley, 2001, Chaps. 1
    and 2.
  • Reference books:
  • S.M. Kay, Fundamentals of Statistical Signal
    Processing, Vol. II: Detection Theory, Prentice
    Hall, 1998.
  • H.V. Poor, An Introduction to Signal Detection
    and Estimation, 2nd ed., Springer-Verlag, 1994.

56
Ultra wideband
  • UWB First Reading:
  • [1] M. Z. Win and R. A. Scholtz, "Impulse
    Radio: How It Works," IEEE Communications
    Letters, February 1998.
  • [2] R. A. Scholtz, "Multiple Access with
    Time-Hopping Impulse Modulation," in Proc.
    MILCOM, Oct. 1993.
  • [3] G. Durisi and G. Romano, "On the Validity of
    Gaussian Approximation to Characterize the
    Multiuser Capacity of UWB TH PPM," IEEE
    Conference on Ultra Wideband Systems and
    Technologies, Digest of Papers, Baltimore, USA,
    pp. 157-161, 2002.
  • [4] "PulsON Technology Overview,"
    http://www.timedomain.com, July 2000.
  • [5] K. Mandke et al., "The Evolution of Ultra
    Wide Band Radio for Wireless Personal Area
    Networks," High Frequency Electronics, September
    2003, pp. 22-32.