1
010.141 Engineering Mathematics II
Lecture 5/6: Joint Distributions
  • Bob McKay
  • School of Computer Science and Engineering
  • College of Engineering
  • Seoul National University
  • Partly based on
  • Sheldon Ross, A First Course in Probability

2
Outline
  • Joint Distributions
  • Independent Random Variables
  • Summing Distributions
  • Conditional Distributions
  • Discrete
  • Continuous
  • Mixed
  • Order Statistics
  • For a very careful development of Joint
    Distributions, see Don McLeish (University of
    Waterloo), Stats 901
  • http://www.stats.uwaterloo.ca/dlmcleis/s901/

3
Cartesian Product Random Variables
  • Recall that a real random variable is a
    measurable function X : Ω → R over a probability
    space (Ω, F, P), and that P(X = x) is just an
    abbreviation for the value P({ω ∈ Ω : X(ω) = x})
  • Now any probability space (Ω, F, P) can be used to
    generate a new probability space (Ω × Ω, F^2, P^2)
  • The details of this are messy rather than
    difficult, so we will skip them
  • Hence we can define random variables of the form
    Z : Ω × Ω → R over (Ω × Ω, F^2, P^2)

4
Joint Distributions
  • From Z : Ω × Ω → R, we can construct its
    projections
  • Z_X : Ω → R and Z_Y : Ω → R
  • Z_X(ω) = Z(ω, ω′) and Z_Y(ω′) = Z(ω, ω′),
    regarding Z as a function of its first and second
    argument respectively
  • Z_X and Z_Y are both random variables over (Ω, F, P)
  • Usually, we will just talk of X for Z_X and Y for
    Z_Y, and refer to Z as their joint distribution
  • But this is careless, because different Zs can
    generate the same Z_X and Z_Y

5
Joint Distributions
  • We can answer joint probability questions about X
    and Y from their joint distribution. E.g., we can
    compute the joint probability that X > a and Y > b
    via
  • P(X > a, Y > b) = 1 - P({X > a, Y > b}^c)
    = 1 - P({X > a}^c ∪ {Y > b}^c)
    = 1 - P({X ≤ a} ∪ {Y ≤ b})
    = 1 - [P(X ≤ a) + P(Y ≤ b) - P(X ≤ a, Y ≤ b)]
    = 1 - F_X(a) - F_Y(b) + F(a, b)
    (a quick numeric check of this identity follows
    below)
  • where F_X, F_Y and F(a, b) are respectively the
    cumulative probability distributions of X and Y
    and their joint cumulative distribution
  • We can define a joint probability mass function
    for jointly distributed discrete random variables
    by
  • p(x, y) = P(X = x, Y = y)
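Not from the lecture: a minimal Monte Carlo sketch of the identity above, assuming for illustration that X and Y are independent Uniform(0, 1) variables, so that F_X(a) = a, F_Y(b) = b and F(a, b) = ab.

    # Monte Carlo check of P(X > a, Y > b) = 1 - F_X(a) - F_Y(b) + F(a, b)
    # Illustrative assumption: X, Y independent Uniform(0, 1)
    import random

    random.seed(0)
    a, b, n = 0.3, 0.6, 100_000
    hits = sum(1 for _ in range(n)
               if random.random() > a and random.random() > b)
    print(hits / n)           # simulated P(X > a, Y > b)
    print(1 - a - b + a * b)  # identity value: 0.28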

6
Joint Distribution Example
  • The joint density function of X and Y is given by
  • f(x, y) = 2e^{-x} e^{-2y}   0 < x, 0 < y
  •         = 0                otherwise
  • Compute P(X > 1, Y < 1)
  • P(X > 1, Y < 1) = ∫_0^1 ∫_1^∞ 2e^{-x} e^{-2y} dx dy
    = ∫_0^1 2e^{-2y} [-e^{-x}]_1^∞ dy
    = e^{-1} ∫_0^1 2e^{-2y} dy
    = e^{-1}(1 - e^{-2})
    (a simulation check follows below)
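Not from the lecture: since this density factorizes, X and Y are independent exponentials with rates 1 and 2, so the answer can be sanity-checked by simulation.

    # Monte Carlo check of P(X > 1, Y < 1) for f(x, y) = 2 e^{-x} e^{-2y};
    # the density factorizes, so X ~ Exponential(1), Y ~ Exponential(2)
    import math, random

    random.seed(0)
    n = 200_000
    hits = sum(1 for _ in range(n)
               if random.expovariate(1) > 1 and random.expovariate(2) < 1)
    print(hits / n)                           # simulated estimate
    print(math.exp(-1) * (1 - math.exp(-2)))  # exact value, about 0.318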

7
Joint Distribution Example
  • The joint density function of X and Y is given by
  • f(x, y) = e^{-(x+y)}   0 < x, 0 < y
  •         = 0            otherwise
  • Find the cumulative distribution function of the
    random variable X/Y
  • F_{X/Y}(a) = P(X/Y ≤ a) = ∫∫_{x/y ≤ a} e^{-(x+y)} dx dy
    = ∫_0^∞ ∫_0^{ay} e^{-(x+y)} dx dy
    = ∫_0^∞ (1 - e^{-ay}) e^{-y} dy
    = [-e^{-y} + e^{-(a+1)y}/(a+1)]_0^∞
    = 1 - 1/(a + 1)
  • f_{X/Y}(a) = 1/(a + 1)^2   0 < a
    (a simulation check follows below)
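Not from the lecture: here f(x, y) factorizes into two Exponential(1) densities, so F_{X/Y} can be checked by simulation.

    # Monte Carlo check of F_{X/Y}(a) = 1 - 1/(a + 1) with X, Y
    # independent Exponential(1) variables
    import random

    random.seed(0)
    a, n = 2.0, 200_000
    hits = sum(1 for _ in range(n)
               if random.expovariate(1) / random.expovariate(1) <= a)
    print(hits / n)         # simulated F_{X/Y}(2)
    print(1 - 1 / (a + 1))  # exact: 2/3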

8
Independent Random Variables
  • Random variables X and Y are independent iff
    P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B)
  • which is equivalent to
    P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b), or
    F(a, b) = F_X(a) F_Y(b)
  • For discrete variables: p(x, y) = p_X(x) p_Y(y)
  • For continuous variables: f(x, y) = f_X(x) f_Y(y)

9
Independent Variables Example
  • Two people decide to meet at a certain location.
    If each person independently arrives at a time
    uniformly distributed between 12:00 and 13:00,
    find the probability that the first to arrive has
    to wait longer than ten minutes
  • Let X and Y be the number of minutes past 12:00
    that the two people arrive
  • X and Y are independent variables uniformly
    distributed over (0, 60)
  • We need to calculate
  • P(X + 10 < Y) + P(Y + 10 < X) = 2 P(X + 10 < Y)

10
Independent Variables Example
  • 2 P(X + 10 < Y) = 2 ∫∫_{x+10 < y} f(x, y) dx dy
    = 2 ∫∫_{x+10 < y} f_X(x) f_Y(y) dx dy
    = 2 ∫_10^60 ∫_0^{y-10} (1/60)^2 dx dy
    = (2/60^2) ∫_10^60 (y - 10) dy
    = 25/36
    (a simulation check follows below)
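Not from the lecture: the 25/36 answer is easy to confirm by simulating the two arrival times.

    # Monte Carlo check of the meeting problem:
    # X, Y ~ Uniform(0, 60), estimate P(|X - Y| > 10)
    import random

    random.seed(0)
    n = 200_000
    hits = sum(1 for _ in range(n)
               if abs(random.uniform(0, 60) - random.uniform(0, 60)) > 10)
    print(hits / n)  # simulated probability
    print(25 / 36)   # exact answer, about 0.694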

11
Deriving Normal Distributions
  • Suppose X and Y are independent
  • Suppose also that the joint density f(x, y) is
    conditionally independent of x and y, given
    x^2 + y^2
  • That is, the value of f depends only on the
    distance from the origin
  • Then X and Y are normally distributed
  • (for proof, see Ross)

12
Summing Independent Variables
  • Suppose X and Y are independent
  • We often wish to compute the distribution of X + Y
  • We can calculate
  • F_{X+Y}(a) = P(X + Y ≤ a)
    = ∫∫_{x+y ≤ a} f_X(x) f_Y(y) dx dy
    = ∫_{-∞}^∞ ∫_{-∞}^{a-y} f_X(x) f_Y(y) dx dy
    = ∫_{-∞}^∞ [∫_{-∞}^{a-y} f_X(x) dx] f_Y(y) dy
    = ∫_{-∞}^∞ F_X(a - y) f_Y(y) dy
  • This is known as the convolution of the
    distributions of X and Y

13
Summing Independent Variables
  • Differentiating
  • F_{X+Y}(a) = ∫_{-∞}^∞ F_X(a - y) f_Y(y) dy
  • we get
  • f_{X+Y}(a) = ∫_{-∞}^∞ f_X(a - y) f_Y(y) dy
    (a numeric check follows below)
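Not from the lecture: a small numeric sketch of the convolution formula, assuming X and Y are Exponential(1), whose sum is Gamma(2, 1) with density a e^{-a}.

    # Riemann-sum evaluation of f_{X+Y}(a) = integral f_X(a - y) f_Y(y) dy
    # for X, Y ~ Exponential(1); the grid step and cutoff are arbitrary
    import math

    def f_exp(x):  # Exponential(1) density
        return math.exp(-x) if x > 0 else 0.0

    a, dy = 1.5, 0.001
    conv = sum(f_exp(a - k * dy) * f_exp(k * dy) * dy
               for k in range(int(10 / dy)))
    print(conv)              # numeric convolution at a = 1.5
    print(a * math.exp(-a))  # exact Gamma(2, 1) density, about 0.335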

14
Summing Uniform Distributions
  • What is the sum of two uniform distributions X
    and Y, say over [0, 1]?
  • A first careless guess might be that it is
    uniform over [0, 2]
  • But more thought will quickly show that the
    probability density function must be 0 at 0 and
    2, and have a maximum at 1
  • f_{X+Y}(a) = ∫_{-∞}^∞ f_X(a - y) f_Y(y) dy
    = ∫_0^1 f_X(a - y) dy
    = a        0 ≤ a ≤ 1
    = 2 - a    1 < a ≤ 2
    = 0        otherwise
    (a simulation check follows below)
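Not from the lecture: the triangular shape can be checked by comparing a simulated CDF value against a^2/2, which differs from what the careless uniform-on-[0, 2] guess would give.

    # Check that X + Y for two Uniform(0, 1) variables is triangular:
    # P(X + Y <= 0.5) should be 0.5^2 / 2 = 0.125, not the 0.25 that
    # a Uniform(0, 2) distribution would give
    import random

    random.seed(0)
    a, n = 0.5, 200_000
    hits = sum(1 for _ in range(n)
               if random.random() + random.random() <= a)
    print(hits / n)   # simulated P(X + Y <= 0.5)
    print(a * a / 2)  # triangular CDF value: 0.125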

15
Summing Gamma Distributions
  • Suppose we made the same careless guess about
    Gamma distributions?
  • In this case, it would be right!
  • So long as the two distributions have the same λ
    parameter
  • If X and Y are independent gamma random variables
    with parameters (s, λ) and (t, λ), then X + Y is a
    gamma random variable with parameters (s + t, λ)
    (a simulation check follows below)
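Not from the lecture: a quick moment-matching check by simulation; note that Python's gammavariate takes a scale parameter, so scale = 1/λ.

    # Check that Gamma(s, lam) + Gamma(t, lam) ~ Gamma(s + t, lam) by
    # comparing sample mean/variance with (s+t)/lam and (s+t)/lam^2
    import random, statistics

    random.seed(0)
    s, t, lam, n = 2.0, 3.0, 0.5, 100_000
    sums = [random.gammavariate(s, 1 / lam) + random.gammavariate(t, 1 / lam)
            for _ in range(n)]
    print(statistics.mean(sums), statistics.variance(sums))
    print((s + t) / lam, (s + t) / lam ** 2)  # 10.0 and 20.0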

16
Summing Normal Distributions
  • Similarly, the sum of normal distributions is
    also normal
  • If X_i, i = 1,…,n are independent normally
    distributed random variables with parameters
    (μ_i, σ_i^2), i = 1,…,n, then Σ_i X_i is also
    normally distributed, with parameters
    (Σ_i μ_i, Σ_i σ_i^2)
    (a simulation check follows below)
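Not from the lecture: a minimal simulation of this fact for three normals with made-up parameters.

    # Check that the sum of independent N(mu_i, sigma_i^2) variables has
    # mean sum(mu_i) and variance sum(sigma_i^2); parameters are made up
    import random, statistics

    random.seed(0)
    params = [(1.0, 2.0), (-0.5, 0.5), (3.0, 1.0)]  # (mu_i, sigma_i)
    n = 100_000
    sums = [sum(random.gauss(mu, sigma) for mu, sigma in params)
            for _ in range(n)]
    print(statistics.mean(sums), statistics.variance(sums))
    print(sum(mu for mu, _ in params),             # 3.5
          sum(sigma ** 2 for _, sigma in params))  # 5.25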

17
Discrete Conditional Distributions
  • The obvious definition of the mass function of a
    discrete conditional distribution is
  • p_{X|Y}(x|y) = P(X = x | Y = y)
    = P(X = x, Y = y) / P(Y = y)
    = p(x, y) / p_Y(y)
  • Similarly
  • F_{X|Y}(x|y) = P(X ≤ x | Y = y) = Σ_{a ≤ x} p_{X|Y}(a|y)
    (a small worked example follows below)
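Not from the lecture: a tiny worked example of these definitions on a made-up joint mass function.

    # p_{X|Y}(x|y) = p(x, y) / p_Y(y) on a small hand-made joint pmf
    p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

    def p_Y(y):  # marginal mass of Y
        return sum(v for (xx, yy), v in p.items() if yy == y)

    def p_X_given_Y(x, y):
        return p[(x, y)] / p_Y(y)

    print(p_X_given_Y(0, 1))  # 0.2 / 0.6 = 1/3
    print(p_X_given_Y(1, 1))  # 0.4 / 0.6 = 2/3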

18
Continuous Conditional Distributions
  • There is an obvious difficulty in defining
    continuous conditional distributions, in that the
    conditioning event Y = y has zero probability
  • Nevertheless, by paralleling the discrete case,
    we are able to obtain workable definitions
  • f_{X|Y}(x|y) = f(x, y) / f_Y(y)
  • If we apply small deviations Δx and Δy to this,
    we see that
  • f_{X|Y}(x|y) Δx = [f(x, y) Δx Δy] / [f_Y(y) Δy]
    ≈ P(x ≤ X ≤ x+Δx, y ≤ Y ≤ y+Δy) / P(y ≤ Y ≤ y+Δy)
    = P(x ≤ X ≤ x+Δx | y ≤ Y ≤ y+Δy)

19
Continuous Conditional Distributions
  • From this, we get that
  • P(X ∈ A | Y = y) = ∫_A f_{X|Y}(x|y) dx
  • So if we set A = (-∞, a], then
  • F_{X|Y}(a|y) = P(X ≤ a | Y = y) = ∫_{-∞}^a f_{X|Y}(x|y) dx
  • In other words, the cumulative distribution
    behaves as we might expect
    (a worked example follows below)
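Not from the lecture: a worked example with a genuinely dependent joint density, f(x, y) = e^{-y} on 0 < x < y (my own choice for illustration). Here f_Y(y) = y e^{-y}, so f_{X|Y}(x|y) = 1/y: given Y = y, X is uniform on (0, y).

    # Conditional density f_{X|Y}(x|y) = f(x, y) / f_Y(y) for the
    # illustrative joint density f(x, y) = e^{-y} on 0 < x < y
    import math

    def f_joint(x, y):
        return math.exp(-y) if 0 < x < y else 0.0

    def f_Y(y, dx=0.001):  # marginal of Y by numeric integration over x
        return sum(f_joint(k * dx, y) * dx for k in range(int(y / dx) + 1))

    y = 2.0
    print(f_Y(y), y * math.exp(-y))         # numeric vs exact marginal
    print(f_joint(0.5, y) / f_Y(y), 1 / y)  # f_{X|Y}(0.5|2) vs 1/y = 0.5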

20
Conditional Distributions and Independence
  • If X and Y are independent, we would obviously
    like the conditional density of X, given Y, to be
    just the unconditional density of X
  • Fortunately
  • f_{X|Y}(x|y) = f(x, y) / f_Y(y)
    = f_X(x) f_Y(y) / f_Y(y) = f_X(x)

21
Mixed Conditional Distributions
  • We can also make sense of a conditional
    distribution between a continuous variable X and
    a discrete variable N
  • P(x < X < x+Δx | N = n) / Δx
    = [P(N = n | x < X < x+Δx) / P(N = n)]
      × [P(x < X < x+Δx) / Δx]
  • In the limit, as Δx → 0, this becomes
    [P(N = n | X = x) / P(N = n)] f(x)
  • That is,
  • f_{X|N}(x|n) = [P(N = n | X = x) / P(N = n)] f(x)

22
Beta Distributions and Conditionality
  • We have previously looked at the distribution of
    successes given independent trials with a fixed
    probability of success
  • What, however, of the case where the probability
    of success is itself a random variable?
  • Suppose we conduct n + m trials, where the
    probability of success is chosen from a Beta(1, 1)
    (i.e. uniform) distribution, and that n of them
    are successful
  • Then the probability of success (given N = n) is
    given by a Beta(1 + n, 1 + m) distribution
    (a simulation check follows below)
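Not from the lecture: this posterior can be checked by rejection sampling; the Beta(1+n, 1+m) mean is (1+n)/(n+m+2).

    # Simulate p ~ Uniform(0, 1), run n + m Bernoulli(p) trials, and keep
    # the p values that produced exactly n successes; their distribution
    # should be Beta(1 + n, 1 + m)
    import random, statistics

    random.seed(0)
    n_succ, m_fail = 3, 2  # made-up trial counts
    kept = []
    for _ in range(200_000):
        p = random.random()
        succ = sum(1 for _ in range(n_succ + m_fail) if random.random() < p)
        if succ == n_succ:
            kept.append(p)
    print(statistics.mean(kept))                 # simulated posterior mean
    print((1 + n_succ) / (n_succ + m_fail + 2))  # Beta(4, 3) mean: 4/7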

23
Order Statistics
  • While the properties of particular random
    variables are important, we are also often
    interested in properties of the biggest, the
    smallest, the middle
  • We handle this as follows: given random variables
    X_1, …, X_n, we define
  • X_(1) is the smallest of X_1, …, X_n
  • …
  • X_(j) is the j-th smallest of X_1, …, X_n
  • …
  • X_(n) is the largest of X_1, …, X_n

24
Order Statistics
  • X_(1) ≤ … ≤ X_(n) are known as the order
    statistics
  • The joint density function
    f_{X_(1),…,X_(n)}(x_1, …, x_n) = n! f(x_1) ⋯ f(x_n)
    for x_1 < … < x_n
    is obtained by noting that orderings correspond
    to permutations of the original variables
    X_1, …, X_n
  • The cumulative distribution function of X_(j) is
  • F_{X_(j)}(y) = [n! / ((n - j)! (j - 1)!)]
    × ∫_{-∞}^y F(x)^{j-1} (1 - F(x))^{n-j} f(x) dx
    (a numeric check follows below)
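Not from the lecture: a numeric check of this CDF formula for the 2nd smallest of five Uniform(0, 1) variables, so F(x) = x and f(x) = 1.

    # Compare a simulated P(X_(2) <= y) for n = 5 uniforms against the
    # formula n!/((n-j)!(j-1)!) * integral_0^y x^{j-1} (1-x)^{n-j} dx
    import math, random

    random.seed(0)
    n, j, y = 5, 2, 0.3
    trials = 100_000
    hits = sum(1 for _ in range(trials)
               if sorted(random.random() for _ in range(n))[j - 1] <= y)
    coeff = math.factorial(n) / (math.factorial(n - j) * math.factorial(j - 1))
    dx = 0.0001
    integral = sum((k * dx) ** (j - 1) * (1 - k * dx) ** (n - j) * dx
                   for k in range(int(y / dx)))
    print(hits / trials)     # simulated CDF value
    print(coeff * integral)  # formula value, about 0.472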

25
Summary
  • Joint Distributions
  • Independent Random Variables
  • Summing Distributions
  • Conditional Distributions
  • Discrete
  • Continuous
  • Mixed
  • Order Statistics

26
Thank you