Title: Bivariate Probability Distribution
1. Chapter 5
- Bivariate Probability Distribution
2. Section 5.2
- Bivariate Probability Distributions
3. Bivariate Probability Distributions
- Joint Discrete Random Variables
  - Joint cumulative distribution function (cdf) F(y1, y2)
  - Joint probability mass function (pmf) p(y1, y2)
- Joint Continuous Random Variables
  - Joint cumulative distribution function (cdf) F(y1, y2)
  - Joint probability density function (pdf) f(y1, y2)
4. Joint Probability Mass Function (pmf)
- Definition 5.1 Let Y1 and Y2 be discrete random variables. The joint (or bivariate) probability mass function for Y1 and Y2 is given by
    p(y_1, y_2) = P(Y_1 = y_1, Y_2 = y_2),
  where -\infty < y_1 < \infty and -\infty < y_2 < \infty.
- The function p(y1, y2) is called the joint probability mass function (pmf) of Y1 and Y2.
5. Properties of the pmf
- Theorem 5.1 If Y1 and Y2 are discrete random variables with joint probability mass function p(y1, y2), then
  (1) p(y_1, y_2) \ge 0 for all y1, y2;
  (2) \sum_{y_1} \sum_{y_2} p(y_1, y_2) = 1, where the sum is over all values (y1, y2) that are assigned nonzero probabilities.
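A minimal Python sketch of these two conditions, using a small hypothetical joint pmf (the table values are illustrative, not from the text):

    import numpy as np

    # Hypothetical joint pmf p(y1, y2) for y1, y2 in {0, 1, 2}.
    # Rows index y1, columns index y2.
    p = np.array([[1/9, 2/9, 1/9],
                  [2/9, 2/9, 0.0],
                  [1/9, 0.0, 0.0]])

    assert (p >= 0).all()            # Theorem 5.1 (1): nonnegative
    assert np.isclose(p.sum(), 1.0)  # Theorem 5.1 (2): sums to 1

    print(p[0, 1])  # a single joint probability, P(Y1 = 0, Y2 = 1) = 2/9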
6. Cumulative Distribution Function (cdf)
- Definition 5.2 For any random variables Y1 and Y2, the joint (bivariate) distribution function (or cumulative distribution function, cdf) F(y1, y2) is given by
    F(y_1, y_2) = P(Y_1 \le y_1, Y_2 \le y_2),
  where -\infty < y_1 < \infty and -\infty < y_2 < \infty.
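For a discrete pair, F(y1, y2) is the sum of the pmf over the "southwest" rectangle of the support. A sketch continuing the hypothetical table above:

    import numpy as np

    p = np.array([[1/9, 2/9, 1/9],
                  [2/9, 2/9, 0.0],
                  [1/9, 0.0, 0.0]])

    def cdf(y1, y2):
        # F(y1, y2) = P(Y1 <= y1, Y2 <= y2) for support {0, 1, 2} x {0, 1, 2}
        return p[:y1 + 1, :y2 + 1].sum()

    print(cdf(1, 1))  # 7/9
    print(cdf(2, 2))  # 1.0 over the whole support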
7. Joint Probability Density Function (pdf)
- Definition 5.3 Let Y1 and Y2 be continuous random variables with joint distribution function F(y1, y2). If there exists a nonnegative function f(y1, y2) such that
    F(y_1, y_2) = \int_{-\infty}^{y_1} \int_{-\infty}^{y_2} f(t_1, t_2) \, dt_2 \, dt_1,
  then Y1 and Y2 are said to be jointly continuous random variables. The function f(y1, y2) is called the joint probability density function (pdf).
8. Properties of the cdf
- Theorem 5.2 If Y1 and Y2 are random variables with joint cumulative distribution function F(y1, y2), then
  (1) F(-\infty, -\infty) = F(-\infty, y_2) = F(y_1, -\infty) = 0;
  (2) F(\infty, \infty) = 1;
  (3) if y_1^* \ge y_1 and y_2^* \ge y_2, then
      F(y_1^*, y_2^*) - F(y_1^*, y_2) - F(y_1, y_2^*) + F(y_1, y_2) \ge 0.
9. Properties of the pdf
- Theorem 5.3 If Y1 and Y2 are jointly continuous random variables with a joint density function (pdf) given by f(y1, y2), then
  (1) f(y_1, y_2) \ge 0 for all y1, y2;
  (2) \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(y_1, y_2) \, dy_1 \, dy_2 = 1.
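A numerical sketch of both properties, assuming the density f(y1, y2) = y1 + y2 on the unit square and zero elsewhere (an assumed example, not from the text); scipy's dblquad does the double integral:

    from scipy.integrate import dblquad

    # Assumed joint pdf: f(y1, y2) = y1 + y2 for 0 <= y1, y2 <= 1.
    f = lambda y1, y2: y1 + y2

    total, _ = dblquad(f, 0, 1, lambda x: 0.0, lambda x: 1.0)
    print(total)  # ~1.0, so f is a valid joint density

    # Probabilities are volumes under the surface, e.g. P(Y1 <= 1/2, Y2 <= 1/2):
    prob, _ = dblquad(f, 0, 0.5, lambda x: 0.0, lambda x: 0.5)
    print(prob)   # 0.125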
10. Section 5.3
- Marginal Probability Distribution
- Conditional Probability Distribution
11. Marginal Probability Functions
- Definition 5.4
- (1) Let Y1 and Y2 be jointly discrete random variables with joint probability mass function p(y1, y2). Then the marginal probability functions of Y1 and Y2, respectively, are given by
    p_1(y_1) = \sum_{y_2} p(y_1, y_2)  and  p_2(y_2) = \sum_{y_1} p(y_1, y_2).
- (2) Let Y1 and Y2 be jointly continuous random variables with joint density function f(y1, y2). Then the marginal density functions of Y1 and Y2, respectively, are given by
    f_1(y_1) = \int_{-\infty}^{\infty} f(y_1, y_2) \, dy_2  and  f_2(y_2) = \int_{-\infty}^{\infty} f(y_1, y_2) \, dy_1.
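In the discrete case the marginals are just row and column sums of the joint table. A sketch on the hypothetical pmf used earlier:

    import numpy as np

    p = np.array([[1/9, 2/9, 1/9],
                  [2/9, 2/9, 0.0],
                  [1/9, 0.0, 0.0]])

    p1 = p.sum(axis=1)  # p1(y1): sum over y2 (row sums)
    p2 = p.sum(axis=0)  # p2(y2): sum over y1 (column sums)
    print(p1)           # [4/9, 4/9, 1/9]
    print(p2)           # [4/9, 4/9, 1/9]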
12. Conditional Discrete Probability Function
- Definition 5.5 If Y1 and Y2 are discrete random variables with joint probability mass function p(y1, y2) and marginal probability functions p1(y1) and p2(y2), respectively, then the conditional discrete probability function of Y1 given Y2 is
    p(y_1 \mid y_2) = P(Y_1 = y_1 \mid Y_2 = y_2) = \frac{p(y_1, y_2)}{p_2(y_2)},
  provided that p_2(y_2) > 0.
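A sketch of this definition on the same hypothetical table: dividing each column of the joint table by its marginal probability p2(y2) gives the conditional pmf of Y1 for that value of y2.

    import numpy as np

    p = np.array([[1/9, 2/9, 1/9],
                  [2/9, 2/9, 0.0],
                  [1/9, 0.0, 0.0]])
    p2 = p.sum(axis=0)       # all entries positive here, so division is safe

    cond = p / p2            # column j holds p(y1 | Y2 = j)
    print(cond[:, 1])        # p(y1 | Y2 = 1) = [1/2, 1/2, 0]
    print(cond[:, 1].sum())  # each conditional pmf sums to 1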
13. Conditional Cumulative Distribution Function
- Definition 5.6 If Y1 and Y2 are jointly continuous random variables with joint probability density function f(y1, y2), then the conditional cumulative distribution function of Y1 given Y2 = y2 is
    F(y_1 \mid y_2) = P(Y_1 \le y_1 \mid Y_2 = y_2).
14. Conditional Probability Density Function
- Definition 5.7 Let Y1 and Y2 be jointly continuous random variables with joint probability density function f(y1, y2) and marginal density functions f1(y1) and f2(y2), respectively. For any y2 such that f2(y2) > 0, the conditional density function of Y1 given Y2 = y2 is given by
    f(y_1 \mid y_2) = \frac{f(y_1, y_2)}{f_2(y_2)},
  and, for any y1 such that f1(y1) > 0, the conditional density of Y2 given Y1 = y1 is given by
    f(y_2 \mid y_1) = \frac{f(y_1, y_2)}{f_1(y_1)}.
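For the assumed density f(y1, y2) = y1 + y2 on the unit square, f2(y2) = y2 + 1/2, so f(y1 | y2) = (y1 + y2)/(y2 + 1/2). A sketch checking that this conditional density integrates to 1 over y1 at a fixed y2:

    from scipy.integrate import quad

    y2 = 0.3                        # any fixed y2 with f2(y2) > 0
    f2 = y2 + 0.5                   # marginal of the assumed density
    cond = lambda y1: (y1 + y2) / f2

    area, _ = quad(cond, 0, 1)
    print(area)  # ~1.0: f(y1 | y2) is a genuine density in y1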
15. Section 5.4
- Independent Random Variables
16. Independent Random Variables
- Definition 5.8 Let Y1 and Y2 have marginal cumulative distribution functions F1(y1) and F2(y2), respectively, and joint cumulative distribution function F(y1, y2). Then Y1 and Y2 are said to be independent if and only if
    F(y_1, y_2) = F_1(y_1) F_2(y_2)
  for every pair of real numbers (y1, y2).
17. Properties of Independent Random Variables
- Theorem 5.4 If Y1 and Y2 are discrete random variables with joint probability mass function p(y1, y2) and marginal probability functions p1(y1) and p2(y2), respectively, then Y1 and Y2 are independent if and only if
    p(y_1, y_2) = p_1(y_1) p_2(y_2)
  for all pairs of real numbers (y1, y2).
- If Y1 and Y2 are continuous random variables with joint probability density function f(y1, y2) and marginal density functions f1(y1) and f2(y2), respectively, then Y1 and Y2 are independent if and only if
    f(y_1, y_2) = f_1(y_1) f_2(y_2)
  for all pairs of real numbers (y1, y2).
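Theorem 5.4 gives a direct computational test for a discrete pair: compare the joint table with the outer product of its marginals. On the hypothetical table used above the test fails, so Y1 and Y2 are dependent:

    import numpy as np

    p = np.array([[1/9, 2/9, 1/9],
                  [2/9, 2/9, 0.0],
                  [1/9, 0.0, 0.0]])
    p1, p2 = p.sum(axis=1), p.sum(axis=0)

    # Independent iff p(y1, y2) = p1(y1) * p2(y2) for every pair.
    print(np.allclose(p, np.outer(p1, p2)))
    # False: e.g. p(2, 2) = 0 but p1(2) * p2(2) = 1/81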
18. Theorem for Independence
- Let Y1 and Y2 have joint probability density function f(y1, y2) with
    f(y_1, y_2) > 0  if a \le y_1 \le b and c \le y_2 \le d,
    f(y_1, y_2) = 0  elsewhere.
  Then Y1 and Y2 are independent if and only if
    f(y_1, y_2) = g(y_1) h(y_2),
  where g(y1) is a nonnegative function of y1 alone and h(y2) is a nonnegative function of y2 alone. For example, f(y1, y2) = 6 y1 y2^2 on the unit square factors as g(y1) = 6 y1 times h(y2) = y2^2, so Y1 and Y2 are independent.
19. Section 5.5
- Expected Value of a Function of Random Variables
20. Expected Value of a Function of Random Variables
- Definition 5.9
- (1) Let g(Y1, Y2) be a function of the discrete random variables Y1 and Y2 with joint probability mass function p(y1, y2). Then the expected value of g(Y1, Y2) is
    E[g(Y_1, Y_2)] = \sum_{y_1} \sum_{y_2} g(y_1, y_2) \, p(y_1, y_2).
- (2) If Y1 and Y2 are continuous random variables with joint density function f(y1, y2), then the expected value of g(Y1, Y2) is
    E[g(Y_1, Y_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(y_1, y_2) \, f(y_1, y_2) \, dy_1 \, dy_2.
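A sketch of part (1) on the hypothetical table, taking g(y1, y2) = y1 * y2 so that the result is E[Y1 Y2]:

    import numpy as np

    p = np.array([[1/9, 2/9, 1/9],
                  [2/9, 2/9, 0.0],
                  [1/9, 0.0, 0.0]])
    vals = np.arange(3)                              # support {0, 1, 2}
    y1, y2 = np.meshgrid(vals, vals, indexing="ij")

    # E[g(Y1, Y2)] = sum of g(y1, y2) * p(y1, y2) over the support
    print((y1 * y2 * p).sum())                       # E[Y1 Y2] = 2/9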
21. Section 5.6
- Properties of Expected Value
22. Properties of Expected Value
- Theorem 5.6 Let c be a constant. Then E[c] = c.
- Theorem 5.7 Let g(Y1, Y2) be a function of the random variables Y1 and Y2, and let c be a constant. Then
    E[c \, g(Y_1, Y_2)] = c \, E[g(Y_1, Y_2)].
- Theorem 5.8 Let g1(Y1, Y2), g2(Y1, Y2), ..., gk(Y1, Y2) be functions of the random variables Y1 and Y2, and let c1, c2, ..., ck be constants. Then
    E[c_1 g_1(Y_1, Y_2) + \cdots + c_k g_k(Y_1, Y_2)] = c_1 E[g_1(Y_1, Y_2)] + \cdots + c_k E[g_k(Y_1, Y_2)].
23. Expectation of Independent Random Variables
- Let Y1 and Y2 be independent random variables and let g(Y1) and h(Y2) be functions of Y1 only and Y2 only, respectively. Then
    E[g(Y_1) h(Y_2)] = E[g(Y_1)] \, E[h(Y_2)],
  provided that the expectations exist.
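A sketch verifying the product rule on an assumed independent pair: the joint table is built as the outer product of two made-up marginals (so independence holds by construction), and E[g(Y1) h(Y2)] is compared with E[g(Y1)] E[h(Y2)]:

    import numpy as np

    p1 = np.array([0.2, 0.5, 0.3])   # assumed marginal of Y1 on {0, 1, 2}
    p2 = np.array([0.6, 0.3, 0.1])   # assumed marginal of Y2 on {0, 1, 2}
    p = np.outer(p1, p2)             # independent joint pmf

    vals = np.arange(3)
    g = vals**2                      # g(Y1) = Y1^2
    h = vals + 1                     # h(Y2) = Y2 + 1

    lhs = (np.outer(g, h) * p).sum()       # E[g(Y1) h(Y2)]
    rhs = (g * p1).sum() * (h * p2).sum()  # E[g(Y1)] E[h(Y2)]
    print(np.isclose(lhs, rhs))            # True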
24. Section 5.7
- Covariance of Two Random Variables
25. Covariance and Correlation Coefficient
- Definition 5.10 If Y1 and Y2 are random variables with means \mu_1, \mu_2 and standard deviations \sigma_1, \sigma_2, respectively, the covariance of Y1 and Y2 is given by
    Cov(Y_1, Y_2) = E[(Y_1 - \mu_1)(Y_2 - \mu_2)].
- Correlation Coefficient
    \rho = \frac{Cov(Y_1, Y_2)}{\sigma_1 \sigma_2}
- Values of \rho and their interpretation: -1 \le \rho \le 1.
26. Calculation of Covariance
- Theorem 5.10 Let Y1 and Y2 be random variables with means \mu_1 and \mu_2, respectively. Then
    Cov(Y_1, Y_2) = E[(Y_1 - \mu_1)(Y_2 - \mu_2)] = E[Y_1 Y_2] - E[Y_1] E[Y_2] = E[Y_1 Y_2] - \mu_1 \mu_2.
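A sketch computing the covariance of the hypothetical table both from Definition 5.10 and from the Theorem 5.10 shortcut; the two agree:

    import numpy as np

    p = np.array([[1/9, 2/9, 1/9],
                  [2/9, 2/9, 0.0],
                  [1/9, 0.0, 0.0]])
    vals = np.arange(3)
    p1, p2 = p.sum(axis=1), p.sum(axis=0)
    mu1, mu2 = (vals * p1).sum(), (vals * p2).sum()   # both 2/3

    y1, y2 = np.meshgrid(vals, vals, indexing="ij")
    cov_def = ((y1 - mu1) * (y2 - mu2) * p).sum()     # definition
    cov_short = (y1 * y2 * p).sum() - mu1 * mu2       # shortcut
    print(cov_def, cov_short)                         # both -2/9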
27. Covariance of Independent Random Variables
- Theorem 5.11 Let Y1 and Y2 be independent random variables. Then
    Cov(Y_1, Y_2) = 0.
- If the covariance of two random variables is zero, the variables need not be independent.
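The failure of the converse is easy to exhibit. A sketch with the classic counterexample: Y1 uniform on {-1, 0, 1} and Y2 = Y1^2. The covariance is zero, yet Y2 is a function of Y1, so the pair is clearly dependent:

    import numpy as np

    # Joint pmf: rows y1 in (-1, 0, 1), columns y2 in (0, 1).
    p = np.array([[0.0, 1/3],
                  [1/3, 0.0],
                  [0.0, 1/3]])
    y1_vals = np.array([-1, 0, 1])
    y2_vals = np.array([0, 1])

    p1, p2 = p.sum(axis=1), p.sum(axis=0)
    mu1 = (y1_vals * p1).sum()                    # 0
    mu2 = (y2_vals * p2).sum()                    # 2/3
    e12 = (np.outer(y1_vals, y2_vals) * p).sum()  # E[Y1 Y2] = 0

    print(e12 - mu1 * mu2)                        # Cov(Y1, Y2) = 0
    print(np.allclose(p, np.outer(p1, p2)))       # False: dependent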
28. Homework
- p. 253: 5.75, 5.77, 5.78, 5.81
29. Section 5.8
- Expected Value and Variance of Linear Functions
of Random Variables
30. Expected Value and Variance of Linear Functions of Random Variables
- Let Y1, Y2, ..., Yn and X1, X2, ..., Xm be random variables with E[Y_i] = \mu_i and E[X_j] = \xi_j. Define
    U_1 = \sum_{i=1}^{n} a_i Y_i  and  U_2 = \sum_{j=1}^{m} b_j X_j
  for constants a1, a2, ..., an and b1, b2, ..., bm. Then the following hold:
  (a) E[U_1] = \sum_{i=1}^{n} a_i \mu_i;
  (b) V(U_1) = \sum_{i=1}^{n} a_i^2 V(Y_i) + 2 \sum_{i < j} a_i a_j Cov(Y_i, Y_j), where the second sum is over all pairs with i < j;
  (c) Cov(U_1, U_2) = \sum_{i=1}^{n} \sum_{j=1}^{m} a_i b_j Cov(Y_i, X_j).
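A Monte Carlo sketch of (a)-(c), under an assumed jointly normal pair with known mean vector and covariance matrix (all values here are made up for illustration); the sample moments should land close to the formulas:

    import numpy as np

    rng = np.random.default_rng(0)
    mu = np.array([1.0, 2.0])                   # assumed means
    Sigma = np.array([[2.0, 0.6],
                      [0.6, 1.0]])              # assumed covariance matrix
    a = np.array([3.0, -2.0])                   # coefficients for U1
    b = np.array([1.0, 1.0])                    # coefficients for U2

    Y = rng.multivariate_normal(mu, Sigma, size=1_000_000)
    U1, U2 = Y @ a, Y @ b

    print(U1.mean(), a @ mu)                    # (a): E[U1] = sum a_i mu_i
    print(U1.var(), a @ Sigma @ a)              # (b): V(U1) from the double sum
    print(np.cov(U1, U2)[0, 1], a @ Sigma @ b)  # (c): Cov(U1, U2)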
31. Expected Value and Variance of Linear Functions of Two Random Variables
- Let Y1 and Y2 be random variables with E[Y_1] = \mu_1 and E[Y_2] = \mu_2. Define
    U = a_1 Y_1 + a_2 Y_2
  for constants a1 and a2. Then the following hold:
  (a) E[U] = a_1 \mu_1 + a_2 \mu_2;
  (b) V(U) = a_1^2 V(Y_1) + a_2^2 V(Y_2) + 2 a_1 a_2 Cov(Y_1, Y_2).
- Special case: if a_1 = a_2 = 1, then
    V(Y_1 + Y_2) = V(Y_1) + V(Y_2) + 2 Cov(Y_1, Y_2).
32. Homework