Title: Chapter 2: Discrete Random Variables
1. Chapter 2. Discrete Random Variables
2. Random Variable
- A random variable X(ω) is a real-valued function defined for points ω in a sample space Ω.
- Example: If Ω is the whole class of students, and ω is an individual student, we may define X(ω) as the height of that student (in feet).
- Question: What is the probability that a student's height is between 5 feet and 6 feet?
- Define B = [5, 6]. Our goal is to find the probability
  P({ω ∈ Ω : 5 ≤ X(ω) ≤ 6}) = P({ω ∈ Ω : X(ω) ∈ B}) = P(X ∈ B)
- In general, we are interested in the probability P({X ∈ B}), or for convenience, P(X ∈ B).
- If B = {x0} is a singleton set, we may simply write P(X = x0).
- Example 2.1
  - P(a < X ≤ b) = P(X ≤ b) - P(X ≤ a)
- Example 2.2
  - P(X = 0 or X = 1) = P(X = 0) + P(X = 1)
3. Discrete Random Variable
- X is a discrete random variable if there exists a set of distinct real numbers {xi} such that
  Σi P(X = xi) = 1.
- If B is a subset of real numbers, then
  P(X ∈ B) = Σi I_B(xi) P(X = xi),
  where I_B(x) = 1 if x ∈ B and I_B(x) = 0 otherwise is the indicator function.
- If the xi are integers, then X is an integer-valued random variable.
- Examples
  - In tossing a coin, we may define X = 0 if the outcome is a tail, and X = 1 if the outcome is a head.
  - In tossing a fair die, we may define X to be the outcome of the toss. In particular, since the probability is the same for each face, we have
    P(X = k) = 1/6, k = 1, 2, ..., 6.
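The indicator-function sum above can be sketched in a few lines of Python (the fair die and the even-outcome event B are taken from the example; the slides' own demos use MATLAB):

```python
# P(X in B) for a fair die, via the indicator-function sum.
pmf = {k: 1/6 for k in range(1, 7)}   # P(X = k) = 1/6, k = 1, ..., 6

def prob_in_B(pmf, B):
    # P(X in B) = sum_i I_B(x_i) * P(X = x_i)
    return sum(p for x, p in pmf.items() if x in B)

B = {2, 4, 6}                         # the event "outcome is even"
print(prob_in_B(pmf, B))              # ≈ 0.5
```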
4. Multiple Random Variables
- Let X and Y be random variables defined on points of the same sample space Ω, and let B and C be two subsets of real numbers. Our goal is to find
  P(X ∈ B, Y ∈ C)
  = P({ω ∈ Ω : X(ω) ∈ B, Y(ω) ∈ C})
  = P({X ∈ B} ∩ {Y ∈ C})
  = P({ω ∈ Ω : X(ω) ∈ B} ∩ {ω ∈ Ω : Y(ω) ∈ C})
- X and Y are independent random variables iff
  P(X ∈ B, Y ∈ C) = P(X ∈ B) P(Y ∈ C)
  for all sets B and C.
- n random variables X1, X2, ..., Xn are independent random variables iff
  P(X1 ∈ B1, ..., Xn ∈ Bn) = P(X1 ∈ B1) ... P(Xn ∈ Bn)
  for all sets B1, B2, ..., Bn.
- If, for every B, P(Xi ∈ B) does not depend on i, then the Xi are identically distributed. If the Xi are also independent, they are said to be i.i.d. (independent and identically distributed).
5. Example 2.5
- Given
  - 10 phones in a neighborhood. The # of phones simultaneously in use is random. The usage on different days is independent.
- Find
  - P(A) = P(on all five days at 9 AM, # of phones used < 3)
  - P(B) = P(on one or more of the five days at 9 AM, # of phones used ≥ 2)
- Solution: Denote by Xi the # of phones used at 9 AM on the i-th day, for i = 1, 2, 3, 4, 5.
- Here Ω = {0, 1, ..., 10}, and Xi(ω) = ω for ω ∈ Ω.
- The Xi are identically distributed (same for each day) and independent (given). Hence, they are i.i.d.
- Therefore
  P(A) = P(X1 < 3, ..., X5 < 3) = P(X1 < 3)^5, and
  P(B) = 1 - P(X1 < 2, ..., X5 < 2) = 1 - P(X1 < 2)^5.
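A minimal numeric sketch of the i.i.d. argument above. The slide does not specify the per-day pmf, so a binomial(10, 0.2) model for the # of phones in use is assumed here purely for illustration:

```python
from math import comb

# Hypothetical per-day pmf: # of phones in use ~ binomial(10, 0.2).
n, q = 10, 0.2
pmf = [comb(n, k) * q**k * (1 - q)**(n - k) for k in range(n + 1)]

p_lt3 = sum(pmf[:3])            # P(X1 < 3)
p_lt2 = sum(pmf[:2])            # P(X1 < 2)

P_A = p_lt3 ** 5                # i.i.d. across the five days
P_B = 1 - p_lt2 ** 5            # complement of "all five days < 2"
print(P_A, P_B)
```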
6. A Max./Min. Example
- Example 2.6: Let X1, X2, ..., Xn be a set of n independent random variables. Evaluate
  - P(max(X1, X2, ..., Xn) ≤ z) and
  - P(min(X1, X2, ..., Xn) ≤ z)
- Answer: Note that the events
  {max(X1, ..., Xn) ≤ z} = {X1 ≤ z, ..., Xn ≤ z}, and
  {min(X1, ..., Xn) > z} = {X1 > z, ..., Xn > z}.
- Hence
  P(max(X1, ..., Xn) ≤ z) = P(X1 ≤ z) ... P(Xn ≤ z), and
  P(min(X1, ..., Xn) ≤ z) = 1 - P(X1 > z) ... P(Xn > z).
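A quick seeded simulation check of the max formula, assuming (for concreteness) n = 3 i.i.d. fair-die rolls and threshold z = 4:

```python
import random

# Estimate P(max(X1, X2, X3) <= 4) for three fair-die rolls and compare
# with the product formula P(X1 <= 4)^3 = (4/6)^3.
random.seed(0)
n, z, trials = 3, 4, 200_000

count_max = sum(
    max(random.randint(1, 6) for _ in range(n)) <= z
    for _ in range(trials)
)
est = count_max / trials
exact = (4 / 6) ** n
print(est, exact)               # the estimate should be close to the product
```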
7. More r.v. Examples
- Example 2.7 (rephrased): In an experiment, a series of coin-tossing trials is performed. In the k-th trial, if the head side is up, the (k+1)-th trial is performed; if the tail side is up, the trials stop. Let T be the total number of trials until the stop. Find P(T = k).
- Solution: Let Xi be the outcome of the i-th trial: Xi = 1 if the head side is up, Xi = 0 if the tail side is up. Then
  P(T = k) = P(X1 = 1, ..., X(k-1) = 1, Xk = 0).
- Since the Xi, 1 ≤ i ≤ k, are k independent, identically distributed random variables,
  P(T = k) = P(X1 = 1)^(k-1) P(Xk = 0) = p^(k-1) (1 - p), where p = P(head).
- In this example, the sample space Ω consists of the outcomes of tossing a coin from one to infinitely many times. ω1 is the outcome of a single trial with the tail side up. ωk is the outcome of k - 1 trials with the head side up followed by a tail side up.
- T(ωk) = # of trials in ωk = k.
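The stopping-time pmf above can be checked by simulation. A fair coin (p = 0.5) is assumed here only to make the check concrete; the slide leaves p general:

```python
import random

# Simulate T = number of tosses until the first tail, and compare the
# empirical P(T = 2) with the formula p^(k-1) * (1 - p) at k = 2.
random.seed(1)
p, trials = 0.5, 100_000

def run():
    t = 0
    while True:
        t += 1
        if random.random() >= p:   # tail comes up -> stop
            return t

counts = {}
for _ in range(trials):
    t = run()
    counts[t] = counts.get(t, 0) + 1

est = counts.get(2, 0) / trials
exact = p ** (2 - 1) * (1 - p)     # P(T = 2)
print(est, exact)
```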
8. Types of Random Variables
- Bernoulli Random Variable
  - X ~ Bernoulli(p) if X ∈ {0, 1} and P(X = 1) = p.
- Geometric Random Variable
  - X ~ geometric0(p) if X ∈ {0, 1, 2, ...} and
    P(X = k) = (1 - p) p^k, k = 0, 1, ...
  - X ~ geometric1(p) if X ∈ {1, 2, ...} and
    P(X = k) = (1 - p) p^(k-1), k = 1, 2, ...
- Poisson Random Variable
  - X ~ Poisson(λ) if, for λ > 0,
    P(X = k) = λ^k e^(-λ) / k!, k = 0, 1, 2, ...
- Binomial Random Variable
  - X ~ binomial(n, p) if
    P(X = k) = C(n, k) p^k (1 - p)^(n-k), k = 0, 1, ..., n.
- Demonstration: MATLAB m-file rv.m
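A Python sketch of the four pmfs listed above (the slides' demo rv.m is a MATLAB file; this is an illustrative stand-in, not that file):

```python
from math import comb, exp, factorial

# The four pmfs from the slide, as plain functions of k and the parameters.
def bernoulli(k, p):   return p if k == 1 else 1 - p
def geometric0(k, p):  return (1 - p) * p**k              # k = 0, 1, ...
def geometric1(k, p):  return (1 - p) * p**(k - 1)        # k = 1, 2, ...
def poisson(k, lam):   return lam**k * exp(-lam) / factorial(k)
def binomial(k, n, p): return comb(n, k) * p**k * (1 - p)**(n - k)

# Sanity check: each pmf sums to 1 over its support (infinite sums truncated).
print(sum(binomial(k, 10, 0.3) for k in range(11)))       # ≈ 1.0
```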
9. Plots of PMFs
10. Probability Mass Functions (PMF) and Expectations
- The probability mass function (PMF) of a discrete random variable X is defined by pX(xi) = P(X = xi).
- Hence, pX(xi) ≥ 0 and Σi pX(xi) = 1.
- Joint PMF of X and Y: pXY(xi, yj) = P(X = xi, Y = yj)
- Marginal PMFs: pX(xi) = Σj pXY(xi, yj), pY(yj) = Σi pXY(xi, yj)
- Expectation (mean, average): E[X] = Σi xi pX(xi)
11. Properties of Expectations
- If X is a r.v. and B is any set, then the indicator function IB(X) is a Bernoulli r.v., and
  E[IB(X)] = P(IB(X) = 1) = P(X ∈ B)
- Law of the unconscious statistician (LOTUS)
  - If a r.v. X has a p.m.f. pX(x), and g(x) is a real-valued function of the real variable x, then
    E[g(X)] = Σi g(xi) pX(xi)
- Using LOTUS, we have
  - E[X + Y] = E[X] + E[Y]
  - E[aX] = a E[X], where a is a constant.
  - Hence, expectation is a linear operator.
- If X and Y are independent random variables, and g(X), h(Y) are functions of X, Y respectively, then
  E[g(X) h(Y)] = E[g(X)] E[h(Y)]
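The factorization property for independent X and Y can be verified numerically on a toy example. The marginal pmfs and the functions g, h below are hypothetical, chosen only to exercise the identity:

```python
# Verify E[g(X)h(Y)] = E[g(X)] E[h(Y)] when the joint pmf factors as pX * pY.
pX = {0: 0.3, 1: 0.7}                 # hypothetical marginal pmf of X
pY = {1: 0.5, 2: 0.5}                 # hypothetical marginal pmf of Y
g = lambda x: x + 1
h = lambda y: y * y

lhs = sum(g(x) * h(y) * px * py for x, px in pX.items() for y, py in pY.items())
rhs = (sum(g(x) * px for x, px in pX.items())
       * sum(h(y) * py for y, py in pY.items()))
print(lhs, rhs)                       # equal, since the joint pmf factors
```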
12. Examples of Expectations
- Example 2.10: The mean of a Bernoulli r.v. X is
  E[X] = 0 · (1 - p) + 1 · p = p
- Example 2.11: The mean of a Poisson(λ) r.v. X is
  E[X] = Σ(k=0..∞) k λ^k e^(-λ) / k! = λ e^(-λ) Σ(k=1..∞) λ^(k-1) / (k - 1)! = λ
13. Moments and Standard Deviation
- n-th moment: E[X^n]
  - Defined for a real-valued random variable X.
- The standard deviation is √var(X), where var(X) is the variance.
- Let m = E[X]; then
  var(X) = E[(X - m)^2]
         = E[X^2 - 2Xm + m^2]
         = E[X^2] - 2m E[X] + m^2
         = E[X^2] - m^2
         = E[X^2] - (E[X])^2
- Example 2.13: Find E[X^2] and var(X) of a Bernoulli r.v. X.
  - E[X^2] = 0^2 · (1 - p) + 1^2 · p = p
  - Since E[X] = p,
    var(X) = E[X^2] - (E[X])^2 = p - p^2 = p(1 - p)
- Example 2.14: Let X ~ Poisson(λ).
  - Since E[X(X - 1)] = λ^2, we have E[X^2] = λ^2 + λ. Thus,
    var(X) = (λ^2 + λ) - λ^2 = λ.
14. Probability Generating Function
- Let X be a random variable taking only non-negative integer values. The probability generating function (pgf) of X is defined as
  GX(z) = E[z^X] = Σ(k=0..∞) z^k P(X = k)
- GX has a power series expansion with radius of convergence at least 1; in particular, |GX(z)| ≤ 1 for |z| ≤ 1.
- Generating p.m.f.s:
  P(X = k) = GX^(k)(0) / k!  (the k-th derivative of GX at z = 0)
- Generating factorial moments:
  GX'(1) = E[X], GX''(1) = E[X(X - 1)], and in general
  GX^(n)(1) = E[X(X - 1) ... (X - n + 1)].
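As a sketch of the series definition, the pmf-weighted power series for a geometric0(p) r.v. can be summed and compared against its closed-form pgf, GX(z) = (1 - p)/(1 - pz) (the standard geometric-series result):

```python
# Compare the series sum of the geometric0(p) pmf with the closed-form pgf.
p = 0.4
pmf = [(1 - p) * p**k for k in range(200)]     # truncated pmf of geometric0(p)

def G(z):
    # GX(z) = sum_k z^k P(X = k)
    return sum(z**k * pk for k, pk in enumerate(pmf))

for z in (0.0, 0.5, 1.0):
    print(z, G(z), (1 - p) / (1 - p * z))      # series ≈ closed form
```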
15. Examples of PGF
- Example 2.16: A communication system has n links, and P(link i fails) = p. Find
  - P(exactly k links fail) = ?
- Solution: Let Xi = 1 if link i fails and Xi = 0 otherwise. Then Xi ~ Bernoulli(p) and the Xi are i.i.d. Define
  Y = X1 + X2 + ... + Xn,
  so that P(exactly k links fail) = P(Y = k).
- Now, use the expectation property for independent random variables:
  GY(z) = E[z^(X1 + ... + Xn)]
        = E[z^X1] E[z^X2] ... E[z^Xn]
- But E[z^Xi] = z^0 (1 - p) + z^1 p = (1 - p) + pz
- Hence, GY(z) = (1 - p + pz)^n
- Thus, for k ≤ n,
  pY(k) = GY^(k)(0) / k! = C(n, k) p^k (1 - p)^(n-k),
  and pY(k) = 0 for k > n. Y is a binomial random variable!
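The conclusion can be verified by expanding GY(z) = (1 - p + pz)^n as a polynomial and reading off its coefficients, which should match the binomial(n, p) pmf (n = 5, p = 0.3 are arbitrary values for the check):

```python
from math import comb

# Expand (1 - p + p z)^n by repeated polynomial multiplication;
# coefficient k of the result is pY(k).
n, p = 5, 0.3
poly = [1.0]                           # the constant polynomial 1
for _ in range(n):                     # multiply by (1 - p) + p z, n times
    nxt = [0.0] * (len(poly) + 1)
    for i, c in enumerate(poly):
        nxt[i] += c * (1 - p)
        nxt[i + 1] += c * p
    poly = nxt

binom = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
print(all(abs(a - b) < 1e-12 for a, b in zip(poly, binom)))   # True
```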
16. Conditional Probability
- The conditional probability is defined as follows:
  P(X ∈ B | Y ∈ C) = P(X ∈ B, Y ∈ C) / P(Y ∈ C)
- In terms of pmfs, we have
  pX|Y(xi | yj) = pXY(xi, yj) / pY(yj)
- Example 2.17: Let X = message to be sent (an integer). For X = i, light of intensity λi is directed at a photodetector, and Y = # of photo-electrons generated at the detector, with Y | X = i ~ Poisson(λi).
- Solution: For n = 0, 1, 2, ...
  P(Y = n | X = i) = λi^n e^(-λi) / n!
- Thus, P(Y < 2 | X = i) = P(Y = 0 | X = i) + P(Y = 1 | X = i) = e^(-λi) (1 + λi)
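A one-line numeric check of the final identity (λi = 1.5 is a hypothetical intensity, since the slide leaves it general):

```python
from math import exp, factorial

# Check P(Y < 2 | X = i) = e^(-lam)(1 + lam) for Y | X = i ~ Poisson(lam).
lam = 1.5                                  # hypothetical intensity lambda_i
p0 = lam**0 * exp(-lam) / factorial(0)     # P(Y = 0 | X = i)
p1 = lam**1 * exp(-lam) / factorial(1)     # P(Y = 1 | X = i)
print(p0 + p1, exp(-lam) * (1 + lam))      # the two sides agree
```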
17. Law of Total Probability; Substitution Law
- Law of Total Probability: Recall
  P(X ∈ B) = Σj P(X ∈ B | Y = yj) P(Y = yj)
- Similarly, we have
  pX(x) = Σj pX|Y(x | yj) pY(yj)
- Now, let Z = X + Y; then
  pZ(z) = Σj P(Z = z | Y = yj) pY(yj)
- Substitution Law:
  P(X + Y = z | Y = yj) = P(X = z - yj | Y = yj)
- Thus,
  pZ(z) = Σj pX|Y(z - yj | yj) pY(yj)
- If X and Y are independent, so that pX|Y(x | y) = pX(x), then
  pZ(z) = Σj pX(z - yj) pY(yj):
  the pmf of Z is the convolution of those of X and Y!
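The convolution formula above can be sketched directly for integer-valued pmfs. Two fair coins are used as a concrete example, giving the familiar pmf of the number of heads in two tosses:

```python
# pmf of Z = X + Y for independent integer-valued X and Y, by convolution.
def convolve_pmf(pX, pY):
    # pX, pY: dicts mapping integer values to probabilities
    pZ = {}
    for x, px in pX.items():
        for y, py in pY.items():
            pZ[x + y] = pZ.get(x + y, 0.0) + px * py
    return pZ

pX = {0: 0.5, 1: 0.5}             # fair coin
pY = {0: 0.5, 1: 0.5}             # another independent fair coin
print(convolve_pmf(pX, pY))       # {0: 0.25, 1: 0.5, 2: 0.25}
```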
18. Examples of the Law of Total Probability
- Example 2.19: Y | X = k ~ Poisson(k), where the sample size k is drawn as X ~ geometric1(p). Find P(Y = 0) and P(X = 1 | Y = 0).
- Solution: Note that for n = 0, 1, ...
  P(Y = n | X = k) = k^n e^(-k) / n!
- Thus, P(Y = 0 | X = k) = e^(-k). Using the law of total probability, we have
  P(Y = 0) = Σ(k=1..∞) e^(-k) (1 - p) p^(k-1) = (1 - p) e^(-1) / (1 - p e^(-1)),
  and, by the definition of conditional probability,
  P(X = 1 | Y = 0) = P(Y = 0 | X = 1) P(X = 1) / P(Y = 0) = 1 - p e^(-1).
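A numeric check of the closed form for P(Y = 0), truncating the geometric sum (p = 0.3 is an arbitrary value for the check):

```python
from math import exp

# P(Y = 0) = sum_k e^(-k) (1 - p) p^(k-1) vs. its geometric-series closed form.
p, N = 0.3, 500
total = sum(exp(-k) * (1 - p) * p**(k - 1) for k in range(1, N))
closed = (1 - p) * exp(-1) / (1 - p * exp(-1))
print(total, closed)                  # the truncated sum matches the closed form
```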
19. Example of the Substitution Law
- Example 2.21
  - A random, integer-valued signal X is transmitted over a noisy channel and suffers integer-valued, additive noise Y independent of X. The received signal is Z = X + Y. Find the conditional pmf pX|Z.
- Solution
  - Let X ~ pX and Y ~ pY. Then
    pX|Z(i | j) = P(X = i, Z = j) / P(Z = j).
  - Using the substitution law, we have
    P(X = i, Z = j) = P(X = i, X + Y = j) = P(X = i, Y = j - i) = pX(i) pY(j - i).
  - The last equality holds because Y and X are independent. Thus,
    pZ(j) = Σk pX(k) pY(j - k).
  - Therefore,
    pX|Z(i | j) = pX(i) pY(j - i) / Σk pX(k) pY(j - k).
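The final formula can be sketched as a small "posterior" computation. The signal and noise pmfs below are hypothetical, chosen only to exercise the formula:

```python
# p_{X|Z}(i | j) for Z = X + Y with independent X, Y.
def posterior(pX, pY, j):
    # Numerators pX(i) * pY(j - i), normalized by pZ(j) = sum_k pX(k) pY(j - k).
    joint = {i: px * pY.get(j - i, 0.0) for i, px in pX.items()}
    z = sum(joint.values())
    return {i: v / z for i, v in joint.items() if v > 0}

pX = {0: 0.5, 1: 0.5}              # hypothetical signal pmf
pY = {-1: 0.25, 0: 0.5, 1: 0.25}   # hypothetical noise pmf
print(posterior(pX, pY, 1))        # P(X = 0 | Z = 1) and P(X = 1 | Z = 1)
```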
20. Conditional Expectation
- Conditional expectation:
  E[X | Y = yj] = Σi xi pX|Y(xi | yj)
- Since P(X = xk, X = xi) = 0 for k ≠ i, we have E[g(X) | X = xi] = g(xi).
- Using the substitution law, we have
  E[g(X, Y) | Y = yj] = E[g(X, yj) | Y = yj]
- Using the law of total probability,
  E[X] = Σj E[X | Y = yj] P(Y = yj)
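The total-expectation identity can be checked on a small joint pmf (the joint pmf below is hypothetical, chosen only to exercise the identity):

```python
# Check E[X] = sum_j E[X | Y = yj] P(Y = yj) on a toy joint pmf.
pXY = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.4}  # hypothetical

pY = {}
for (x, y), prob in pXY.items():          # marginal pmf of Y
    pY[y] = pY.get(y, 0.0) + prob

EX = sum(x * prob for (x, y), prob in pXY.items())          # direct E[X]

total = 0.0
for yj, pyj in pY.items():                # conditional-expectation route
    EX_given = sum(x * prob / pyj for (x, y), prob in pXY.items() if y == yj)
    total += EX_given * pyj
print(EX, total)                          # the two routes agree
```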
21. Example on Conditional Expectation
- Problem: The # of defects on a chip is X ~ Poisson(λ). Each defect falls within a region R on the chip with probability p, and the locations of defects are independent of each other. Find the expected value of the # of defects in R, denoted by Y.
- Solution: Given X = k defects on the chip, Y = # of the k defects that fall in R is a binomial(k, p) random variable, so E[Y | X = k] = kp. By the law of total probability,
  E[Y] = Σk E[Y | X = k] P(X = k) = p Σk k P(X = k) = p E[X] = pλ.
- Hence, E[Y] = pλ. Alternatively, one can show directly that Y ~ Poisson(pλ), which again gives E[Y] = pλ.
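A seeded simulation check of E[Y] = pλ, assuming λ = 4 and p = 0.25 purely for illustration (the Poisson sampler is Knuth's multiplication method):

```python
import math
import random

# X ~ Poisson(lam) defects; each falls in region R with probability p.
# The sample mean of Y = # defects in R should be close to p * lam.
random.seed(2)
lam, p, trials = 4.0, 0.25, 50_000

def poisson_sample(lam):
    # Knuth's method: count uniforms until their product drops below e^(-lam).
    L, k, prod = math.exp(-lam), 0, 1.0
    while prod > L:
        k += 1
        prod *= random.random()
    return k - 1

total = 0
for _ in range(trials):
    x = poisson_sample(lam)                                   # defects on chip
    total += sum(1 for _ in range(x) if random.random() < p)  # defects in R

est = total / trials
print(est, p * lam)               # sample mean ≈ p * lam = 1.0
```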