Continuous Random Variables and Their Probability Distributions
1
Chapter 4
  • Continuous Random Variables and Their Probability
    Distributions

2
Sec 4.2
  • Distribution Functions

3
Cumulative Distribution Function
  • Definition Let Y be any random variable. The
    distribution function (cumulative distribution
    function) of Y, F(y), is defined by F(y) =
    P(Y ≤ y).
  • Note This function is defined for both
    continuous and discrete random variables.

4
Properties of any cdf F(y)
  • Let Y be a random variable with distribution
    function F(y). Then
  • For all a and b, if a < b then F(a) ≤ F(b)
    (F is a nondecreasing function)
  • lim F(y) = 0 as y → -∞ (lower limit of F is 0)
  • lim F(y) = 1 as y → ∞ (upper limit of F is 1)
  • lim F(y + h) = F(y) as h → 0 from the right
    (F is right continuous)
  • Let Y be a random variable with distribution
    function F(y). Then for a < b, P(a < Y ≤ b) =
    F(b) - F(a).
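The properties above can be checked numerically for a concrete case. A minimal sketch, using the cdf of an exponential random variable with mean 1 as the example (this choice of distribution is illustrative, not from the slides):

```python
import math

def F(y):
    """CDF of an exponential random variable with mean 1 (an example choice)."""
    return 1.0 - math.exp(-y) if y > 0 else 0.0

# Nondecreasing: a < b implies F(a) <= F(b)
assert F(0.5) <= F(2.0)

# Limits: F approaches 0 far to the left and 1 far to the right
print(F(-100.0), round(F(100.0), 6))   # 0.0 1.0

# P(a < Y <= b) = F(b) - F(a)
a, b = 0.5, 2.0
print(round(F(b) - F(a), 4))           # 0.4712
```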

5
Continuous Random Variables
  • Definition A random variable Y is a continuous
    random variable if its distribution function F(y)
    is continuous for all y ∈ R.
  • Definition For a continuous random variable Y
    with distribution function F(y), the probability
    density function (pdf) of Y is defined by f(y) =
    (d/dy)F(y) = F′(y). Consequently,
    F(y) = ∫ f(t) dt, integrating from -∞ to y.

6
Properties of pdf
  • If f(y) is the density function of a continuous
    random variable Y, then
  • f(y) ≥ 0 for all y ∈ R
  • ∫ f(y) dy = 1, where the integral is over (-∞, ∞)
  • P(Y = a) = 0 for any real number a
  • If a < b, then P(a ≤ Y ≤ b) = ∫ f(y) dy,
    integrating from a to b
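These pdf properties can be verified by numerical integration. A sketch with a simple midpoint Riemann sum and an exponential density with mean 1 (the density, the truncation point 50, and the helper function are all illustrative choices):

```python
import math

def f(y):
    """Density of an exponential random variable with mean 1 (an example choice)."""
    return math.exp(-y) if y > 0 else 0.0

def integrate(func, lo, hi, n=100_000):
    """Midpoint Riemann sum approximating the integral of func on [lo, hi]."""
    h = (hi - lo) / n
    return sum(func(lo + (i + 0.5) * h) for i in range(n)) * h

# Total probability is 1 (truncate the upper limit; the tail beyond 50 is negligible)
total = integrate(f, 0.0, 50.0)
print(round(total, 6))   # 1.0

# P(a <= Y <= b) is the integral of the density from a to b
p = integrate(f, 0.5, 2.0)
print(round(p, 4))       # 0.4712
```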

7
Sec 4.3
  • Expected Value

8
Expected Value of Y
  • Definition Let Y be a continuous random
    variable with pdf f(y); then the expected value of
    Y, E(Y), is defined by E(Y) = ∫ y f(y) dy over
    (-∞, ∞), provided the integral converges
    absolutely.
  • The expected value of Y is denoted by μ.
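As a numeric check of the definition, the sketch below approximates E(Y) = ∫ y f(y) dy with a midpoint Riemann sum for an exponential density with mean β = 2 (the choice of density and the truncation point 80 are illustrative assumptions):

```python
import math

def f(y):
    """Exponential density with mean beta = 2 (an example choice)."""
    beta = 2.0
    return math.exp(-y / beta) / beta if y > 0 else 0.0

# E(Y) = integral of y * f(y) dy, approximated by a midpoint Riemann sum;
# the tail beyond 80 contributes a negligible amount for this density
n, lo, hi = 200_000, 0.0, 80.0
h = (hi - lo) / n
mean = sum((lo + (i + 0.5) * h) * f(lo + (i + 0.5) * h) for i in range(n)) * h
print(round(mean, 4))   # 2.0, matching E(Y) = beta for this density
```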

9
Functions of Random Variables
  • Theorem Let Y be a continuous random variable.
    Suppose that for some real-valued function g the
    random variable X = g(Y) is continuous. Then the
    expectation of X is given by E(X) = E[g(Y)] =
    ∫ g(y) f(y) dy, provided the integral
    converges absolutely.

10
Properties
  • Theorem Let c be a constant and let g1(Y),
    g2(Y), ..., gk(Y) be functions of a continuous
    random variable Y. Suppose the expectations of
    g1(Y), g2(Y), ..., gk(Y) all exist. Then,
  • E(c) = c
  • E[cg(Y)] = cE[g(Y)]
  • E(g1(Y) + g2(Y) + ... + gk(Y)) = E(g1(Y)) +
    E(g2(Y)) + ... + E(gk(Y))
  • V(Y) = E[(Y - μ)²] = E(Y²) - μ².
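The linearity properties and the variance shortcut can be illustrated with a quick Monte Carlo sketch (the U(0, 1) sample, its size, and the seed are illustrative choices; for U(0, 1), E(Y) = 1/2, E(Y²) = 1/3, and V(Y) = 1/12):

```python
import random

random.seed(0)                                   # fixed seed for reproducibility
ys = [random.random() for _ in range(200_000)]   # Y ~ U(0, 1), an example choice
n = len(ys)

m1 = sum(ys) / n                    # sample estimate of E(Y); true value 1/2
m2 = sum(y * y for y in ys) / n     # sample estimate of E(Y^2); true value 1/3

# E[c g(Y)] = c E[g(Y)]: scaling inside the average equals scaling outside it
lhs = sum(3 * y for y in ys) / n
print(abs(lhs - 3 * m1) < 1e-9)     # True

# V(Y) = E[(Y - mu)^2] = E(Y^2) - mu^2; true value 1/12, about 0.083
print(round(m2 - m1 * m1, 2))       # close to 0.08
```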

11
Sections 4.4-4.7
  • Uniform Probability Distribution, U(θ1, θ2)
  • Normal Probability Distribution, N(μ, σ²)
  • Gamma Probability Distribution, Gamma(α, β)
  • Chi-Square Probability Distribution, χ²(ν)
  • Exponential Probability Distribution, Exp(β)
  • Beta Probability Distribution, Beta(α, β)
  • Uniform Probability Distribution,
    U(0, 1) = Beta(1, 1)

12
The Uniform Distribution
  • The continuous random variable Y has a uniform
    distribution on the interval (θ1, θ2) if the
    density function of Y is f(y) = 1/(θ2 - θ1)
    for y ∈ (θ1, θ2), and 0 otherwise
  • E(Y) = (θ1 + θ2)/2
  • V(Y) = (θ2 - θ1)²/12
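The mean and variance formulas can be checked against direct numerical integration of the uniform density (the endpoints θ1 = 2, θ2 = 5 are an illustrative choice):

```python
# Closed-form uniform moments from the slide, checked against midpoint Riemann sums
t1, t2 = 2.0, 5.0                      # example endpoints theta1 < theta2

mean_formula = (t1 + t2) / 2           # (theta1 + theta2)/2
var_formula = (t2 - t1) ** 2 / 12      # (theta2 - theta1)^2/12

n = 100_000
h = (t2 - t1) / n
density = 1.0 / (t2 - t1)              # f(y) = 1/(theta2 - theta1) on (theta1, theta2)
mean_num = sum((t1 + (i + 0.5) * h) * density for i in range(n)) * h
var_num = sum((t1 + (i + 0.5) * h - mean_num) ** 2 * density for i in range(n)) * h

print(mean_formula, round(mean_num, 6))   # 3.5 3.5
print(var_formula, round(var_num, 6))     # 0.75 0.75
```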

13
The Normal Distribution
  • The continuous random variable Y has a normal
    distribution if the density function of Y is
    f(y) = (1/(σ√(2π))) e^(-(y-μ)²/(2σ²))
    for all y ∈ (-∞, ∞)
  • E(Y) = μ
  • V(Y) = σ²

14
Calculation of Normal Probability Density
Function (pdf)
  • Standard Normal pdf, denoted by N(0, 1)
  • Z = (Y - μ)/σ
  • Find scores, given probability
  • Find proportion/probability, given the scores
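Both lookup directions can be done without a table: the standard normal cdf is expressible through the error function, and its inverse can be found by bisection. A sketch (the N(100, 15²) example and the bisection helper are illustrative choices):

```python
import math

def phi_cdf(z):
    """Standard normal CDF via the error function: P(Z <= z)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def phi_inv(p, lo=-10.0, hi=10.0):
    """Inverse standard normal CDF by bisection: finds z with P(Z <= z) = p."""
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if phi_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Find a probability given a score: Y ~ N(mu=100, sigma=15), P(Y <= 130)
mu, sigma = 100.0, 15.0
z = (130.0 - mu) / sigma              # z-score: Z = (Y - mu)/sigma
print(round(phi_cdf(z), 4))           # z = 2, so 0.9772

# Find a score given a probability: the 97.5th percentile of Y
z975 = phi_inv(0.975)                 # about 1.96
print(round(mu + sigma * z975, 2))    # 129.4
```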

15
Gamma Distribution
  • A random variable Y is said to have a gamma
    distribution with parameters α > 0 and β > 0 if and
    only if the density function of Y is
    f(y) = y^(α-1) e^(-y/β) / (β^α Γ(α))
    for y > 0, and 0 otherwise.

16
Mean and Variance of Gamma Probability
Distribution
  • Mean ?EY??
  • Variance ?2VY??2
  • MGF my(t)(1-?t)-?
  • Special cases
  • Chi-square distribution with ? degrees of freedom
    if ? ? /2 and ?2
  • Exponential distribution when ? 1
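The gamma mean αβ and variance αβ² can be confirmed by integrating the density numerically (the parameters α = 3, β = 2, the truncation point 100, and the Riemann-sum approach are illustrative choices):

```python
import math

def gamma_pdf(y, alpha, beta):
    """Gamma(alpha, beta) density: y^(alpha-1) e^(-y/beta) / (beta^alpha Gamma(alpha))."""
    if y <= 0:
        return 0.0
    return y ** (alpha - 1) * math.exp(-y / beta) / (beta ** alpha * math.gamma(alpha))

alpha, beta = 3.0, 2.0                 # example parameters

# Mean alpha*beta and variance alpha*beta^2, checked by midpoint Riemann sums;
# the tail beyond 100 is negligible for these parameters
n, hi = 200_000, 100.0
h = hi / n
mean = sum((i + 0.5) * h * gamma_pdf((i + 0.5) * h, alpha, beta) for i in range(n)) * h
m2 = sum(((i + 0.5) * h) ** 2 * gamma_pdf((i + 0.5) * h, alpha, beta) for i in range(n)) * h

print(round(mean, 3), alpha * beta)                  # 6.0 6.0
print(round(m2 - mean ** 2, 3), alpha * beta ** 2)   # 12.0 12.0
```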

17
Chi-square Probability Distribution
  • A random variable Y is said to have a chi-square
    distribution with ν degrees of freedom if and
    only if the density function of Y is
    f(y) = y^(ν/2 - 1) e^(-y/2) / (2^(ν/2) Γ(ν/2))
    for y > 0, and 0 otherwise.

Mean μ = E(Y) = ν, Variance σ² = V(Y) = 2ν, MGF
m_Y(t) = (1 - 2t)^(-ν/2)
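Substituting the gamma special case α = ν/2, β = 2 from the previous slide recovers these moments directly; a tiny arithmetic sketch (ν = 7 is an illustrative choice):

```python
# Chi-square with nu degrees of freedom as the Gamma special case alpha = nu/2, beta = 2
nu = 7.0
alpha, beta = nu / 2.0, 2.0

mean = alpha * beta        # gamma mean alpha*beta reduces to nu
var = alpha * beta ** 2    # gamma variance alpha*beta^2 reduces to 2*nu
# likewise the gamma MGF (1 - beta*t)^(-alpha) reduces to (1 - 2t)^(-nu/2)

print(mean == nu, var == 2 * nu)   # True True
```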
18
Exponential Probability Distribution
  • A random variable Y is said to have an
    exponential distribution with parameter β > 0 if
    and only if the density function of Y is
    f(y) = (1/β) e^(-y/β) for y > 0, and 0 otherwise.

Mean μ = E(Y) = β, Variance σ² = V(Y) = β², MGF
m_Y(t) = (1 - βt)^(-1), Memoryless property
P(Y > a + b | Y > a) = P(Y > b)
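The memoryless property follows from the survival function P(Y > y) = e^(-y/β), since P(Y > a+b | Y > a) = P(Y > a+b)/P(Y > a). A direct check (β = 3, a = 2, b = 5 are illustrative choices):

```python
import math

beta = 3.0                                   # example mean of the exponential

def survival(y):
    """P(Y > y) for an exponential with mean beta: e^(-y/beta)."""
    return math.exp(-y / beta)

a, b = 2.0, 5.0
# Memoryless property: P(Y > a + b | Y > a) = P(Y > a + b) / P(Y > a) = P(Y > b)
conditional = survival(a + b) / survival(a)
print(abs(conditional - survival(b)) < 1e-12)   # True
```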
19
Homework
  • p. 173-175: 4.46, 4.47, 4.54, 4.57, 4.63;
    p. 180: 4.68, 4.71, 4.72, 4.76

20
Beta Probability Distribution
  • A random variable Y is said to have a beta
    probability distribution with parameters α > 0 and
    β > 0 if and only if the density function of Y is
    f(y) = y^(α-1) (1 - y)^(β-1) / B(α, β)
    for 0 ≤ y ≤ 1, and 0 otherwise, where
    B(α, β) = Γ(α)Γ(β)/Γ(α + β).

21
Mean and Variance of Beta Y
  • Mean μ = E(Y) = α/(α + β)
  • Variance σ² = V(Y) = αβ/((α + β)²(α + β + 1))
  • MGF no closed form
  • Special case α = β = 1: Y has the uniform
    distribution on (0, 1).
  • That is, f(y) = 1 for 0 ≤ y ≤ 1, and
    0 elsewhere
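The beta mean and variance formulas can be checked against numerical integration on (0, 1) (the parameters α = 2, β = 5 and the Riemann-sum approach are illustrative choices; for them the mean is 2/7 and the variance 10/392):

```python
import math

def beta_pdf(y, a, b):
    """Beta(a, b) density: y^(a-1) (1-y)^(b-1) * Gamma(a+b) / (Gamma(a) Gamma(b))."""
    if not 0.0 < y < 1.0:
        return 0.0
    c = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return c * y ** (a - 1) * (1.0 - y) ** (b - 1)

a, b = 2.0, 5.0                      # example parameters

# Mean a/(a+b) and variance ab/((a+b)^2 (a+b+1)), checked by midpoint Riemann sums
n = 200_000
h = 1.0 / n
mean = sum((i + 0.5) * h * beta_pdf((i + 0.5) * h, a, b) for i in range(n)) * h
m2 = sum(((i + 0.5) * h) ** 2 * beta_pdf((i + 0.5) * h, a, b) for i in range(n)) * h

print(round(mean, 4), round(a / (a + b), 4))   # 0.2857 0.2857
print(round(m2 - mean ** 2, 4),
      round(a * b / ((a + b) ** 2 * (a + b + 1)), 4))   # 0.0255 0.0255
```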

22
Section 4.9
  • Moments
  • Moment-Generating Functions of Random Variables

23
Moment
  • Moment If Y is a continuous random variable,
    then the kth moment about the origin is given by
  • μ′_k = E(Y^k), k = 1, 2, ...
  • The kth moment about the mean, or the kth central
    moment, is given by
  • μ_k = E[(Y - μ)^k], k = 1, 2, ...
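For a concrete case, the moments about the origin of an exponential random variable with mean 1 are E(Y^k) = k!, which a numerical integral reproduces (the choice of distribution, the truncation point 60, and the helper are illustrative assumptions):

```python
import math

def f(y):
    """Exponential density with mean 1 (an example choice)."""
    return math.exp(-y) if y > 0 else 0.0

def kth_moment(k, n=200_000, hi=60.0):
    """Midpoint Riemann sum for E(Y^k) = integral of y^k f(y) dy."""
    h = hi / n
    return sum(((i + 0.5) * h) ** k * f((i + 0.5) * h) for i in range(n)) * h

# For this density E(Y^k) = k!, so the numeric moments match the factorials
for k in (1, 2, 3):
    print(k, round(kth_moment(k), 3), math.factorial(k))
```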

24
Moment Generating Function (MGF)
  • MGF If Y is a continuous random variable, then
    the moment-generating function of Y is given by
  • m_Y(t) = E(e^(tY))
  • The moment-generating function is said to exist
    if there exists a constant b > 0 such that m_Y(t) is
    finite for |t| ≤ b.
  • The kth moment about the origin is the kth
    derivative of m_Y(t) evaluated at t = 0:
    μ′_k = d^k m_Y(t)/dt^k at t = 0.
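Differentiating the MGF at t = 0 really does produce the moments; a sketch using finite differences on the exponential MGF m(t) = (1 - t)^(-1), whose moments are E(Y) = 1 and E(Y²) = 2 (the distribution and step size are illustrative choices):

```python
# MGF of an exponential with mean 1 (an example): m(t) = (1 - t)^(-1), valid for t < 1
def m(t):
    return 1.0 / (1.0 - t)

h = 1e-4
# First moment: first derivative of the MGF at t = 0 (central difference)
mu1 = (m(h) - m(-h)) / (2 * h)
# Second moment about the origin: second derivative at t = 0
mu2 = (m(h) - 2 * m(0.0) + m(-h)) / (h * h)

print(round(mu1, 4), round(mu2, 4))  # close to 1 and 2, the exact E(Y) and E(Y^2)
```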

25
MGF of Function
  • Theorem Let Y be a random variable with density
    function f(y) and let g(Y) be a function of Y. Then
    the moment-generating function for g(Y) is
    m_g(Y)(t) = E(e^(t g(Y))) = ∫ e^(t g(y)) f(y) dy,
    where the integral is over (-∞, ∞).
26
Homework
  • p. 185: 4.91, 4.92; p. 193-194: 4.104, 4.108, 4.113

27
Section 4.10
  • Tchebysheff's Theorem
  • (Chebyshev's Inequality)
  • Let Y be any random variable with finite mean
    μ and variance σ². Then, for any k > 0,
    P(|Y - μ| < kσ) ≥ 1 - (1/k²), or equivalently
    P(|Y - μ| ≥ kσ) ≤ 1/k²
  • Compare it to the 68-95-99.7 rule for the normal
    distribution
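The comparison in the last bullet can be tabulated directly: Chebyshev's bound holds for every distribution, so for a normal it is much looser than the exact 68-95-99.7 probabilities (computing the normal probability via the error function is an implementation choice):

```python
import math

def normal_within(k):
    """P(|Y - mu| < k*sigma) for a normal Y, via the standard normal CDF."""
    return math.erf(k / math.sqrt(2.0))

# Chebyshev's lower bound 1 - 1/k^2 versus the exact normal probability;
# the exact column reproduces the 68-95-99.7 rule
for k in (1, 2, 3):
    bound = 1.0 - 1.0 / k ** 2
    exact = normal_within(k)
    print(k, round(bound, 3), round(exact, 3))
```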