Title: Discrete Math CS 2800


1
Discrete Math, CS 2800
  • Prof. Bart Selman
  • selman@cs.cornell.edu
  • Module
  • Probability --- Part d)

1) Probability Distributions
2) Markov and Chebyshev Bounds
2
Discrete Random Variable
  • Discrete random variable
  • Takes on one of a finite (or at least countable)
    number of different values.
  • X = 1 if heads, 0 if tails
  • Y = 1 if male, 0 if female (phone survey)
  • Z = # of spots on face of thrown die

3
Continuous Random Variable
  • Continuous random variable (r.v.)
  • Takes on one in an infinite range of different
    values.
  • W = % GDP grows (shrinks?) this year
  • V = hours until light bulb fails
  • For a discrete r.v., we have Prob(X = x), i.e., the
    probability that r.v. X takes on a given value x.
  • What is the probability that a continuous r.v.
    takes on a specific value? E.g.,
    Prob(X_light_bulb_fails = 3.14159265 hrs)?
    Answer: 0.
  • However, ranges of values can have non-zero
    probability.
  • E.g., Prob(3 hrs < X_light_bulb_fails < 4 hrs) = 0.1
  • Ranges of values have a probability.
4
Probability Distribution
  • The probability distribution is a complete
    probabilistic description of a random variable.
  • All other statistical concepts (expectation,
    variance, etc.) are derived from it.
  • Once we know the probability distribution of a
    random variable, we know everything we can learn
    about it from statistics.

5
Probability Distribution
  • Probability function
  • One form in which the probability distribution of
    a discrete random variable may be expressed.
  • Expresses the probability that X takes the value
    x as a function of x (as we saw before):
    p(x) = P(X = x)

6
Probability Distribution
  • The probability function
  • May be tabular

7
Probability Distribution
  • The probability function
  • May be graphical

[Bar chart: p(x) over x = 1, 2, 3; probability ticks .17, .33, .50]
8
Probability Distribution
  • The probability function
  • May be formulaic

9
Probability Distribution: Fair die (p(x) = 1/6 for x = 1, ..., 6)
10
Probability Distribution
  • The probability function, properties:
  • p(x) ≥ 0 for all x
  • Σx p(x) = 1
11
Cumulative Probability Distribution
  • Cumulative probability distribution (cdf)
  • The cdf is a function which describes the
    probability that a random variable does not
    exceed a value:
    F(x) = P(X ≤ x)

Does this make sense for a continuous r.v.? Yes!
12
Cumulative Probability Distribution
  • Cumulative probability distribution
  • The relationship between the cdf and the
    probability function:
    F(x) = Σ{t ≤ x} p(t)
13
Cumulative Probability Distribution
  • Die-throwing

[Shown in tabular and graphical form]
14
Cumulative Probability Distribution
  • The cumulative distribution function
  • May be formulaic (die-throwing):
    F(x) = 0 for x < 1; ⌊x⌋/6 for 1 ≤ x < 6; 1 for x ≥ 6
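As a sketch of how the tabular and formulaic forms fit together (the helper names below are mine, not from the slides), the die's probability function and cdf can be written in a few lines of Python:

```python
from fractions import Fraction

# A minimal sketch: the probability function and cdf of a fair die.
def die_pmf(x):
    """p(x) = 1/6 for x in {1, ..., 6}, else 0."""
    return Fraction(1, 6) if x in range(1, 7) else Fraction(0)

def die_cdf(x):
    """F(x) = P(X <= x): sum the probability function up to x."""
    return sum(die_pmf(t) for t in range(1, int(x) + 1))

print(die_cdf(3))  # F(3) = 3/6 = 1/2
```

Using `Fraction` keeps the probabilities exact instead of introducing floating-point rounding.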

15
Cumulative Probability Distribution
  • The cdf, properties:
  • F is nondecreasing in x
  • F(x) → 0 as x → -∞, and F(x) → 1 as x → ∞
16
Example CDFs
  • Of a discrete probability distribution
  • Of a continuous probability distribution
  • Of a distribution which has both a continuous
    part and a discrete part.
17
Functions of a random variable
  • It is possible to calculate expectations and
    variances of functions of random variables:
    E[g(X)] = Σx g(x) p(x)
18
Functions of a random variable
  • Example
  • You are paid a number of dollars equal to the
    square root of the number of spots on a die.
  • What is a fair bet to get into this game?
  • A fair entry fee is E[√X] = (1/6) Σx √x ≈ 1.81 dollars.
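The fair bet can be checked numerically; this sketch simply averages √x over the six equally likely faces:

```python
import math

# Expected payoff of the sqrt-of-spots game:
# E[sqrt(X)] = (1/6) * (sqrt(1) + sqrt(2) + ... + sqrt(6))
fair_fee = sum(math.sqrt(x) for x in range(1, 7)) / 6
print(round(fair_fee, 3))  # 1.805
```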

 
19
Functions of a random variable
 
  • Linear functions
  • If a and b are constants and X is a random
    variable, it can be shown that
    E[aX + b] = a·E[X] + b
    V(aX + b) = a²·V(X)

Intuitively, why does b not appear in the
variance? And why a²?
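Both identities can be verified numerically on a fair die; the constants a = 2 and b = 5 below are arbitrary choices of mine:

```python
# Check E[aX + b] = a*E[X] + b and V(aX + b) = a^2 * V(X) on a fair die.
xs = range(1, 7)

def expect(f):
    """E[f(X)] for a fair six-sided die."""
    return sum(f(x) for x in xs) / 6

a, b = 2, 5
EX = expect(lambda x: x)                    # 3.5
VX = expect(lambda x: (x - EX) ** 2)        # 35/12
EY = expect(lambda x: a * x + b)
VY = expect(lambda x: (a * x + b - EY) ** 2)
print(EY, a * EX + b)    # both 12.0
print(VY, a ** 2 * VX)   # both ~11.667
```

Shifting by b moves every outcome by the same amount, so the spread around the mean is unchanged; scaling by a stretches deviations, and squared deviations pick up a².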
 
 
20
  • The Most Common
  • Discrete Probability Distributions
  • (some discussed before)

1) Bernoulli distribution
2) Binomial
3) Geometric
4) Poisson
21
Bernoulli distribution
  • The Bernoulli distribution is the coin flip
    distribution.
  • X is Bernoulli if its probability function is
    P(X = 1) = p, P(X = 0) = 1 - p

X = 1 is usually interpreted as a success.
E.g., X = 1 for heads in a coin toss; X = 1 for male in
a survey; X = 1 for defective in a test of product;
X = 1 for "made the sale" when tracking performance.
22
Bernoulli distribution
  • Expectation: E[X] = p
  • Variance: V(X) = p(1 - p)

23
Binomial distribution
  • The binomial distribution is just n independent
    Bernoullis added up.
  • It is the number of successes in n trials.
  • If Z1, Z2, ..., Zn are Bernoulli, then
    X = Z1 + Z2 + ... + Zn is binomial.

24
Binomial distribution
  • The binomial distribution is just n independent
    Bernoullis added up. Testing for defects with
    replacement:
  • Have many light bulbs.
  • Pick one at random, test for defect, put it back.
  • Pick one at random, test for defect, put it back. ...
  • If there are many light bulbs, you do not have to
    replace.

25
Binomial distribution
  • Let's figure out a binomial r.v.'s probability
    function.
  • Suppose we are looking at a binomial with n = 3.
  • We want P(X = 0)
  • Can happen one way: 000
  • (1-p)(1-p)(1-p) = (1-p)³
  • We want P(X = 1)
  • Can happen three ways: 100, 010, 001
  • p(1-p)(1-p) + (1-p)p(1-p) + (1-p)(1-p)p = 3p(1-p)²
  • We want P(X = 2)
  • Can happen three ways: 110, 011, 101
  • pp(1-p) + (1-p)pp + p(1-p)p = 3p²(1-p)
  • We want P(X = 3)
  • Can happen one way: 111
  • ppp = p³

26
Binomial distribution
  • So, the binomial r.v.'s probability function is
    P(X = k) = C(n, k) p^k (1-p)^(n-k), for k = 0, 1, ..., n
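The formula can be cross-checked against the brute-force enumeration argument above; this sketch (with an arbitrary p = 0.3) compares the two for n = 3:

```python
from math import comb
from itertools import product

def binom_pmf(n, k, p):
    """P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Brute force: enumerate all 2^3 outcome strings (000, 100, ..., 111)
# and add up the probability of those with exactly k successes.
n, p = 3, 0.3
for k in range(n + 1):
    brute = 0.0
    for seq in product((0, 1), repeat=n):
        if sum(seq) == k:
            pr = 1.0
            for z in seq:
                pr *= p if z else 1 - p
            brute += pr
    assert abs(brute - binom_pmf(n, k, p)) < 1e-12
```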

27
Binomial distribution
  • Typical shape of binomial
  • Symmetric

28
  • Expectation: E[X] = np
  • Variance: V(X) = np(1 - p)
29
Binomial distribution
  • A salesman claims that he closes the deal 40% of
    the time.
  • This month, he closed 1 out of 10 deals.
  • How likely is it that he did 1/10 or worse, given
    his claim?

30
Binomial distribution
Less than 5%, or 1 in 20. So, it's unlikely that
his success rate is 0.4.
Note: P(X ≤ 1) = P(X = 0) + P(X = 1) ≈ 0.046.
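The computation behind this conclusion, sketched in Python (summing the binomial pmf for k = 0 and k = 1):

```python
from math import comb

# P(X <= 1) for X ~ Binomial(10, 0.4): the chance of closing 1 or fewer
# deals out of 10 if the claimed 40% success rate were true.
n, p = 10, 0.4
prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(2))
print(round(prob, 4))  # 0.0464 -- less than 5%, i.e., about 1 in 20
```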
31
Binomial and normal / Gaussian distribution
The normal distribution is a good
approximation to the binomial distribution B(n, p)
(large n, small skew).
[Figure: binomial B(n, p) overlaid with the normal prob. density function]
32
Geometric Distribution
  • A geometric distribution is usually interpreted
    as number of time periods until a failure occurs.
  • Imagine a sequence of coin flips, and the random
    variable X is the flip number on which the first
    tails occurs.
  • The probability of a head (a success) is p.

33
Geometric
  • Let's find the probability function for the
    geometric distribution:
    P(X = 1) = 1 - p
    P(X = 2) = p(1 - p)
    P(X = 3) = p²(1 - p)
    etc.
So,
    P(X = x) = p^(x-1)(1 - p)    (x is a positive integer)
34
Geometric
  • Notice, there is no upper limit on how large X
    can be.
  • Let's check that these probabilities add to 1:
    Σ{x ≥ 1} p^(x-1)(1 - p) = (1 - p) · 1/(1 - p) = 1

Geometric series: Σ{x ≥ 0} p^x = 1/(1 - p) for 0 ≤ p < 1.
35
Geometric
  • Expectation: E[X] = 1/(1 - p)
    (differentiate both sides of the geometric series
    w.r.t. p; see Rosen page 158, example 17)
  • Variance: V(X) = p/(1 - p)²
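These results can be sanity-checked numerically by truncating the infinite sums; 500 terms is far more than enough for the arbitrary choice p = 0.5 below:

```python
# Check that P(X = x) = p^(x-1) * (1 - p) sums to 1 and that
# E[X] = 1/(1 - p), truncating the infinite series at 500 terms.
p = 0.5
pmf = [p**(x - 1) * (1 - p) for x in range(1, 501)]
total = sum(pmf)
EX = sum(x * p**(x - 1) * (1 - p) for x in range(1, 501))
print(round(total, 6), round(EX, 6))  # 1.0 2.0
```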
36
Poisson distribution
  • The Poisson distribution is typical of random
    variables which represent counts.
  • Number of murders in Ithaca next year.
  • Number of requests to a server in 1 hour.
  • Number of sick days in a year for an employee.

37
  • The Poisson distribution is derived from the
    following underlying arrival time model
  • The probability of a unit arriving is uniform
    through time.
  • Two items never arrive at exactly the same time.
  • Arrivals are independent --- the arrival of one
    unit does not make the next unit more or less
    likely to arrive quickly.

38
Poisson distribution
  • The probability function for the Poisson
    distribution with parameter λ is
    P(X = k) = λ^k e^(-λ) / k!,  k = 0, 1, 2, ...
  • λ is like the arrival rate --- higher means
    more/faster arrivals.
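A small sketch of this probability function (the rate λ = 3 below is an arbitrary illustration): the probabilities sum to 1, and the mean comes out equal to the rate parameter.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) = lam^k * e^(-lam) / k!  (lam is the rate parameter)."""
    return lam**k * exp(-lam) / factorial(k)

lam = 3.0
total = sum(poisson_pmf(k, lam) for k in range(100))
mean = sum(k * poisson_pmf(k, lam) for k in range(100))
print(round(total, 6), round(mean, 6))  # 1.0 3.0
```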

39
Poisson distribution
  • Shape

Low λ
Med λ
High λ
40
  • Markov and Chebyshev bounds

41
  • Often, you don't know the exact probability
    distribution of a random variable.
  • We still would like to say something about the
    probabilities involving that random variable,
  • e.g., what is the probability of X being larger
    (or smaller) than some given value?
  • We often can, by bounding the probability of
    events based on partial information about the
    underlying probability distribution:
  • Markov and Chebyshev bounds.

42
Theorem: Markov Inequality
Note: relates the cumulative distribution to the expected
value.
Let X be a nonnegative random variable with E[X] =
μ. Then, for any t > 0,
    P(X ≥ t) ≤ μ / t
Hmm. What if t ≤ μ? Then μ/t ≥ 1, and the bound says
nothing. But for t > μ it does: you can't have too much
probability to the right of E[X].
43
Proof:
    E[X] = Σx x·P(X = x)
         ≥ Σ{x ≥ t} x·P(X = x)
         ≥ Σ{x ≥ t} t·P(X = x)
         = t·P(X ≥ t)
I.e., P(X ≥ t) ≤ E[X]/t.
Where did we use X ≥ 0? In dropping the terms with
x < t: that step requires them to be nonnegative.
44
Alt. proof of the Markov Inequality:
Define Y = t if X ≥ t, and Y = 0 otherwise. Then Y ≤ X,
so t·P(X ≥ t) = E[Y] ≤ E[X].
45
  • Example
  • Consider a system with mean time to failure 100
    hours.
  • Use the Markov inequality to bound the
    reliability of the system,
  • R(t), for t = 90, 100, 110, 200.

X = time to failure of the system; E[X] = 100.
R(t) = P(X > t), with t = 90, 100, 110, 200.
By Markov: R(t) ≤ E[X]/t = 100/t.
The Markov inequality is somewhat crude, since only
the mean is assumed to be known.
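The bounds above can be tabulated directly (capping at 1, since a probability can never exceed 1):

```python
# Markov bound on reliability: R(t) = P(X >= t) <= E[X]/t, with E[X] = 100.
EX = 100
bounds = {t: min(1.0, EX / t) for t in (90, 100, 110, 200)}
for t, b in bounds.items():
    print(t, round(b, 3))  # 90 and 100 give the trivial bound 1.0
```

Note how the bound is vacuous for t ≤ E[X] and only becomes informative for t = 110 (≈ 0.909) and t = 200 (0.5).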
46
Theorem: Chebyshev's Inequality
  • Assume that mean and variance are given.
  • We can obtain a better estimate of the
    probability of events of interest by using
    Chebyshev's inequality:
    P(|X - μ| ≥ t) ≤ σ² / t², for any t > 0

47
Theorem: Chebyshev's Inequality
Proof: apply the Markov inequality to the nonnegative
r.v. (X - μ)²:
    P(|X - μ| ≥ t) = P((X - μ)² ≥ t²) ≤ E[(X - μ)²]/t² = σ²/t²
48
Chebyshev inequality: Alternate forms
  • Yet two other forms of Chebyshev's inequality:
    P(|X - μ| ≥ kσ) ≤ 1/k²
    P(|X - μ| < kσ) ≥ 1 - 1/k²

Says something about the probability of being k
standard deviations from the mean.
49
Theorem: Chebyshev's Inequality
50
Theorem: Chebyshev's Inequality
Facts:
1 - 1/4 = .75;  1 - 1/9 ≈ .889;  1 - 1/16 ≈ .938
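These facts are just the lower bound 1 - 1/k² on P(|X - μ| < kσ) evaluated at k = 2, 3, 4:

```python
# Chebyshev: at least 1 - 1/k^2 of the probability mass lies within
# k standard deviations of the mean, for ANY distribution.
facts = {k: 1 - 1 / k**2 for k in (2, 3, 4)}
for k, v in facts.items():
    print(k, round(v, 4))  # 0.75, 0.8889, 0.9375
```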
51