010.141 Engineering Mathematics II, Lecture 4: Continuous Distributions

1
010.141 Engineering Mathematics II
Lecture 4: Continuous Distributions
  • Bob McKay
  • School of Computer Science and Engineering
  • College of Engineering
  • Seoul National University
  • Partly based on
  • Sheldon Ross, A First Course in Probability

2
Outline
  • Continuous Random Variables
  • Continuous Distributions
  • Uniform
  • Normal
  • Exponential
  • Other Continuous Distributions
  • Gamma
  • Weibull
  • Cauchy
  • Beta
  • Functions of a Random Variable

3
Continuous Random Variables
  • Discrete random variables cover many important
    problems, but there are also many problems which
    cannot be conveniently represented in this way
  • Recall that a random variable over a probability
    space (Ω, F, P) is a measurable function X : Ω → S
    for some set S
  • If S is a continuous space - for example R, the
    real numbers - we call X a continuous random
    variable
  • In the case of R, this is equivalent to an
    alternative definition, that there exists a
    non-negative function f defined over R, such that
    for any measurable B ⊆ R,
  • P{X ∈ B} = ∫_B f(x) dx
  • f is known as the probability density function

4
Properties ofContinuous Random Variables
  • Of course, for the case B = R
  • ∫_R f(x) dx = P{X ∈ R} = 1
  • For intervals of R, we have the equivalent
  • P{X ∈ [a,b]} = ∫_a^b f(x) dx
  • So P{X ∈ [a,a]} = ∫_a^a f(x) dx = 0
  • That is, the probability of any specific value is
    zero
  • P{X < a} = ∫_-∞^a f(x) dx
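As a quick numerical illustration of these properties (a minimal Python sketch, not from the slides; it assumes the hypothetical density f(x) = e^(-x) for x ≥ 0 purely as an example), the integral of f over R is 1, a single point has probability 0, and interval probabilities come from integrating f:

  import numpy as np

  # Hypothetical example density: f(x) = exp(-x) for x >= 0, 0 otherwise
  def f(x):
      return np.where(x >= 0, np.exp(-x), 0.0)

  def prob(a, b, n=200_000):
      # P{X in [a,b]} = integral of f over [a,b] (trapezoidal rule)
      xs = np.linspace(a, b, n)
      return np.trapz(f(xs), xs)

  print(prob(0, 50))      # ~1: effectively the integral over all of R
  print(prob(2.0, 2.0))   # 0: a single value has probability zero
  print(prob(1.0, 3.0))   # P{X in [1,3]} = e^-1 - e^-3, about 0.318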

5
Continuous Example (Ross)
  • The amount of time, in hours, that a computer
    functions before breaking down is a continuous
    random variable with probability density function
    given by
  • f(x) = λe^(-x/100)   x ≥ 0
  •        0             x < 0
  • What is the probability that a computer will
    function between 50 and 150 hours before breaking
    down?

6
Continuous Example
  • We have
  • ∫_-∞^∞ f(x) dx = λ ∫_0^∞ e^(-x/100) dx = 1
  • From which
  • λ = 1/100
  • So
  • P{50 ≤ X ≤ 150} = ∫_50^150 (1/100) e^(-x/100) dx
    = e^(-1/2) - e^(-3/2) ≈ 0.383 (checked numerically
    below)
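To check this worked example (a sketch assuming only the density above, with λ = 1/100, i.e. mean lifetime 100 hours), compare the closed form with a Monte Carlo estimate:

  import numpy as np

  closed_form = np.exp(-0.5) - np.exp(-1.5)

  rng = np.random.default_rng(0)
  lifetimes = rng.exponential(scale=100.0, size=1_000_000)   # scale = 1/lambda = 100
  monte_carlo = np.mean((lifetimes >= 50) & (lifetimes <= 150))

  print(closed_form)    # ~0.3834
  print(monte_carlo)    # should be close to the closed form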

7
Uniform Distribution
  • The probability density function
  • f(x) = 1 for 0 < x < 1, 0 otherwise
  • generates a uniform distribution over the
    interval (0,1)
  • That is, any value between 0 and 1 is equally
    likely
  • This may be generalised to an arbitrary interval
    (a,b)
  • f(x) = 1/(b-a) for a < x < b, 0 otherwise
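A minimal sketch of the generalised uniform density (the endpoints a = 2, b = 5 are hypothetical, chosen only for illustration):

  import numpy as np

  a, b = 2.0, 5.0   # hypothetical interval endpoints

  def uniform_pdf(x):
      return np.where((x > a) & (x < b), 1.0 / (b - a), 0.0)

  rng = np.random.default_rng(1)
  x = rng.uniform(a, b, size=500_000)
  # Equal-length subintervals are equally likely:
  print(np.mean((x > 2.0) & (x < 3.0)))   # ~1/3
  print(np.mean((x > 4.0) & (x < 5.0)))   # ~1/3
  print(uniform_pdf(np.array([1.0, 3.0, 6.0])))   # [0, 1/3, 0]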

8
Normal Distribution
  • The probability density function
  • f(x) = (1/(σ√(2π))) e^(-(x-μ)²/(2σ²))
  • generates a normal distribution with mean μ and
    standard deviation σ
  • If X is normally distributed with mean μ and
    standard deviation σ, then Y = αX + β is normally
    distributed with mean αμ + β and standard deviation
    |α|σ
  • In particular, setting Z = (X - μ)/σ generates
    the unit normal distribution
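A simulation sketch of the linear-transformation fact (the values μ = 10, σ = 2, α = 3, β = 1 are hypothetical, for illustration only):

  import numpy as np

  mu, sigma = 10.0, 2.0
  alpha, beta = 3.0, 1.0

  rng = np.random.default_rng(2)
  x = rng.normal(loc=mu, scale=sigma, size=1_000_000)

  y = alpha * x + beta     # should be normal with mean 31 and std 6
  z = (x - mu) / sigma     # should be approximately unit normal

  print(y.mean(), y.std())   # ~31, ~6
  print(z.mean(), z.std())   # ~0, ~1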

9
Properties of the Normal Distribution
  • The cumulative distribution function
  • Φ(x) = (1/√(2π)) ∫_-∞^x e^(-y²/2) dy
  • Satisfies the symmetry condition
  • Φ(-x) = 1 - Φ(x)
  • The distribution function for a normally
    distributed random variable with mean μ and
    standard deviation σ can be written
  • F_X(a) = Φ((a - μ) / σ)
  • Φ(x) ≈ 1 - (1/(x√(2π))) e^(-x²/2) for large x
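The symmetry relation and the large-x approximation are easy to verify numerically. A sketch using the standard identity Φ(x) = (1 + erf(x/√2))/2 (standard library knowledge, not something stated on the slide):

  import math

  def Phi(x):
      # standard normal CDF via the error function
      return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

  # Symmetry: Phi(-x) = 1 - Phi(x)
  print(Phi(-1.3), 1.0 - Phi(1.3))

  # Tail approximation for large x
  x = 4.0
  approx = 1.0 - math.exp(-x * x / 2.0) / (x * math.sqrt(2.0 * math.pi))
  print(Phi(x), approx)   # ~0.999968 vs ~0.999967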

10
Normal Distribution Example (Ross)
  • An instructor gives an A to those whose test
    score is greater than μ+σ, B for between μ and
    μ+σ, C for μ-σ to μ, D for μ-2σ to μ-σ, and F for
    less than μ-2σ
  • What is the probability of an A? Of an F?
  • Probability of A = 1 - Φ(1) ≈ 0.1587
  • Probability of F = Φ(-2) ≈ 0.0228
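A two-line check of these grade probabilities (reusing the erf-based Φ sketch from above, repeated so the snippet stands alone; the scores are assumed normal with mean μ and standard deviation σ, as in the example):

  import math

  def Phi(x):
      return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

  print(1.0 - Phi(1.0))   # P(A) = P{Z > 1}   ~0.1587
  print(Phi(-2.0))        # P(F) = P{Z < -2}  ~0.0228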

11
Normal and Binomial Distributions
  • If Sn denotes the number of successes that occur
    when n independent trials, each resulting in a
    success with probability p, are performed then
    for any a < b
  • P{a ≤ (Sn - np) / √(np(1-p)) ≤ b} → Φ(b) - Φ(a)
  • as n → ∞
  • Thus we have two good approximations to the
    binomial distribution - the Poisson approximation
    for when np is moderate, and the Normal for when
    np(1-p) is large
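A sketch comparing the exact binomial probability with this normal approximation (the values n = 100, p = 0.3, a = -1, b = 1 are hypothetical, chosen only for illustration):

  import math

  def Phi(x):
      return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

  n, p = 100, 0.3
  a, b = -1.0, 1.0
  mean, sd = n * p, math.sqrt(n * p * (1 - p))

  # Exact: sum the binomial pmf over all k with a <= (k - np)/sd <= b
  lo, hi = math.ceil(mean + a * sd), math.floor(mean + b * sd)
  exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
              for k in range(lo, hi + 1))

  print(exact)             # exact binomial probability
  print(Phi(b) - Phi(a))   # normal approximation, ~0.6827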

12
Normal / Binomial Example
  • To determine the effectiveness of a certain diet
    in reducing the amount of cholesterol in the
    blood stream, 100 people are put on the diet.
    After they have been on the diet for a sufficient
    time, their cholesterol count will be taken. The
    nutritionist running the experiment has decided
    to endorse the diet if at least 65 of the people
    have a lower cholesterol count after the diet.
    What is the probability that the nutritionist
    endorses the new diet if, in fact, it has no
    effect on the cholesterol level?

13
Normal / Binomial Example
  • We assume that each person's probability of a
    random reduction is 0.5, independent of any other
    person
  • Thus we want to compute the probability that,
    given only random reductions, at least 65
    successes are seen
  • P{X ≥ 65}
  • Transforming as required, this is
  • P{(X - 100·0.5) / √(100·0.5·0.5) ≥ 3}
  • Which we can approximate as
  • 1 - Φ(3) ≈ 0.0013
  • since n = 100 is sufficiently large

14
Normal / Binomial Example cont.
  • Actually, since X is an integer count, the event
    "at least 65" should be treated as "at least 64.5"
    in the continuous approximation (the continuity
    correction), so what we really want is
  • P{X ≥ 64.5}
  • Transforming as required, this is
  • P{(X - 100·0.5) / √(100·0.5·0.5) ≥ 2.9}
  • Which we can approximate as
  • 1 - Φ(2.9) ≈ 0.0019 (compared numerically below)
  • since n = 100 is still sufficiently large
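The sketch below compares the exact binomial tail with the two approximations above (n = 100, p = 0.5, nothing else assumed):

  import math

  def Phi(x):
      return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

  n, p = 100, 0.5
  exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
              for k in range(65, n + 1))        # exact P{X >= 65}

  plain     = 1.0 - Phi((65.0 - 50.0) / 5.0)    # no correction: 1 - Phi(3)
  corrected = 1.0 - Phi((64.5 - 50.0) / 5.0)    # continuity correction: 1 - Phi(2.9)

  print(exact, plain, corrected)   # the corrected value is the closer approximation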

15
Exponential Distribution
  • The probability density function
  • f(x) = λe^(-λx)   x ≥ 0
  •        0          x < 0
  • generates an exponential distribution with
    parameter λ
  • The cumulative distribution function may be found
    by integration as
  • F(a) = 1 - e^(-λa), a ≥ 0
  • The exponential distribution arises frequently as
    the time until some event occurs
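A brief numerical sketch of the integration claim (the rate λ = 0.25 is hypothetical, used only for illustration): integrating the density from 0 to a reproduces 1 - e^(-λa).

  import numpy as np

  lam, a = 0.25, 3.0   # hypothetical rate and endpoint

  xs = np.linspace(0.0, a, 100_000)
  by_integration = np.trapz(lam * np.exp(-lam * xs), xs)
  closed_form = 1.0 - np.exp(-lam * a)

  print(by_integration, closed_form)   # both ~0.5276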

16
Exponential Distribution Properties
  • We define a random variable X to be memoryless if
  • P{X > s+t | X > t} = P{X > s}
  • (that is, if you have already waited until t, the
    probability that you will have to wait a further
    s is the same as the initial probability that you
    will have to wait s)
  • The exponential distribution is the only
    memoryless continuous distribution
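A simulation sketch of the memoryless property (the values λ = 1, s = 0.7, t = 1.2 are hypothetical choices): among samples exceeding t, the fraction exceeding s+t should match the unconditional fraction exceeding s.

  import numpy as np

  rng = np.random.default_rng(3)
  x = rng.exponential(scale=1.0, size=2_000_000)   # lambda = 1, scale = 1/lambda
  s, t = 0.7, 1.2

  conditional   = np.mean(x[x > t] > s + t)   # P{X > s+t | X > t}
  unconditional = np.mean(x > s)              # P{X > s}

  print(conditional, unconditional, np.exp(-s))   # all three ~e^(-0.7) = 0.497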

17
Exponential Distribution Example (Ross)
  • Consider a bank with two tellers. Suppose that
    when Mr Smith enters the bank, he discovers that
    Ms Jones is being served by one of the tellers,
    and Mr Brown by the other
  • Suppose also that Mr Smith is told that his
    service will begin as soon as either Jones or
    Brown leaves
  • If the amount of time spent with a teller is
    exponentially distributed with parameter λ, what
    is the probability that, of the three customers,
    Mr Smith is the last to leave?

18
Exponential Distribution Example (Ross)
  • Clearly, one of Brown and Jones must leave first,
    at which time Smith starts to be served
  • At that point, since the exponential distribution
    is memoryless, both Smith and the remaining
    customer have the same distribution of
    probability of length of service
  • Thus Smith's probability of being the last to
    leave is 0.5
  • Since Jones and Brown have equal distributions
    when Smith arrives, they each have probability
    0.25 of being last to leave
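A simulation sketch of this argument (assuming, as on the slide, independent exponential service times; the rate value is arbitrary because the answer does not depend on λ):

  import numpy as np

  rng = np.random.default_rng(4)
  n = 500_000

  jones = rng.exponential(1.0, n)
  brown = rng.exponential(1.0, n)
  smith = rng.exponential(1.0, n)   # Smith's own service time, started when a teller frees up

  smith_done = np.minimum(jones, brown) + smith   # when Smith leaves, measured from his arrival
  other_done = np.maximum(jones, brown)           # when the slower of Jones and Brown leaves

  print(np.mean(smith_done > other_done))   # ~0.5: Smith is last about half the time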

19
Gamma Distribution
  • The gamma distribution with parameters (t, λ) is
    given by the probability density function
  • f(x) = λe^(-λx)(λx)^(t-1) / Γ(t)   x ≥ 0
  •        0                           x < 0
  • where Γ is defined as
  • Γ(t) = ∫_0^∞ e^(-y) y^(t-1) dy
  • The gamma distribution arises frequently as the
    time until n occurrences of some event occur
  • The special case in which λ = 1/2 and t = n/2 is
    known as the χ²_n distribution, and arises
    particularly as the error distribution of an
    n-dimensional problem, where the error in each
    coordinate is normally distributed
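A simulation sketch of the χ²_n special case (n = 5 is a hypothetical choice): the sum of squares of n independent unit normals should behave like a gamma variable with t = n/2 and λ = 1/2 (numpy's gamma sampler takes the shape t and the scale 1/λ).

  import numpy as np

  rng = np.random.default_rng(5)
  n, size = 5, 1_000_000

  chi_sq = np.sum(rng.normal(size=(size, n)) ** 2, axis=1)   # sum of n squared unit normals
  gamma  = rng.gamma(shape=n / 2.0, scale=2.0, size=size)    # t = n/2, scale = 1/lambda = 2

  print(chi_sq.mean(), gamma.mean())   # both ~n = 5
  print(chi_sq.var(),  gamma.var())    # both ~2n = 10
  print(np.quantile(chi_sq, 0.9), np.quantile(gamma, 0.9))   # similar upper quantiles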

20
Weibull Distribution
  • The Weibull distribution with parameters (ν, α,
    β) is given by the probability density function
  • f(x) = (β/α) y^(β-1) e^(-y^β)   x > ν
  •        0                        x ≤ ν
  • where y is defined as
  • y = (x - ν) / α
  • Its cumulative distribution function is
  • F(x) = 1 - e^(-y^β)   x > ν
  •        0              x ≤ ν
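A minimal sketch of these formulas (the parameters ν = 0, α = 2, β = 1.5 are hypothetical); with ν = 0 the CDF can be cross-checked against numpy's standard Weibull sampler, which draws from 1 - e^(-x^β) and is then rescaled by α:

  import numpy as np

  nu, alpha, beta = 0.0, 2.0, 1.5   # hypothetical parameters

  def weibull_cdf(x):
      y = (x - nu) / alpha
      return np.where(x > nu, 1.0 - np.exp(-y**beta), 0.0)

  rng = np.random.default_rng(6)
  samples = nu + alpha * rng.weibull(beta, size=1_000_000)

  for x in (1.0, 2.0, 4.0):
      print(weibull_cdf(x), np.mean(samples <= x))   # formula vs empirical CDF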

21
Weibull Distribution
  • The Weibull distribution is used to approximate
    the failure of items composed of many parts,
    where failure of any of the parts causes the
    whole assembly to fail
  • For example, a computer fails when any of its
    components fail
  • Hard drive
  • Memory
  • CPU
  • Bus
  • Cache
  • Etc.

22
Cauchy Distribution
  • The Cauchy distribution with parameter θ is
    given by the probability density function
  • f(x) = 1 / (π[1 + (x - θ)²])
  • It has a similar shape to the normal
    distribution, but with fatter tails
  • Values far from the centre θ are more likely to be
    found than with the normal distribution
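A quick sketch quantifying the fatter tails (taking θ = 0 for simplicity): compare the probability of landing more than three units from the centre under the Cauchy density and under a standard normal.

  import math

  # Cauchy, theta = 0: CDF is 1/2 + arctan(x)/pi, so P{|X| > 3} = 1 - (2/pi)*arctan(3)
  cauchy_tail = 1.0 - (2.0 / math.pi) * math.atan(3.0)

  # Standard normal: P{|Z| > 3} = 2 * (1 - Phi(3))
  normal_tail = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(3.0 / math.sqrt(2.0))))

  print(cauchy_tail)   # ~0.205
  print(normal_tail)   # ~0.0027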

23
Beta Distribution
  • The Beta distribution with parameters (a, b) is
    given by the probability density function
  • f(x) = (1/B(a,b)) x^(a-1)(1-x)^(b-1)   0 < x < 1
  •        0 otherwise
  • where B is defined as
  • B(a,b) = ∫_0^1 x^(a-1)(1-x)^(b-1) dx

24
Beta Distribution
  • The Beta distribution gives a controllable family
    of distributions on (0,1)
  • representing prior knowledge about the shape of
    solutions
  • The relation between a and b controls the symmetry
    of the distribution
  • When b > a, the distribution is skewed towards
    small values
  • I.e. small values are more likely
  • When b = a, the distribution is symmetric about 1/2
  • When b < a, the distribution is skewed towards
    large values, i.e. large values are more likely
  • a controls the general shape of the distribution
    when a = b
  • When a = 1, we get the uniform distribution
  • When a < 1, the density is concave upwards
    (U-shaped, piling up near 0 and 1)
  • When a > 1, the density is convex upwards
    (a single peak at 1/2)
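A small sketch of these shape claims (the parameter pairs below are hypothetical, chosen only to illustrate each case):

  import numpy as np

  rng = np.random.default_rng(7)

  for a, b in [(2.0, 5.0), (3.0, 3.0), (5.0, 2.0), (1.0, 1.0), (0.5, 0.5)]:
      x = rng.beta(a, b, size=400_000)
      # b > a pushes mass towards 0, b < a towards 1, a = b is symmetric about 1/2
      print(f"Beta({a}, {b}): mean ~ {x.mean():.3f}, P(X < 1/2) ~ {np.mean(x < 0.5):.3f}")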

25
Functions of a Random Variable
  • We often know about the relationship between two
    random variables
  • In particular, for two random variables X and Y,
    we may know that Y g(X)
  • If we also know the probability density function
    of X, we would like to know the probability
    density function of Y
  • To know about the probability density of Y at a
    particular value y, we will have to work from the
    probability density of the x that gives rise to
    y, i.e. of g⁻¹(y)
  • (for this to be well-defined, we will need to
    restrict the form of g)

26
Functions of a Random Variable
  • Let X be a continuous random variable with
    probability density function fX. Suppose that
    g(x) is a strictly increasing (or strictly
    decreasing) function of x. Then the variable Y
    defined by Y g(X) has a probability density
    function fY given by
  • fY(y) = fX(g⁻¹(y)) · |d/dy g⁻¹(y)|   when g⁻¹(y)
    is defined; 0 when g⁻¹(y) is undefined
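A numerical sketch of this formula under illustrative assumptions (not from the slides): take X exponential with λ = 1 and g(x) = √x, which is strictly increasing on x ≥ 0 with g⁻¹(y) = y². The formula then gives fY(y) = 2y e^(-y²), which can be checked against a numerical derivative of FY(y) = FX(g⁻¹(y)):

  import numpy as np

  fX = lambda x: np.exp(-x)         # density of X ~ Exponential(1), x >= 0
  FX = lambda x: 1.0 - np.exp(-x)   # CDF of X
  g_inv  = lambda y: y**2           # inverse of g(x) = sqrt(x)
  dg_inv = lambda y: 2.0 * y        # d/dy g^{-1}(y)

  def fY_formula(y):
      return fX(g_inv(y)) * np.abs(dg_inv(y))

  def fY_numeric(y, h=1e-5):
      # numerical derivative of F_Y(y) = F_X(g^{-1}(y))
      return (FX(g_inv(y + h)) - FX(g_inv(y - h))) / (2.0 * h)

  for y in (0.5, 1.0, 1.5):
      print(fY_formula(y), fY_numeric(y))   # the two columns should agree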

27
Summary
  • Continuous Random Variables
  • Continuous Distributions
  • Uniform
  • Normal
  • Exponential
  • Other Continuous Distributions
  • Gamma
  • Weibull
  • Cauchy
  • Beta
  • Functions of a Random Variable

28
Thank you