Transcript and Presenter's Notes

Title: Markov Processes and Birth-Death Processes


1
Markov Processes and Birth-Death Processes
  • J. M. Akinpelu

2
Exponential Distribution
  • Definition. A continuous random variable X has an
    exponential distribution with parameter λ > 0 if
    its probability density function is given by
    f(x) = λ e^{-λx} for x ≥ 0 (and 0 otherwise).
  • Its distribution function is given by
    F(x) = 1 - e^{-λx} for x ≥ 0.
    (A quick numerical check follows below.)
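
As a sanity check of the formulas reconstructed above, the following Python sketch numerically integrates the density and compares the result with the closed-form distribution function; the rate value λ = 1.5 and the test points are arbitrary choices.

```python
import numpy as np
from scipy.integrate import quad

lam = 1.5  # arbitrary example rate, lambda > 0

pdf = lambda x: lam * np.exp(-lam * x)   # f(x) = lambda e^{-lambda x}, x >= 0
cdf = lambda x: 1.0 - np.exp(-lam * x)   # F(x) = 1 - e^{-lambda x}, x >= 0

for x in (0.5, 1.0, 2.0, 5.0):
    integral, _ = quad(pdf, 0.0, x)      # integral of the density over [0, x]
    print(f"x={x}: integral={integral:.6f}  F(x)={cdf(x):.6f}")
```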

3
Exponential Distribution
  • Theorem 1. A continuous R.V. X is exponentially
    distributed if and only if for s, t ≥ 0,
    P{X > s + t | X > s} = P{X > t},     (1)
  • or equivalently,
    P{X > s + t} = P{X > s} P{X > t}.
  • A random variable with this property is said to
    be memoryless. (A Monte Carlo check of this
    property is sketched below.)
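
A minimal Monte Carlo sketch of the memoryless property as stated above; the rate λ, the offsets s and t, and the sample size are arbitrary, and the two empirical frequencies should agree up to sampling noise.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, s, t = 0.8, 1.2, 0.7                     # arbitrary rate and time offsets
x = rng.exponential(scale=1.0 / lam, size=1_000_000)

lhs = np.mean(x[x > s] > s + t)               # P{X > s + t | X > s}, estimated
rhs = np.mean(x > t)                          # P{X > t}, estimated
print(lhs, rhs, np.exp(-lam * t))             # all three should be close
```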

4
Exponential Distribution
  • Proof. If X is exponentially distributed, (1)
    follows readily. Now assume (1). Define F(x) = P{X ≤ x},
    f(x) = F'(x), and G(x) = P{X > x}. It follows that
    G'(x) = -f(x). Now fix x. For h ≥ 0,
    G(x + h) = G(x) G(h).
  • This implies that, taking the derivative wrt x,
    G'(x + h) = G'(x) G(h), i.e., f(x + h) = f(x) G(h).

5
Exponential Distribution
  • Letting x = 0 gives f(h) = f(0) G(h); writing λ := f(0),
    this reads G'(h)/G(h) = -λ. Integrating both sides from
    0 to t (using G(0) = 1) gives
    G(t) = e^{-λt},
    so X is exponentially distributed with parameter λ.

6
Exponential Distribution
  • Theorem 2. A R.V. X is exponentially distributed
    with parameter λ if and only if, for all t ≥ 0 and
    as h → 0,
    P{X ≤ t + h | X > t} = λh + o(h).

7
Exponential Distribution
  • Proof. Let X be exponentially distributed; then,
    as h → 0,
    P{X ≤ t + h | X > t} = P{X ≤ h} = 1 - e^{-λh} = λh + o(h).
  • The converse is left as an exercise. (A numerical
    illustration of this rate interpretation follows
    below.)
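
A small numerical illustration of the rate interpretation in Theorem 2 as reconstructed here: for an exponential random variable, P{X ≤ h}/h approaches λ as h shrinks. The value λ = 2.0 is an arbitrary example.

```python
import numpy as np

lam = 2.0                                     # arbitrary rate
for h in (1.0, 0.1, 0.01, 0.001):
    p = 1.0 - np.exp(-lam * h)                # P{X <= h} = 1 - e^{-lambda h}
    print(f"h={h:<6}: P(X<=h)/h = {p / h:.6f}")   # tends to lambda as h -> 0
```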

8
Exponential Distribution
[Figure: slope (rate) λ]
9
Markov Process
  • A continuous time stochastic process {X_t, t ≥ 0}
    with state space E is called a Markov
    process provided that
    P{X_{s+t} = j | X_u, 0 ≤ u < s; X_s = i} = P{X_{s+t} = j | X_s = i}
  • for all states i, j ∈ E and all s, t ≥ 0.

[Figure: time axis from 0 to s (history known), with the prediction made at time s + t.]
10
Markov Process
  • We restrict ourselves to Markov processes for
    which the state space E = {0, 1, 2, …}, and such
    that the conditional probabilities
    P_ij(t) = P{X_{s+t} = j | X_s = i}
  • are independent of s. Such a Markov process is
    called time-homogeneous.
  • P_ij(t) is called the transition function of the
    Markov process X.

11
Markov Process - Example
  • Let X be a Markov process with
    P_ij(t) = e^{-λt} (λt)^{j-i} / (j - i)!   if j ≥ i,
    and P_ij(t) = 0 otherwise,
  • for some λ > 0. X is a Poisson process.
    (A simulation sketch follows below.)
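
A simulation sketch of this example (not part of the original slides): summing i.i.d. exponential(λ) holding times and counting how many fall in [0, t] gives X_t, whose distribution should match the Poisson probabilities above. The rate, horizon, and sample size are arbitrary.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(1)
lam, t, runs = 1.3, 4.0, 200_000              # arbitrary rate, horizon, replications

# 30 holding times per run is comfortably more than lam*t expected arrivals (assumption).
gaps = rng.exponential(scale=1.0 / lam, size=(runs, 30))
counts = (np.cumsum(gaps, axis=1) <= t).sum(axis=1)   # arrivals in [0, t]

for j in range(6):
    exact = exp(-lam * t) * (lam * t) ** j / factorial(j)
    print(f"j={j}: simulated {np.mean(counts == j):.4f}  Poisson {exact:.4f}")
```
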
12
Chapman-Kolmogorov Equations
  • Theorem 3. For i, j ∈ E, t, s ≥ 0,
    P_ij(t + s) = Σ_{k∈E} P_ik(t) P_kj(s).
    (A numerical check on a small example follows below.)
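
A numerical check of Theorem 3 on a small made-up example: for a time-homogeneous chain with generator matrix G, the transition function is P(t) = e^{tG}, so P(t+s) should equal the matrix product P(t)P(s). The 3-state generator below is arbitrary.

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary 3-state generator: non-negative off-diagonal rates, rows summing to zero.
G = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -1.5,  0.5],
              [ 2.0,  2.0, -4.0]])

P = lambda t: expm(t * G)                     # transition function P_ij(t)

t, s = 0.7, 1.9
diff = P(t + s) - P(t) @ P(s)                 # Chapman-Kolmogorov: should vanish
print(np.max(np.abs(diff)))
```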

13
Realization of a Markov Process
14
Time Spent in a State
  • Theorem 4. Let t ≥ 0, and n satisfy T_n ≤ t <
    T_{n+1}, and let W_t = T_{n+1} - t. Let i ∈ E, u ≥ 0,
    and define
    G(u) = P{W_t > u | X_t = i}.
  • Then
    G(u) = e^{-λ_i u} for some λ_i ≥ 0.
  • Note: This implies that the time remaining in a
    state is exponentially distributed, regardless of
    the time already spent in that state.

15
Time Spent in a State
  • Proof. We first note that due to the time
    homogeneity of X, G(u) is independent of t. If we
    fix i, then we have
    G(u + v) = G(u) G(v) for u, v ≥ 0,
    so by Theorem 1, G(u) = e^{-λ_i u} for some λ_i ≥ 0.

16
An Alternative Characterization of a Markov
Process
  • Theorem 5. Let X = {X_t, t ≥ 0} be a Markov
    process. Let T_0, T_1, …, be the successive state
    transition times and let S_0, S_1, …, be the
    successive states visited by X. There exists some
    number λ_i such that for any non-negative integer
    n, for any j ∈ E, and t > 0,
    P{S_{n+1} = j, T_{n+1} - T_n > t | S_0, …, S_n = i; T_0, …, T_n}
      = Q_ij e^{-λ_i t},
  • where Q = (Q_ij) is the transition matrix of a
    Markov chain with Q_ii = 0 for all i.

17
An Alternative Characterization of a Markov
Process
  • This implies that the successive states visited
    by a Markov process form a Markov chain with
    transition matrix Q.
  • A Markov process is irreducible recurrent if its
    underlying Markov chain is irreducible recurrent.

18
Kolmogorov Equations
  • Theorem 6.
    P_ij'(t) = Σ_{k≠i} λ_i Q_ik P_kj(t) - λ_i P_ij(t),
  • and, under suitable regularity conditions,
    P_ij'(t) = Σ_{k≠j} P_ik(t) λ_k Q_kj - λ_j P_ij(t).
  • These are Kolmogorov's Backward and Forward
    Equations, respectively. (A numerical sketch of
    the forward equation follows below.)
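
A sketch that integrates the forward equation numerically, written in matrix form P'(t) = P(t)A with A_ij = λ_i Q_ij for i ≠ j and A_ii = -λ_i (consistent with the reconstruction above), and compares the result with the matrix exponential. The 2-state rates and jump matrix are arbitrary.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

lam = np.array([1.0, 2.5])                    # arbitrary rates lambda_i out of each state
Q = np.array([[0.0, 1.0],                     # jump-chain transition matrix (Q_ii = 0)
              [1.0, 0.0]])
A = lam[:, None] * Q - np.diag(lam)           # generator: A_ij = lam_i Q_ij, A_ii = -lam_i

def forward(t, p_flat):
    P = p_flat.reshape(2, 2)
    return (P @ A).ravel()                    # forward equation P'(t) = P(t) A

t_end = 1.5
sol = solve_ivp(forward, (0.0, t_end), np.eye(2).ravel(), rtol=1e-10, atol=1e-12)
P_ode = sol.y[:, -1].reshape(2, 2)
print(np.max(np.abs(P_ode - expm(t_end * A))))   # should be near machine precision
```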

19
Kolmogorov Equations
  • Proof (Forward Equation) For t, h ? 0,
  • Hence
  • Taking the limit as h ? 0, we get our result.

20
Limiting Probabilities
  • Theorem 7. If a Markov process is irreducible
    recurrent, then limiting probabilities
    P_j = lim_{t→∞} P_ij(t)
  • exist independent of i, and satisfy
    λ_j P_j = Σ_{k≠j} λ_k Q_kj P_k
  • for all j. These are referred to as balance
    equations. Together with the condition
    Σ_j P_j = 1,
  • they uniquely determine the limiting distribution.
    (A small numerical example follows below.)
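
A small numerical example (not from the slides): for a finite chain with generator A, the balance equations plus normalization amount to solving P·A = 0 with Σ_j P_j = 1. The 3-state generator is an arbitrary irreducible example.

```python
import numpy as np

# Arbitrary irreducible 3-state generator (rows sum to zero).
A = np.array([[-2.0,  1.0,  1.0],
              [ 3.0, -4.0,  1.0],
              [ 0.5,  0.5, -1.0]])

# Replace one (redundant) balance equation by the normalization condition.
M = np.vstack([A.T[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
P = np.linalg.solve(M, b)
print(P)          # limiting probabilities
print(P @ A)      # residual of the balance equations, ~0
```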

21
Birth-Death Processes
  • Definition. A birth-death process {X(t), t ≥ 0}
    is a Markov process such that, if the process is
    in state j, then the only transitions allowed are
    to state j + 1 or to state j - 1 (if j > 0).
  • It follows that there exist non-negative values
    λ_j and μ_j, j = 0, 1, 2, …, (called the birth rates
    and death rates) so that, as h → 0,
    P{X(t+h) = j+1 | X(t) = j} = λ_j h + o(h),
    P{X(t+h) = j-1 | X(t) = j} = μ_j h + o(h),
    P{X(t+h) = j   | X(t) = j} = 1 - (λ_j + μ_j) h + o(h).
    (A simulation sketch follows below.)
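
A minimal simulation sketch of a birth-death process consistent with these transition probabilities: hold in state j for an exponential(λ_j + μ_j) time, then move up with probability λ_j/(λ_j + μ_j), otherwise down. The constant rates λ_j = 0.5, μ_j = 1.0 (an M/M/1-style example) are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
birth = lambda j: 0.5                         # lambda_j: arbitrary constant birth rate
death = lambda j: 1.0 if j > 0 else 0.0       # mu_j: arbitrary constant death rate, none in state 0

def state_at(t_end, j0=0):
    """Simulate one path up to time t_end and return the state reached."""
    t, j = 0.0, j0
    while True:
        v = birth(j) + death(j)               # total rate v_j = lambda_j + mu_j
        t += rng.exponential(1.0 / v)         # exponential holding time in state j
        if t > t_end:
            return j
        j += 1 if rng.random() < birth(j) / v else -1    # up w.p. lambda_j / v_j, else down

states = [state_at(100.0) for _ in range(2000)]
print(np.mean(states))                        # near rho/(1-rho) = 1 for rho = lambda/mu = 0.5
```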

22
Birth and Death Rates
  • Note:
  • The expected time in state j before entering
    state j+1 is 1/λ_j; the expected time in state j
    before entering state j-1 is 1/μ_j.
  • The rate corresponding to state j is v_j = λ_j + μ_j.

23
Differential-Difference Equations for a
Birth-Death Process
  • It follows that, if P_j(t) = P{X(t) = j}, then
    P_0'(t) = μ_1 P_1(t) - λ_0 P_0(t),
    P_j'(t) = λ_{j-1} P_{j-1}(t) + μ_{j+1} P_{j+1}(t)
              - (λ_j + μ_j) P_j(t),   j ≥ 1.
  • Together with the state distribution at time 0,
    this completely describes the behavior of the
    birth-death process.

24
Birth-Death Processes - Example
  • Pure birth process with constant birth rate:
  • λ_j = λ > 0, μ_j = 0 for all j. Assume that
    P{X(0) = 0} = 1.
  • Then solving the difference-differential
    equations for this process gives
    P_j(t) = e^{-λt} (λt)^j / j!,   j = 0, 1, 2, …
    (a numerical check follows below).
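
A sketch that integrates the difference-differential equations for the pure birth case numerically (truncated at a level N chosen well above λt, an assumption) and compares the result with the Poisson form written above.

```python
import numpy as np
from math import exp, factorial
from scipy.integrate import solve_ivp

lam, t_end, N = 1.2, 3.0, 40                  # arbitrary rate, horizon, truncation level

def rhs(t, P):
    # Pure birth: P_0' = -lam*P_0 and P_j' = lam*P_{j-1} - lam*P_j for j >= 1.
    dP = -lam * P
    dP[1:] += lam * P[:-1]
    return dP

P0 = np.zeros(N)
P0[0] = 1.0                                   # start in state 0 with probability 1
sol = solve_ivp(rhs, (0.0, t_end), P0, rtol=1e-10, atol=1e-12)
for j in range(5):
    exact = exp(-lam * t_end) * (lam * t_end) ** j / factorial(j)
    print(f"j={j}: ODE {sol.y[j, -1]:.6f}  Poisson {exact:.6f}")
```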

25
Birth-Death Processes - Example
  • Pure death process with proportional death rate:
  • λ_j = 0 for all j, μ_j = jμ (with μ > 0) for 1 ≤ j ≤ N,
    μ_j = 0 otherwise, and P{X(0) = N} = 1.
  • Then solving the difference-differential
    equations for this process gives
    P_j(t) = C(N, j) e^{-jμt} (1 - e^{-μt})^{N-j},   0 ≤ j ≤ N
    (a simulation sketch follows below).
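
A quick simulation sketch for the pure death case: with N independent exponential(μ) lifetimes all starting at time 0, the number still alive at time t should be Binomial(N, e^{-μt}), matching the form given above. N, μ, and t are arbitrary.

```python
import numpy as np
from math import comb, exp

rng = np.random.default_rng(3)
N, mu, t = 10, 0.6, 1.5                       # arbitrary population size, death rate, time
lifetimes = rng.exponential(scale=1.0 / mu, size=(100_000, N))
alive = (lifetimes > t).sum(axis=1)           # survivors at time t in each replication

p = exp(-mu * t)                              # single-individual survival probability
for j in (0, 3, 6, 10):
    exact = comb(N, j) * p**j * (1 - p) ** (N - j)
    print(f"j={j}: simulated {np.mean(alive == j):.4f}  binomial {exact:.4f}")
```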

26
Limiting Probabilities
  • Now assume that limiting probabilities P_j exist.
    They must satisfy
    0 = λ_{j-1} P_{j-1} + μ_{j+1} P_{j+1} - (λ_j + μ_j) P_j,   j ≥ 1,
    0 = μ_1 P_1 - λ_0 P_0,
  • or
    (λ_j + μ_j) P_j = λ_{j-1} P_{j-1} + μ_{j+1} P_{j+1},   j ≥ 1,
    λ_0 P_0 = μ_1 P_1.                                    (*)

27
Limiting Probabilities
  • These are the balance equations for a birth-death
    process. Together with the condition
    Σ_j P_j = 1,
  • they uniquely define the limiting probabilities.

28
Limiting Probabilities
  • From (*), one can prove by induction that
    P_j = P_0 (λ_0 λ_1 ⋯ λ_{j-1}) / (μ_1 μ_2 ⋯ μ_j),   j ≥ 1.
    (A concrete computation with constant rates
    follows below.)
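
A sketch of the product formula for the concrete case of constant rates λ_j = λ and μ_j = μ with λ < μ (an arbitrary M/M/1-style example): the formula yields the geometric distribution P_j = (1 - ρ)ρ^j with ρ = λ/μ, and the computed values also satisfy λP_j = μP_{j+1}.

```python
import numpy as np

lam, mu, N = 0.6, 1.0, 60                     # arbitrary constant rates (lam < mu) and truncation

# P_j proportional to prod_{k=1..j} lambda_{k-1}/mu_k, which here is (lam/mu)^j.
weights = np.array([(lam / mu) ** j for j in range(N)])
P = weights / weights.sum()                   # normalize so the probabilities sum to 1

rho = lam / mu
print(P[:4])                                  # computed limiting probabilities
print([(1 - rho) * rho**j for j in range(4)]) # geometric form (1 - rho) rho^j for comparison
print(np.max(np.abs(lam * P[:-1] - mu * P[1:])))   # detailed balance residual, ~0
```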

29
When Do Limiting Probabilities Exist?
  • Define
    S = 1 + Σ_{j≥1} (λ_0 λ_1 ⋯ λ_{j-1}) / (μ_1 μ_2 ⋯ μ_j).
  • It is easy to show that the limiting probabilities
    exist, with P_0 = 1/S,
  • if S < ∞. (This is equivalent to the condition P_0
    > 0.) Furthermore, all of the states are then
    positive recurrent, i.e., ergodic. If S = ∞, then
    either all of the states are null recurrent or
    all of the states are transient, and limiting
    probabilities do not exist.

30
Flow Balance Method
  • Draw a closed boundary around state j
  • flow in = flow out

Global balance equation:
λ_{j-1} P_{j-1} + μ_{j+1} P_{j+1} = (λ_j + μ_j) P_j
31
Flow Balance Method
  • Draw a closed boundary between state j and state
    j + 1

[Figure: states j-1, j, j+1 with birth rates λ_{j-1}, λ_j and death rates μ_j, μ_{j+1}; the boundary separates states j and j+1.]

Detailed balance equation:
λ_j P_j = μ_{j+1} P_{j+1}
32
Example
  • Machine repair problem. Suppose there are m
    machines serviced by one repairman. Each machine
    runs without failure, independent of all others,
    for an exponential time with mean 1/λ. When it fails,
    it waits until the repairman can come to repair
    it, and the repair itself takes an exponentially
    distributed amount of time with mean 1/μ. Once
    repaired, the machine is as good as new.
  • What is the probability that j machines are
    failed?

33
Example
  • Let P_j be the steady-state probability of j
    failed machines. In birth-death terms, the birth
    (failure) rates are λ_j = (m - j)λ for 0 ≤ j < m
    and the death (repair) rates are μ_j = μ for
    1 ≤ j ≤ m; the product formula then gives P_j
    (a computational sketch follows below).
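
A computational sketch for this example using the birth-death product formula with the rates assumed above (λ_j = (m - j)λ for failures, μ_j = μ for the single repairman); the values m = 5, λ = 0.2, μ = 1.0 are arbitrary.

```python
import numpy as np
from math import prod

m, lam, mu = 5, 0.2, 1.0                      # arbitrary: 5 machines, failure rate, repair rate

birth = lambda j: (m - j) * lam               # lambda_j: failure rate with j machines already down
death = lambda j: mu                          # mu_j: one repairman working at rate mu

# Product formula: P_j proportional to prod_{k=1..j} lambda_{k-1}/mu_k.
weights = [prod(birth(k - 1) / death(k) for k in range(1, j + 1)) for j in range(m + 1)]
P = np.array(weights) / sum(weights)

for j, p in enumerate(P):
    print(f"P({j} failed) = {p:.4f}")
print("expected number of failed machines:", float(np.dot(np.arange(m + 1), P)))
```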

34
Example
35
Example
  • How would this example change if there were m (or
    more) repairmen?

36
Homework
  • No homework this week due to test next week.

37
References
  1. Erhan Cinlar, Introduction to Stochastic
    Processes, Prentice-Hall, Inc., 1975.
  2. Leonard Kleinrock, Queueing Systems, Volume I:
    Theory, John Wiley & Sons, 1975.
  3. Sheldon M. Ross, Introduction to Probability
    Models, Ninth Edition, Elsevier Inc., 2007.