Markov chains: Notations and Definitions - PowerPoint PPT Presentation
Transcript and Presenter's Notes

1
Markov chains: Notations and Definitions
  • Stochastic Processes
  • By
  • TMJA Cooray

2
Notations and Definitions
  • Consider a sequence of trials numbered n = 0, 1, ....
    The outcome of the nth trial is represented by
    the random variable Xn, which is assumed to be
    discrete and to take one of the values j = 1, 2, ....
  • The actual set of outcomes at any trial is a
    system of events Ei, i = 1, 2, ..., that are mutually
    exclusive and exhaustive. They are the states of
    the system, and they may be finite or infinite in
    number.

3
  • Write the absolute probability of outcome Ej at
    the nth trial as
  • Pr{Xn = j} = pj(n) ----------(1)
  • where the initial distribution is given by
  • Pr{X0 = j} = pj(0)
  • If Xn-1 = i and Xn = j,
    then we say that the system has made
    a transition of type
    Ei → Ej at the nth trial or step.
  • We are interested in knowing the probability of
    occurrence of these transitions.

4
  • If the trials are not independent, we have to
    specify the conditional probability
  • Pr{Xn = j | Xn-1 = i, Xn-2 = k, ..., X0 = l} ---------(2)
  • With independent trials, (1) and (2) are
    identical.
  • According to the Markovian property, the behavior
    of the sequence of events is uniquely determined by
    the previous state.
  • Thus the transition probability depends only on
    Xn-1 and not on earlier random variables.

5
  • Therefore we can define a Markov chain as a
    sequence of consecutive trials such that
  • Pr{Xn = j | Xn-1 = i, Xn-2 = k, ..., X0 = l}
    = Pr{Xn = j | Xn-1 = i} --------(3)
  • We now have an important class of chains, defined
    by the property in (3), for which the transition
    probabilities are independent of n (time
    homogeneous).

6
  • For a homogeneous Markov chain we can then write
  • Pr{Xn = j | Xn-1 = i} = pij --------(4)
  • The order of the subscripts in pij corresponds to
    the direction of the transition i → j.
  • We also have
  • Σj pij = 1 -----(5)
  • For any fixed i, the pij form a probability
    distribution.
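As a quick sketch of property (5), the rows of a transition matrix must be probability distributions. The 3-state matrix below is hypothetical, chosen only to illustrate the check:

```python
# Hypothetical 3-state chain: rows are "from" states, columns are
# "to" states, so P[i][j] = p_ij = Pr{Xn = j | Xn-1 = i}.
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.0, 0.5, 0.5],
]

def is_stochastic(P, tol=1e-9):
    """Property (5): every p_ij >= 0 and every row sums to 1."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

print(is_stochastic(P))  # True
```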

7
  • In a real situation,
    we will be given the initial distribution and
    the (one-step) transition probabilities, and we
    want to determine the probability distribution
    for each random variable Xn.
  • We may also be interested in the limiting
    distribution of the random variable
    Xn as n → ∞, if it exists.

8
Transition matrix
  • Suppose there are k states of the system.
    Transition probabilities can easily be
    represented in matrix form.
  • P = [pij] (or its transpose P = [pij]T) can be finite or
    infinite in order, depending on the number of states
    involved.
  • In P = [pij], the row index i is the state at trial n
    (the "from" state) and the column index j is the state at
    trial n+1 (the "to" state).
9
  • In the transition matrix,
  • if all pij ≥ 0 and
  • every row sum (or column sum, in the transposed form)
    is unity,
  • then it is called a stochastic matrix.
  • In order to determine the absolute
    probabilities at any stage, the n-step transition
    probabilities should be calculated, where n > 1:
  • Pr{Xn+m = j | Xm = i} = pij(n) ----------------(7)
  • with pij(1) = pij.
  • The n-step transition probability pij(n) is
    independent of m.

10
  • Let p(0) be the initial probability distribution.
  • The distribution at the first stage is given by
  • pj(1) = Σi pij pi(0) ------------(8)
  • In matrix terms (with p(n) a column vector and P in the
    transposed form) we can express (8) as
  • p(1) = P p(0) ------------(9)
  • Similarly p(2) = P p(1) = P P p(0) = P2 p(0) ---(10)
  • In general p(n) = Pn p(0) ------------(11)
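The evolution of the distribution can be sketched numerically. The 2-state matrix is hypothetical; with numpy it is convenient to keep the distribution as a row vector and multiply on the right, which is the transpose of the column-vector form used on the slide:

```python
import numpy as np

# Hypothetical 2-state chain, rows = "from", columns = "to".
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
p0 = np.array([1.0, 0.0])   # start in state 0 with probability 1

def distribution_at(n, P, p0):
    """Equation (11) in row-vector form: p(n) = p(0) P^n."""
    return p0 @ np.linalg.matrix_power(P, n)

p1 = distribution_at(1, P, p0)
p2 = distribution_at(2, P, p0)
print(p1)  # [0.9 0.1]
print(p2)  # [0.86 0.14]
```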

11
  • We also have p(n+m) = Pn+m p(0)
  •              = Pn Pm p(0)
  •              = Pn p(m) ------------(12)
  • In component form, pij(n+m) = Σk pik(n) pkj(m)
    for all states i and j and all integers m > 0 and
    n ≥ 0.
  • This is the Chapman-Kolmogorov equation.
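The Chapman-Kolmogorov identity P^(n+m) = P^n P^m can be verified numerically on a small hypothetical chain:

```python
import numpy as np

# Hypothetical 3-state stochastic matrix (each row sums to 1).
P = np.array([[0.2, 0.8, 0.0],
              [0.3, 0.3, 0.4],
              [0.5, 0.0, 0.5]])

n, m = 3, 4
lhs = np.linalg.matrix_power(P, n + m)                            # P^(n+m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)  # P^n P^m
print(np.allclose(lhs, rhs))  # True
```

The matrix product on the right is exactly the component form: its (i, j) entry is Σk pik(n) pkj(m).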

12
  • Absolute probabilities at any stage can be
    obtained from the initial distribution and the
    relevant n-step transition matrix.

13
Classification of states
  • Lemma: Suppose a Markov chain has N states. Let i
    and j be a pair of states. Then j can be reached
    from i (written i → j) iff there is an integer
    0 < n < N such that the (i, j) entry of Pn is
    positive, that is, pij(n) > 0.
  • Definition: States i and j communicate if i → j
    and j → i.
  • We write i ↔ j.
  • Thus there is at least one path in the transition
    diagram from i to j and vice versa.

14
  • Definition: A Markov chain is irreducible if
    every state communicates with every other state.
  • For each state i, the class of state i is
  • C(i) = {all states j such that i ↔ j}
  • Properties of classes (or closed sets):
  • (1) i ∈ C(i) for every state i.
  • (2) If j ∈ C(i) then i ∈ C(j).
  • (3) For any two states i and j, either C(i) = C(j)
    or C(i) is disjoint from C(j).
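The class C(i) can be computed directly from the definitions: j is in C(i) when each state is reachable from the other through edges with pij > 0. A minimal sketch on a hypothetical 4-state chain:

```python
def reachable(P, i):
    """All states j with i -> j: graph search on edges p_ij > 0.
    Every state trivially reaches itself (path of length 0)."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def communicating_class(P, i):
    """C(i) = {j : i <-> j}, i.e. i -> j and j -> i."""
    return {j for j in reachable(P, i) if i in reachable(P, j)}

# Hypothetical chain: states 0 and 1 communicate, 2 is absorbing,
# 3 leaks into the others but nothing returns to it.
P = [
    [0.5, 0.5, 0.0, 0.0],
    [0.4, 0.6, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.1, 0.0, 0.4, 0.5],
]
print(communicating_class(P, 0))  # {0, 1}
print(communicating_class(P, 3))  # {3}
```

Property (3) above follows from this construction: any two such classes either coincide or are disjoint.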

15
  • Definition: State i is called absorbing if pii = 1.

16
  • If a state is persistent and its mean
    recurrence time is infinite, then the state is a
    null state.
  • If a state is persistent and its mean
    recurrence time is finite, then the state is
    positive recurrent.

17
  • A state is periodic with period d > 1 if every
    path that starts and ends at this state has
    length nd, where n is an integer 0, 1, 2, ...

18
  • Lemma:
  • Suppose i is periodic and i ↔ j; then j is periodic
    and has the same period.
  • A state is aperiodic if it is not periodic, i.e.
    d = 1.
  • A finite Markov chain is called regular if it
    is irreducible and aperiodic.
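The period of a state can be estimated numerically as the gcd of the path lengths n with pii(n) > 0, truncated at some maximum power. This is a sketch on hypothetical 2-state chains (a deterministic 2-cycle versus a lazy chain with self-loops):

```python
from math import gcd

import numpy as np

def period(P, i, max_power=50):
    """Period of state i: gcd of all n <= max_power with p_ii(n) > 0.
    Returns 0 if state i never returns within max_power steps."""
    P = np.asarray(P, dtype=float)
    d = 0
    Q = np.eye(len(P))
    for n in range(1, max_power + 1):
        Q = Q @ P          # Q = P^n after this step
        if Q[i, i] > 1e-12:
            d = gcd(d, n)
    return d

flip = [[0.0, 1.0], [1.0, 0.0]]   # deterministic 2-cycle: period 2
lazy = [[0.5, 0.5], [0.5, 0.5]]   # self-loop possible: aperiodic
print(period(flip, 0))  # 2
print(period(lazy, 0))  # 1
```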

19
  • If a state is persistent, aperiodic and not null,
    then it is said to be ergodic.

20
  • Definition: A class C is called ergodic if
    every path that starts in C remains in C.
  • In matrix terms, a class C is ergodic if the
    partial row sum Σj∈C pij = 1 for every i ∈ C.
  • The individual states in an ergodic class are
    also called ergodic.
  • A single absorbing state is an ergodic class. An
    irreducible chain consists of a single ergodic
    class.

21
Transient classes
  • Definition: A class C is called transient if
    there is a path out of C.
  • In matrix terms, a class C is transient if there
    exist a state i in C and a state k not in C such that
    i → k (pik > 0); the partial row sum then satisfies
    Σj∈C pij < 1 for that i.
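The partial-row-sum test distinguishes the two kinds of classes directly. A minimal sketch, with a hypothetical 3-state chain in which {0, 1} is closed and {2} leaks out:

```python
def is_ergodic_class(P, C, tol=1e-9):
    """A class C is ergodic iff sum_{j in C} p_ij = 1 for every i in C
    (no probability leaves C); otherwise some partial row sum is < 1
    and C is transient."""
    return all(abs(sum(P[i][j] for j in C) - 1.0) < tol for i in C)

# Hypothetical chain: {0, 1} is closed; state 2 can escape to 0 and 1.
P = [
    [0.6, 0.4, 0.0],
    [0.2, 0.8, 0.0],
    [0.3, 0.3, 0.4],
]
print(is_ergodic_class(P, {0, 1}))  # True
print(is_ergodic_class(P, {2}))     # False: partial row sum is 0.4
```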

22
Stationary distribution
  • Definition: π is a stationary distribution for
    the Markov chain with transition matrix P if
    πP = π, and in general
  • πPn = π for all integers n > 0.
  • Theorem: Suppose that P is the transition matrix
    of a regular chain, and that π is the limiting
    vector for P.
  • Then π is the unique stationary distribution for
    the chain, and aPn converges to π as n → ∞, where a
    is the initial probability distribution.
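A stationary distribution can be computed by solving the linear system πP = π together with the normalization Σ πj = 1. This is a sketch on a hypothetical 2-state regular chain, which also checks the theorem's convergence claim:

```python
import numpy as np

def stationary(P):
    """Solve pi P = pi with sum(pi) = 1 as a least-squares system:
    rows of (P^T - I) give the balance equations, the last row forces
    normalization."""
    k = len(P)
    A = np.vstack([np.asarray(P).T - np.eye(k), np.ones(k)])
    b = np.zeros(k + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi = stationary(P)
print(pi.round(4))              # [0.8333 0.1667], i.e. (5/6, 1/6)
print(np.allclose(pi @ P, pi))  # True: pi is stationary

# Convergence: aP^n -> pi for any initial distribution a.
a = np.array([0.0, 1.0])
print(np.allclose(a @ np.linalg.matrix_power(P, 50), pi))  # True
```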

23
Example
  • Gambler's ruin problem

24
  • Consider two gamblers, A having Rs.2/- and B
    having Rs.3/-. Let the probability of A winning
    (Rs.1) the game at any trial be p and of losing
    (Rs.1) the game be q. What are the possible states?
  • A state can be defined as (x, y), where x and y are
    the amounts possessed by each player, or simply by
    the amount possessed by player A.
  • Initially, at t = 0, the state is (2, 3), or
    state 2.
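The chain for this example can be sketched explicitly: states 0..5 track A's fortune, states 0 and 5 are absorbing, and the interior states move up with probability p or down with probability q. (The long-run ruin probability 0.6 below follows from the standard fair-game formula 1 - i/N with i = 2, N = 5.)

```python
import numpy as np

def gamblers_ruin_P(total=5, p=0.5):
    """Transition matrix on states 0..total (A's fortune);
    0 and total are absorbing."""
    q = 1 - p
    k = total + 1
    P = np.zeros((k, k))
    P[0, 0] = P[total, total] = 1.0
    for i in range(1, total):
        P[i, i + 1] = p   # A wins Rs.1
        P[i, i - 1] = q   # A loses Rs.1
    return P

P = gamblers_ruin_P()
p0 = np.zeros(6)
p0[2] = 1.0               # A starts with Rs.2 (state 2)

# Long-run behaviour: all mass ends up on the absorbing states.
limit = p0 @ np.linalg.matrix_power(P, 2000)
print(limit.round(3))     # ruin prob 0.6 on state 0, win prob 0.4 on state 5
```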
25
(No Transcript)
26
  • The initial distribution is p(0) = (0, 0, 1, 0, 0, 0)T,
    i.e. all probability mass on state 2.

27
Standard form for transition matrix
  • When a Markov chain is not irreducible (includes
    more than one class of states), the powers of the
    transition matrix are easier to analyze if we
    group the states into classes.
  • List the ergodic classes before the transient
    classes.

28
(No Transcript)
29
(No Transcript)
30
Powers of P can be easily found.
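As a sketch, reordering the gambler's ruin chain with its ergodic classes (the absorbing states 0 and 5) first puts P in standard block form: an identity block in the top-left, a zero block in the top-right, and the transient dynamics in the bottom-right, which makes powers of P easy to analyze block by block:

```python
import numpy as np

# Gambler's ruin on states 0..5 with p = 0.5 (the example above):
# 0 and 5 are absorbing, 1..4 are transient.
P = np.zeros((6, 6))
P[0, 0] = P[5, 5] = 1.0
for i in range(1, 5):
    P[i, i + 1] = P[i, i - 1] = 0.5

order = [0, 5, 1, 2, 3, 4]     # ergodic classes before transient states
S = P[np.ix_(order, order)]    # standard form: [[I, 0], [R, Q]]

print(S[:2, :2])   # identity block: absorbing states stay put
print(S[:2, 2:])   # zero block: no exit from the ergodic classes
```

In S^n the top-left block stays the identity and the bottom-right block becomes Q^n, which tends to zero for transient states.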