Markov Chains - PowerPoint PPT Presentation

Provided by: susa61
Transcript and Presenter's Notes

Title: Markov Chains


1
Tutorial 8
  • Markov Chains

2
Markov Chains
  • Consider a sequence of random variables X0, X1,
    …, where the set of possible values of these
    random variables is {0, 1, …, M}.
  • Xn = the state of some system at time n
  • Xn = i
  • ⇔ the system is in state i at time n

3
Markov Chains
  • X0, X1, … form a Markov chain if
    P(Xn+1 = j | Xn = i, Xn-1 = in-1, …, X0 = i0)
    = P(Xn+1 = j | Xn = i) = Pij
  • Pij = transition prob.
  • = prob. that, given the system is in state i,
    it will next be in state j

4
Transition Matrix
  • Transition prob., Pij = P(Xn+1 = j | Xn = i),
    with Pij ≥ 0 and each row summing to 1:
    Σj Pij = 1 for every i
  • Transition matrix, P = (Pij)

5
Example 1
  • Suppose that whether or not it rains tomorrow
    depends on previous weather conditions only
    through whether or not it is raining today.
  • If it rains today, then it will rain tomorrow
    with prob. 0.7; and if it does not rain today,
    then it will not rain tomorrow with prob. 0.6.

6
Example 1
  • Let state 0 be the rainy day,
  • state 1 be the sunny day.
  • The above is a two-state Markov chain having
    transition probability matrix
    P = ( 0.7  0.3 )
        ( 0.4  0.6 )

7
Transition matrix
  • The probability that the chain is in state i
    after n steps is the ith entry in the vector
    u P^n,
  • where
  • P = transition matrix of the Markov chain
  • u = probability (row) vector representing the
    starting distribution.
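The u P^n formula can be checked numerically. A minimal numpy sketch using the Example 1 matrix (state 0 = rainy); the starting distribution u and the choice n = 3 are illustrative assumptions only:

```python
import numpy as np

# Transition matrix from Example 1: state 0 = rainy, state 1 = sunny
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Assumed starting distribution u: it rains today (all mass on state 0)
u = np.array([1.0, 0.0])

# Distribution after n steps is the row vector u P^n;
# entry i = P(chain is in state i after n steps)
n = 3
dist = u @ np.linalg.matrix_power(P, n)
print(dist)
```

With these inputs the chain is rainy three days from now with probability 0.583.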

8
Ergodic Markov Chains
  • A Markov chain is called an ergodic chain
    (irreducible chain) if it is possible to go from
    every state to every state (not necessarily in
    one move).
  • A Markov chain is called a regular chain if some
    power of the transition matrix has only positive
    elements.

9
Regular Markov Chains
  • For a regular Markov chain with transition
    matrix P, the powers P^n converge as n → ∞ to a
    matrix whose rows all equal the same probability
    vector π, which satisfies π P = π.
  • The ith entry in the vector π is the long-run
    probability of state i.
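The regularity condition (some power of P has only positive entries) can be tested by brute force. In this sketch, is_regular is a hypothetical helper name, and the Example 1 matrix is reused because it is already regular — P itself has all positive entries:

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def is_regular(P, max_power=100):
    """Return True if some power of P up to max_power is strictly positive."""
    Q = np.eye(len(P))
    for _ in range(max_power):
        Q = Q @ P
        if np.all(Q > 0):
            return True
    return False

# For a regular chain the rows of P^n approach the fixed vector pi
Pn = np.linalg.matrix_power(P, 50)
print(is_regular(P), Pn)
```

For this matrix both rows of P^50 agree (to machine precision) with the long-run vector (4/7, 3/7) from Example 2.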

10
Example 2
  • From Example 1, the transition matrix is
    P = ( 0.7  0.3 )
        ( 0.4  0.6 )
  • Solving π P = π with π0 + π1 = 1 gives
    π = (4/7, 3/7), so the long-run prob. for a
    rainy day is 4/7.
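The 4/7 figure can be reproduced by solving π P = π together with the normalisation Σ πi = 1; a minimal numpy sketch:

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# pi P = pi  is equivalent to  (P - I)^T pi = 0; replace one of the
# (redundant) equations with the normalisation sum(pi) = 1.
A = (P - np.eye(2)).T
A[-1] = np.ones(2)
b = np.array([0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)  # long-run probabilities for rainy (state 0) and sunny (state 1)
```

The first entry comes out to 4/7 ≈ 0.5714, matching the slide.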

11
Markov chain with absorption state
  • Example
  • Calculate
  • (i) the expected time to absorption
  • (ii) the absorption prob.

12
MC with absorption state
  • First rewrite the transition matrix in the
    canonical form
    ( Q  R )
    ( 0  I )
    where Q holds the transient-to-transient and R
    the transient-to-absorbing probabilities.
  • N = (I - Q)^(-1) is called a fundamental
    matrix for P.
  • Entries of N:
  • n_ij = E(time in transient state j | start at
    transient state i)
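The slide's example matrix is not reproduced in this transcript, so the sketch below assumes a small hypothetical absorbing chain (two transient states, two absorbing states) and computes the fundamental matrix N = (I − Q)^(-1):

```python
import numpy as np

# Hypothetical absorbing chain (not the slide's example):
# states 1, 2 transient; states 0, 3 absorbing.
Q = np.array([[0.0, 0.5],     # transient -> transient block
              [0.5, 0.0]])

# Fundamental matrix: n_ij = expected number of steps spent in
# transient state j when starting from transient state i.
N = np.linalg.inv(np.eye(2) - Q)
print(N)
```

For this Q the diagonal entries of N are 4/3 and the off-diagonal entries 2/3.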

13
MC with absorption state
  • (i) E(time to absorption | start at i)
    = ith entry of t = N 1
    (the sum of the ith row of N)
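Assuming a small hypothetical chain with transient-to-transient block Q (the slide's own example is not reproduced here), the expected times to absorption are the row sums of N, i.e. t = N 1:

```python
import numpy as np

# Hypothetical transient-to-transient block for two transient states
Q = np.array([[0.0, 0.5],
              [0.5, 0.0]])

# t_i = sum_j n_ij: expected number of steps before absorption
# when the chain starts in transient state i.
N = np.linalg.inv(np.eye(2) - Q)
t = N @ np.ones(2)
print(t)
```

Here both transient states take 2 steps to absorption on average.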

14
MC with absorption state
  • (ii) Absorption prob.: B = N R
  • b_ij = P(absorbed in absorption state j |
    start at transient state i)
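The B = N R formula can be illustrated on a hypothetical absorbing chain (not the slide's elided example) with two transient and two absorbing states:

```python
import numpy as np

# Hypothetical chain: transient states 1, 2; absorbing states 0, 3.
Q = np.array([[0.0, 0.5],     # transient -> transient
              [0.5, 0.0]])
R = np.array([[0.5, 0.0],     # transient -> absorbing (states 0, 3)
              [0.0, 0.5]])

# b_ij = P(absorbed in absorbing state j | start in transient state i)
N = np.linalg.inv(np.eye(2) - Q)
B = N @ R
print(B)  # each row sums to 1: absorption is certain
```

For this chain, starting in transient state 1 the walk is absorbed at state 0 with probability 2/3 and at state 3 with probability 1/3.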