1
Markov Chains
  • Vasileios Hatzivassiloglou
  • University of Texas at Dallas

2
Revisiting Markov models
  • A Markov chain is completely specified by its
    transition matrix P and its initial probability
    vector a, where a_i is the probability of
    starting at state s_i
  • We can also assume a known initial state s_0
  • Recall that the chain must be a stationary
    stochastic process with limited memory
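
As a concrete illustration (my example, not the slides'): a
hypothetical two-state weather chain in numpy, using the column
convention the slides imply (P is applied on the left of a
probability vector, so each column of P sums to 1):

```python
import numpy as np

# Hypothetical two-state chain: state 0 = "sunny", state 1 = "rainy".
# Column-stochastic convention: P[i, j] = P(next = s_i | current = s_j),
# so each *column* sums to 1 and the chain evolves as p_next = P @ p.
P = np.array([[0.9, 0.5],    # P(sunny | sunny), P(sunny | rainy)
              [0.1, 0.5]])   # P(rainy | sunny), P(rainy | rainy)

# Initial probability vector a: a[i] = probability of starting at s_i.
a = np.array([1.0, 0.0])     # known initial state s_0 = "sunny"

assert np.allclose(P.sum(axis=0), 1.0)   # every column sums to 1
```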

3
Using MCs for prediction
  • We have already seen an example of how MCs (or
    n-gram models) can be used to estimate the
    probability of a long sequence
  • Another application is trying to estimate the
    steady-state or stationary distribution, the
    probability that the system will be in a given
    state after many transitions

4
Calculating state probabilities
  • At start, given by the vector p_0 = a
  • After one transition, given by the vector p_1 = Pa
  • After n transitions, given by p_n = P^n a
    (sketched in code below)
  • The stationary distribution is p = lim(n→∞) P^n a
  • This limit exists if the MC is aperiodic and
    irreducible
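
A minimal numerical sketch of these formulas, reusing the hypothetical
two-state chain above (numpy's matrix_power performs the repeated
multiplication):

```python
import numpy as np
from numpy.linalg import matrix_power

P = np.array([[0.9, 0.5],    # hypothetical two-state chain from above
              [0.1, 0.5]])
a = np.array([1.0, 0.0])

p1 = P @ a                       # p_1 = Pa, distribution after one step
pn = matrix_power(P, 50) @ a     # p_n = P^n a, here with n = 50
print(pn)                        # already very close to [5/6, 1/6]
```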

5
Obtaining the stationary probability
  • We can always carry the previous multiplication
    to convergence
  • At convergence, Pp = p
  • Therefore, p is the (right) eigenvector of P
    associated with eigenvalue 1 (this always exists
    for ergodic MCs)
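
One way to compute p directly (a sketch, assuming numpy): take the
eigendecomposition of P, pick the right eigenvector whose eigenvalue
is numerically 1, and rescale it into a probability distribution:

```python
import numpy as np

P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

eigvals, eigvecs = np.linalg.eig(P)     # right eigenvectors, one per column
k = np.argmin(np.abs(eigvals - 1.0))    # eigenvalue (numerically) closest to 1
p = np.real(eigvecs[:, k])
p = p / p.sum()                         # rescale so the entries sum to 1
print(p)                                # stationary distribution: Pp = p
```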

6
Eigenvectors
  • A matrix A corresponds to a linear transformation
    of vectors, x → Ax
  • Certain vectors remain unchanged in direction
    under this transformation; they are only
    multiplied by a constant (x → cx)
  • These are the eigenvectors (or characteristic
    vectors) of the matrix; the scaling constant c is
    the corresponding eigenvalue

7
Eigenvectors of a transformation
8
Obtaining eigenvectors
  • The eigenvalues are the solutions λ of the
    equation det(A − λI) = 0
  • An n×n matrix has at most n eigenvalues; at least
    one is real if n is odd
  • Finding the exact solutions is a hard problem;
    usually they are approximated
  • An easy approximation (power iteration): take a
    random v and calculate Av, A^2 v, A^3 v, ... With
    normalization, this almost always converges to
    the eigenvector with the largest-magnitude
    eigenvalue (see the sketch below)
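
A sketch of that approximation (numpy assumed), with the
renormalization at each step that the slide leaves implicit, so the
vector neither overflows nor vanishes:

```python
import numpy as np

def power_iteration(A, num_iters=1000, seed=0):
    """Approximate the eigenvector of A for the largest-magnitude eigenvalue."""
    rng = np.random.default_rng(seed)
    v = rng.random(A.shape[0])        # random starting vector
    for _ in range(num_iters):
        v = A @ v                     # Av, A^2 v, A^3 v, ...
        v = v / np.linalg.norm(v)     # renormalize each step
    eigenvalue = v @ A @ v            # Rayleigh-quotient estimate (v is unit)
    return eigenvalue, v
```

Applied to the transition matrix P of an ergodic chain, whose largest
eigenvalue is 1, this recovers the stationary distribution up to
scaling.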

9
The largest eigenvector in daily use
  • Google's PageRank
  • In information retrieval, we want to find
    relevant pages for the query that are also
    trusted pages
  • To approximate trust, measure how many other
    pages link to a page (directly or indirectly)
  • Google maintains a MC transition matrix from each
    web page to each other web page, and calculates
    PageRank as the stationary distribution of this
    chain
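
A toy version of that computation (a sketch under assumptions the
slide does not state: the "teleport" damping term, with its commonly
cited value 0.85, and no handling of pages without out-links):

```python
import numpy as np

def pagerank(links, d=0.85, num_iters=100):
    """links[j] = pages that page j links to; returns the stationary vector."""
    n = len(links)
    P = np.zeros((n, n))
    for j, outs in enumerate(links):
        for i in outs:
            P[i, j] = 1.0 / len(outs)   # page j spreads its vote over out-links
    p = np.full(n, 1.0 / n)             # start from the uniform distribution
    for _ in range(num_iters):
        p = d * (P @ p) + (1 - d) / n   # follow a link, or teleport uniformly
    return p

# Tiny three-page web: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1.
print(pagerank([[1], [2], [0, 1]]))
```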

10
Markov chains as generators
  • We have seen two applications of Markov Chains
    (estimation, steady state calculation)
  • We can also use them to simulate a stochastic
    process and generate a sequence of states similar
    to those the process would visit
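
A minimal sampling sketch (numpy assumed, same column-stochastic
convention as before): draw the start state from a, then repeatedly
draw the next state from the current state's column of P:

```python
import numpy as np

def generate(P, a, length, seed=0):
    """Sample a state sequence from a Markov chain (column-stochastic P)."""
    rng = np.random.default_rng(seed)
    states = [rng.choice(len(a), p=a)]        # initial state drawn from a
    for _ in range(length - 1):
        col = P[:, states[-1]]                # next-state distribution
        states.append(rng.choice(len(col), p=col))
    return states

P = np.array([[0.9, 0.5],
              [0.1, 0.5]])
a = np.array([1.0, 0.0])
print(generate(P, a, 10))                     # a sequence of ten 0/1 states
```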


12
Models for MC generation
  • Can be based on successive letters or successive
    words
  • Can model the likelihood of a word or sentence of
    a particular length as an additional component
  • Can incorporate additional grammatical and/or
    semantic constraints
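
A sketch of a letter-based model of order k (the training file
corpus.txt is hypothetical): record which character follows each
k-character context in the training text, then sample from those
recorded followers:

```python
import random
from collections import defaultdict

def train(text, k):
    """Map each k-letter context to the letters observed to follow it."""
    model = defaultdict(list)
    for i in range(len(text) - k):
        model[text[i:i + k]].append(text[i + k])
    return model

def generate(model, k, length, seed=0):
    rng = random.Random(seed)
    out = rng.choice(list(model))      # start from a random seen context
    for _ in range(length):
        followers = model.get(out[-k:])
        if not followers:              # unseen context: stop early
            break
        out += rng.choice(followers)   # frequent followers are drawn more often
    return out

text = open("corpus.txt").read()          # any training text (hypothetical file)
print(generate(train(text, 2), 2, 200))   # an order-2 letter model
```

Sampling from the raw follower lists reproduces the letter n-gram
frequencies of the corpus, which is what produces output like the
samples on the next slide.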

13
Sample output: letter models
  • (Uniform) uzlpcbizdmddk njsdzyyvfgxbgjjgbtsak
    rqvpgnsbyputvqqdtmgltz ynqotqigexjumqphujcfwn ll
  • Order 0: saade ve mw hc n entt da k
    eethetocusosselalwo gx fgrsnoh,
  • Order 1: t I amy, vin. id wht omanly heay atuss n
    macon aresethe hired
  • Order 2: Ther I the heingoind of-pleat, blur it
    dwere wing waske hat trooss.
  • Order 3: I has them the saw the secorrow. And
    wintails on my my ent, thinks, fore voyager
    lanated the been elsed helder was of him a very
    free bottlemarkable,
  • Order 4: His heard." "Exactly he very glad
    trouble, and by Hopkins! That it on of the who
    difficentralia. He rushed likely?" "Blood night
    that.

14
Reading
  • Sections 6.1.2-6.1.3 on n-grams and corpus
    building