Relative Entropy Part I

1
Relative Entropy Part I
  • Vasileios Hatzivassiloglou
  • University of Texas at Dallas

2
Estimating entropy with MCs
  • Choose appropriate order k
  • Sue swallowed the large green
  • How can we estimate what follows?
  • Immediate context (large green) may suggest tree
  • Earlier context (Sue swallowed) may suggest pill
  • Depends on the number of symbols and the
    available training material (see the note below)
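
As a rough way to quantify that dependence (a sketch of the standard parameter count, not a figure from the slides): an order-k chain over an alphabet of V symbols needs a separate next-symbol distribution for each of the V^k possible histories, so the number of free transition probabilities is

\[
  V^{k}\,(V - 1)
\]

which grows exponentially with k and quickly outstrips the available training material.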

3
Estimating entropy with MCs
  • From training data, estimate transition
    probabilities
  • From separate test data, calculate the joint
    probability of a long sequence using the chain
    rule and the Markov assumption
  • Smoothing is necessary to avoid assigning zero
    probability to the test data (see the sketch below)
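
A minimal Python sketch of this procedure (the toy corpus, the order k = 2, and the add-one smoothing are illustrative assumptions, not taken from the slides): estimate transition counts from training tokens, then score a test sequence with the chain rule under the Markov assumption, smoothing so that unseen transitions never receive zero probability.

from collections import defaultdict
import math

def train_counts(tokens, k=2):
    """Count order-k transitions: history (k-tuple of tokens) -> next token."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(tokens) - k):
        history = tuple(tokens[i:i + k])
        counts[history][tokens[i + k]] += 1
    return counts

def sequence_log_prob(tokens, counts, vocab_size, k=2):
    """Log2-probability of a sequence via the chain rule and the Markov
    assumption; add-one smoothing keeps unseen transitions from getting
    probability zero."""
    log_p = 0.0
    for i in range(k, len(tokens)):
        seen = counts[tuple(tokens[i - k:i])]   # counts observed for this history
        numer = seen[tokens[i]] + 1             # add-one smoothed count
        denom = sum(seen.values()) + vocab_size
        log_p += math.log2(numer / denom)
    return log_p

train = "sue swallowed the large green pill because sue felt ill".split()
test = "sue swallowed the large green tree".split()
vocab_size = len(set(train) | set(test))
counts = train_counts(train, k=2)
log_m = sequence_log_prob(test, counts, vocab_size, k=2)
print("log2 m(test) =", round(log_m, 3))
print("per-symbol cross entropy (bits) =", round(-log_m / (len(test) - 2), 3))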

4
Cross entropy
  • Markov chains (or any model m of a probability
    distribution p) compute the probability
    m(X1,X2,...,Xn) rather than the true probability
    p(X1,X2,...,Xn)
  • So what we are calculating is the cross entropy
    between X and m (written out below)
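
One standard way to write this quantity (a sketch of the usual definition; the exact notation is an assumption, not taken from the slide), for a source X with true distribution p and a model m:

\[
  H(X, m) \;=\; \lim_{n \to \infty} -\frac{1}{n}
  \sum_{x_1,\dots,x_n} p(x_1,\dots,x_n)\,\log m(x_1,\dots,x_n)
\]

On a single long test sequence this is approximated by the per-symbol quantity -(1/n) log m(x_1,...,x_n), which is exactly the chain-rule computation from the previous slide, normalized by the sequence length.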

5
How far is cross entropy from the true entropy?
  • Depends on how good the model is, i.e., how close
    m(X1,X2,...,Xn) is to p(X1,X2,...,Xn)
  • The difference H(X, m) - H(X) is derived below
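
Writing x for a generic outcome (or an entire sequence X1,...,Xn), the difference expands as follows; this is the standard derivation from the definitions of cross entropy and entropy:

\begin{align*}
  H(X, m) - H(X)
    &= -\sum_{x} p(x)\,\log m(x) \;+\; \sum_{x} p(x)\,\log p(x) \\
    &= \sum_{x} p(x)\,\log \frac{p(x)}{m(x)}
\end{align*}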

6
Relative Entropy
  • This last quantity is known as the relative
    entropy, also called the Kullback-Leibler distance
    or Kullback-Leibler divergence, between p and m
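
In the usual notation (an assumption; the slide only names the quantity), it is written with a double bar, and by Gibbs' inequality it is never negative, reaching zero exactly when the model matches the true distribution:

\[
  D(p \,\|\, m) \;=\; \sum_{x} p(x)\,\log \frac{p(x)}{m(x)}
  \;=\; H(X, m) - H(X) \;\ge\; 0
\]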

7
Reading
  • Section 9.1 on Markov chains
  • Sections 2.2.5-2.2.6 on relative entropy and
    cross entropy