Relational Hidden Markov Models
Transcript and Presenter's Notes

1
Relational (Hidden) Markov Models
  • Probabilistic Logic Learning Seminar
  • Organizer Prof. Dr. Luc De Raedt
  • Co-Organizer Dipl.-Inf. K. Kersting
  • Julio César Pastrana
  • 15-Jan-2004

2
Overview
  • Motivation: Why the need for new models?
  • RMMs (Relational Markov Models)
  • Related Work
  • Conclusions
  • Bibliography

3
Motivation: Why the need for new models?
  • HMMs handle sequences of unstructured symbols.
    [Example figure]
  • Problems are encountered when modeling domains
    that involve structural properties such as
    length.
  • The number of states rapidly explodes.

4
Why the need for new models?
  • Each state is trained independently.
  • Abundant training data at one state cannot
    improve prediction at another state.
  • Large state models require vast training data.
  • Problem: Web trace data is sparse.
  • A single visitor views close to 0% of any site.
  • New dynamic content is not in the training data.

5
Why the need for new models?
[Diagram: existing models placed along two axes,
Structure (propositional to relational) and Sequence
(static to sequential): Bayes Net, DBN, PRM/BLPs/SLPs,
MRDM/ILP, (Hidden) Markov Model.]
6
Overview
  • Motivation: Why the need for new models?
  • RMMs (Relational Markov Models)
  • Related Work
  • Conclusions
  • Bibliography

7
RMMs - Relational Markov Models
  • Definition:
  • RMMs are a generalization of Markov models where
    states can be of different types, with each type
    described by a different set of variables.
  • RMMs combine Markov models and predicate
    calculus.

8
RMM - Relational Markov Model
An RMM is a tuple (Q, π, A, R), where
  • Q is the set of states,
  • π is the initial probability vector,
  • A is the transition probability matrix, and
  • R is a set of relations.
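
Not on the slide: a minimal Python sketch of this tuple, reusing the
e-commerce pages from the next slide as states. The (relation, arguments)
encoding and the sparse dicts are illustrative assumptions, not the
paper's representation.

# Hypothetical encoding of an RMM (Q, pi, A, R); states are ground atoms
# represented as (relation, arguments) pairs.
Q = {
    ("MainPage", ()),
    ("ProductPage", ("iMac", "in_stock")),
    ("ProductPage", ("dell4100", "in_stock")),
}
pi = {("MainPage", ()): 1.0}  # initial probability vector, stored sparsely
A = {  # transition probability matrix as (source, destination) -> probability
    (("MainPage", ()), ("ProductPage", ("iMac", "in_stock"))): 0.5,
    (("MainPage", ()), ("ProductPage", ("dell4100", "in_stock"))): 0.5,
}
R = {"MainPage": (), "ProductPage": ("Products", "Status")}  # relations and argument domains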
9
E-commerce Site Markov Model
[State diagram of an e-commerce site with pages as
states: main.html, iMac_instock.html,
dell4100_instock.html, m505_backorder.html,
checkout.html.]
10
RMM - Domain Hierarchies
[Figure: domain hierarchy for Products.]
An instance of a relation with leaf values is a state,
e.g. ProductPage(iMac, in_stock).
11
RMM - Domain Hierarchies
An instance of a relation with non-leaf values is a
set of states, i.e. an abstraction, e.g.
ProductPage(AllComputers, in_stock).
12
RMM - Set of abstractions
  • Define the set of abstractions of a given state q =
    R(...) to be the set of states such that each
    argument to R in the abstraction is an ancestor of
    the corresponding argument in q.
  • Formally:

    A(q) = { R(d_1, ..., d_n) ∈ Q | d_i ∈ ancestors(δ_i) in D_i }

    where R is the relation, Q the set of all possible
    instantiations of R, d_i the possible arguments, D_i
    the domain tree for the type of the i-th argument,
    and δ_i the arguments to the predicate for the given
    state q.
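
A small Python sketch of this definition. Assumptions: domain trees are
encoded as child -> parent dicts, and a state is kept among its own
abstractions so the mixture estimate later can include the ground-level
estimate.

from itertools import product

def ancestors(node, parent):
    # The node itself plus all its ancestors in a domain tree,
    # where the tree is given as a child -> parent dict.
    chain = [node]
    while node in parent:
        node = parent[node]
        chain.append(node)
    return chain

def abstractions(rel, args, trees):
    # All states R(d_1, ..., d_n) where each d_i is args[i] or an ancestor of it.
    choices = [ancestors(a, t) for a, t in zip(args, trees)]
    return [(rel, combo) for combo in product(*choices)]

# Hypothetical domain trees for the Products and Status hierarchies:
products = {"iMac": "AllMacs", "AllMacs": "AllComputers"}
statuses = {"in_stock": "AnyStatus"}
for state in abstractions("ProductPage", ("iMac", "in_stock"), (products, statuses)):
    print(state)  # ('ProductPage', ('iMac', 'in_stock')), ..., ('ProductPage', ('AllComputers', 'AnyStatus'))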
13
RMM - Set of abstractions
14
RMM - Learning and Inference
  • For each possible state abstraction α, the initial
    probability is estimated from the data:

    π̂(α) = (# sequences starting in a state covered by α)
           / (# sequences)

  • For a pair (α, β), the transition probability is
    defined as:

    Â(α → β) = (# transitions from a state in α to a state in β)
              / (# transitions out of states in α)
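
A sketch (not from the slides) of the counting behind both estimates,
assuming abstr(q) returns the set of abstractions of q as defined on
slide 12:

from collections import Counter

def estimate(sequences, abstr):
    # Count, for every abstraction, how often it covers the start of a
    # sequence and how often transitions leave it / connect it to another.
    init, trans, outof = Counter(), Counter(), Counter()
    for seq in sequences:
        for a in abstr(seq[0]):
            init[a] += 1
        for q_s, q_d in zip(seq, seq[1:]):
            for a in abstr(q_s):
                outof[a] += 1
                for b in abstr(q_d):
                    trans[(a, b)] += 1
    pi_hat = {a: c / len(sequences) for a, c in init.items()}
    A_hat = {(a, b): c / outof[a] for (a, b), c in trans.items()}
    return pi_hat, A_hat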

15
RMM - Mixture model
  • What if we want to estimate P(s → d) but have no
    data?
  • We can do this with abstractions of s and d.
  • Let α be an abstraction of s and β an abstraction
    of d; then

    P(s → d) ≈ Â(α → β)
16
RMM - Mixture model
  • Estimating the transition probability between
    ground states q_s and q_d:

    P(q_s → q_d) = Σ_(α,β) λ_(α,β) · Â(α → β)

  • where each λ_(α,β) is a mixing coefficient in the
    range [0, 1], based on the abstractions α and β,
    such that the sum of all the λ's is 1.
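
A sketch of this mixture in Python, reusing A_hat from the earlier
counting sketch and a hypothetical weighting function lam:

def mixture_estimate(q_s, q_d, abstr, A_hat, lam):
    # Weighted average of the abstraction-level estimates for q_s -> q_d;
    # normalizing the weights makes the lambdas sum to 1, as required.
    pairs = [(a, b) for a in abstr(q_s) for b in abstr(q_d)]
    weights = {p: lam(*p) for p in pairs}
    total = sum(weights.values()) or 1.0
    return sum((w / total) * A_hat.get(p, 0.0) for p, w in weights.items())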
17
RMM - Choosing Lambda
  • A good choice will have two properties:
  • It gets higher as the abstraction depth increases.
  • It gets lower as the available training data
    decreases.
  • λ_(α,β) is therefore based on n, the number of
    transitions from α to β observed in the data; k, a
    design parameter that penalizes lack of data; and
    rank(α), which is computed from the depths of the
    arguments d_i of the relation defining α.
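
One plausible instantiation with the two stated properties, as a Python
sketch; the n / (n + k) factor and the depth-based rank are assumptions,
not the paper's exact definition:

def rank(args, depth):
    # Mean depth of the abstraction's arguments in their domain trees;
    # deeper (more specific) abstractions rank higher. `depth` is a
    # precomputed node -> depth map (an assumption of this sketch).
    return sum(depth[d] for d in args) / len(args) if args else 1.0

def make_lambda(trans_counts, k, depth):
    # Returns an unnormalized weighting function usable as `lam` above.
    def weight(alpha, beta):
        n = trans_counts.get((alpha, beta), 0)  # transitions alpha -> beta in the data
        return (n / (n + k)) * rank(alpha[1], depth) * rank(beta[1], depth)
    return weight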
18
RMM - Probability Estimation Tree
  • With deep hierarchies and many arguments,
    considering every possible abstraction may not be
    feasible.
  • Instead, learn a decision tree over the possible
    abstractions.
  • [Figure: the probability estimation tree for the
    page ProductPage(iMac, in_stock).]

19
RMM - Experimental Results
[Figure: experimental results.]
20
Overview
  • Motivation: Why the need for new models?
  • RMMs (Relational Markov Models)
  • Related Work
  • Conclusions
  • Bibliography

21
Related Work
  • LOHMMs: Logical Hidden Markov Models.
  • The key idea underlying LOHMMs is to employ
    logical atoms as structured symbols.

22
LOHMMs
  • Motivations for using LOHMMs:
  • Variables in the nodes allow us to make
    abstractions of specific symbols, e.g.
    emacs(X, Gandalf).

23
LOHMMs
  • We can have abstract transitions of the form
    p : H ← B, emitting the observation O,
  • where p ∈ [0, 1] and H, B, and O are atoms.

24
LOHMMs
  • Unification is also allowed in LOHMMs.
  • Unification lets us share information among
    hidden states and observations.
  • Note:
  • Unification is not allowed in RMMs.
  • In RMMs, hidden states are not defined.
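
To illustrate what unification buys, a generic unifier sketch in Python
(not the LOHMM machinery itself). Following the Prolog convention,
capitalized names such as X are treated as variables here, so the
constant is written in lowercase as gandalf:

def is_var(t):
    # Prolog convention: identifiers starting with an uppercase letter are variables.
    return isinstance(t, str) and t[:1].isupper()

def unify(x, y, subst=None):
    # Tiny unifier for atoms represented as (functor, (arg, ...)) tuples.
    s = dict(subst or {})

    def walk(t):
        while is_var(t) and t in s:
            t = s[t]
        return t

    def u(a, b):
        a, b = walk(a), walk(b)
        if a == b:
            return True
        if is_var(a):
            s[a] = b
            return True
        if is_var(b):
            s[b] = a
            return True
        if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
            return all(u(p, q) for p, q in zip(a, b))
        return False

    return s if u(x, y) else None

# Binding X against a ground observation shares information with later states:
print(unify(("emacs", ("X", "gandalf")), ("emacs", ("file1", "gandalf"))))
# -> {'X': 'file1'}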

25
Overview
  • Motivation: Why the need for new models?
  • RMMs (Relational Markov Models)
  • Related Work
  • Conclusions
  • Bibliography

26
Conclusions
  • Today we saw how to overcome some limitations of
    the models we reviewed in the seminar. These new
    models take advantage of the relational structure
    of the training data to improve prediction.

27
Conclusions
[Diagram: the two-axis picture from slide 5, Structure
(propositional to relational) and Sequence (static to
sequential), now extended with DPRM and RMMs/LOHMMs in
addition to Bayes Net, DBN, PRM/BLPs/SLPs, MRDM/ILP,
and the (Hidden) Markov Model.]
28
Bibliography
  • Relational Markov Models and their Application
    to Adaptive Web Navigation.
    Corin R. Anderson, Pedro Domingos, Daniel S. Weld.
  • Towards Discovering Structural Signatures of
    Protein Folds based on Logical Hidden Markov
    Models.
    Kristian Kersting, Tapani Raiko, Stefan Kramer,
    Luc De Raedt.
  • A Structural GEM for Learning Logical Hidden
    Markov Models.
    Kristian Kersting, Tapani Raiko, Luc De Raedt.