Title: Relational Hidden Markov Models
1 Relational (Hidden) Markov Models
- Probabilistic Logic Learning Seminar
- Organizer: Prof. Dr. Luc De Raedt
- Co-Organizer: Dipl.-Inf. K. Kersting
- Julio César Pastrana
- 15-Jan-2004
2 Overview
- Motivation: Why the need for new models?
- RMMs (Relational Markov Models)
- Related Work
- Conclusions
- Bibliography
3 Motivation: Why the need for new models?
- HMMs handle sequences of unstructured symbols.
- Example
- Problems are encountered when modeling domains that involve structural properties such as length.
- The number of states rapidly explodes.
4 Why the need for new models?
- Each state is trained independently.
- Abundant training data at one state cannot improve prediction at another state.
- Large state models → vast training data needed.
- Problem: Web trace data is sparse.
- A single visitor views ~0% of any site.
- New dynamic content is not in the training data.
5 Why the need for new models?
[Figure: landscape of models along two axes, Structure (up to Relational) and Sequence: Bayes Net; DBN; (Hidden) Markov Model; PRM, BLPs, SLPs; MRDM, ILP. The relational-sequential corner is still empty.]
6 Overview
- Motivation: Why the need for new models?
- RMMs (Relational Markov Models)
- Related Work
- Conclusions
- Bibliography
7 RMMs: Relational Markov Models
- Definition
- RMMs are a generalization of Markov models in which states can be of different types, with each type described by a different set of variables.
- RMMs combine Markov models and predicate calculus.
8 RMM: Relational Markov Model
An RMM is defined by:
- Q: set of states
- π: initial probability vector
- A: transition probability matrix
- R: a set of relations
9 E-commerce Site Markov Model
[Figure: Markov model over the site's pages: main.html, iMac_instock.html, dell4100_instock.html, m505_backorder.html, checkout.html.]
10 RMM - Domain Hierarchies
[Figure: domain tree for Products.]
- An instance of a relation with leaf values is a state, e.g. ProductPage(iMac, in_stock).
11 RMM - Domain Hierarchies
- An instance of a relation with non-leaf values is a set of states, i.e. an abstraction, e.g. ProductPage(AllComputers, in_stock).
12 RMM - Set of abstractions
- Define the set of abstractions of a given state q = R(δ_1, ..., δ_n) to be the set of states in which each argument to R is an ancestor of the corresponding argument in q (a code sketch follows below).
- Formally:
$\mathrm{abstractions}(q) = \{\, R(d_1, \ldots, d_n) \in \bar{Q} \mid d_i \in \mathrm{ancestors}_{D_i}(\delta_i) \,\}$
where R is the relation, $\bar{Q}$ the set of all possible instantiations of R, $d_i$ the possible arguments, $D_i$ the domain tree for the i-th argument type, and $\delta_i$ the arguments of the predicate for the given state q.
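A small Python sketch of this definition, assuming domain trees are encoded as child → parent maps; the tree contents are guessed from the e-commerce example, not taken from the paper:

```python
from itertools import product

# Domain trees encoded as child -> parent maps (illustrative contents).
products_tree = {"iMac": "AllComputers", "dell4100": "AllComputers",
                 "m505": "AllHandhelds",
                 "AllComputers": "AnyProduct", "AllHandhelds": "AnyProduct"}
stock_tree = {"in_stock": "AnyStatus", "backorder": "AnyStatus"}

def ancestors(d, tree):
    """The value d itself plus every ancestor up to the root of its tree."""
    out = [d]
    while d in tree:
        d = tree[d]
        out.append(d)
    return out

def abstractions(relation, args, trees):
    """All states R(d1, ..., dn) where each di is an ancestor of q's argument."""
    choices = [ancestors(d, t) for d, t in zip(args, trees)]
    return [f"{relation}({', '.join(c)})" for c in product(*choices)]

print(abstractions("ProductPage", ["iMac", "in_stock"],
                   [products_tree, stock_tree]))
# ProductPage(iMac, in_stock), ProductPage(iMac, AnyStatus),
# ProductPage(AllComputers, in_stock), ..., ProductPage(AnyProduct, AnyStatus)
```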
13 RMM - Set of abstractions
14 RMM - Learning and Inference
- For each possible state abstraction α, the initial probability is estimated from the training sequences as $\hat{\pi}(\alpha) = n_0(\alpha) / N$, where $n_0(\alpha)$ is the number of sequences starting in a state covered by α and N the total number of sequences.
- For a pair (α, β), the transition probability is estimated as $\hat{A}(\alpha, \beta) = n(\alpha \to \beta) / n(\alpha)$, the fraction of visits to α that are immediately followed by a state in β.
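A sketch of the counting step in Python, assuming the standard maximum-likelihood estimates above; `abstraction_sets` maps each ground state to its abstractions (including the state itself), as computed on slide 12:

```python
from collections import Counter

def estimate(sequences, abstraction_sets):
    """Maximum-likelihood estimates at every abstraction level."""
    pi, trans, visits = Counter(), Counter(), Counter()
    for seq in sequences:
        for a in abstraction_sets[seq[0]]:
            pi[a] += 1                       # sequence starts inside abstraction a
        for s, d in zip(seq, seq[1:]):
            for a in abstraction_sets[s]:
                visits[a] += 1               # n(a): visits to a with a successor
                for b in abstraction_sets[d]:
                    trans[(a, b)] += 1       # n(a -> b)
    N = len(sequences)
    pi_hat = {a: c / N for a, c in pi.items()}
    # Abstractions overlap, so rows of A_hat need not sum to 1 over all b.
    A_hat = {(a, b): c / visits[a] for (a, b), c in trans.items()}
    return pi_hat, A_hat
```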
15 RMM - Mixture model
- What if we want to estimate P(s → d) but have no data?
- We can do this with abstractions of d and s.
- Let α be an abstraction of s and β an abstraction of d.
16 RMM - Mixture model
- Estimating the transition probability between ground states $q_s$ and $q_d$:
$\hat{P}(q_d \mid q_s) = \sum_{(\alpha, \beta)} \lambda_{\alpha\beta} \, \hat{A}(\alpha, \beta)$
where each $\lambda_{\alpha\beta}$ is a mixing coefficient based on the abstraction pair (α, β), in the range [0, 1], such that the sum of all λs is 1.
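A sketch of the mixture estimate in Python, reusing `A_hat` from the learning sketch; renormalizing the λs over just the abstraction pairs that apply to (q_s, q_d) is one plausible reading of the slide, not necessarily the paper's exact scheme:

```python
def mixture_transition(qs, qd, abstraction_sets, A_hat, lambdas):
    """P(qd | qs) as a convex combination over abstraction pairs (alpha, beta)."""
    pairs = [(a, b) for a in abstraction_sets[qs] for b in abstraction_sets[qd]]
    # Renormalize the lambdas over the applicable pairs so they sum to 1.
    z = sum(lambdas.get(p, 0.0) for p in pairs) or 1.0
    return sum(lambdas.get(p, 0.0) / z * A_hat.get(p, 0.0) for p in pairs)
```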
17 RMM - Choosing Lambda
- A good choice will have two properties:
- It gets higher as the abstraction depth increases.
- It gets lower as the available training data decreases.
One form with both properties is $\lambda_{\alpha\beta} \propto \mathrm{rank}(\alpha) \cdot \frac{n}{n + k}$, where n is the number of transitions from α to β observed in the data, k is a design parameter that penalizes lack of data, and rank(α) grows with the depth of the arguments d to the relation defining α in their domain trees.
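A hedged sketch of one weighting with both properties; taking rank(α) to be the mean depth of α's arguments is my assumption, since the slide's exact formula for rank did not survive extraction:

```python
def rank(args, depth):
    """Mean depth of the abstraction's arguments in their domain trees
    (one plausible reading of rank)."""
    return sum(depth[d] for d in args) / len(args)

def mixing_weight(n, k, args, depth):
    """Unnormalized lambda for one abstraction pair: grows with the depth of
    the source abstraction, shrinks as data gets scarce (n small, k fixed)."""
    return rank(args, depth) * n / (n + k)
```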
18 RMM - Probability Estimation Tree
- With deep hierarchies and lots of arguments, considering every possible abstraction may not be feasible.
- Learn a decision tree over possible abstractions.
- [Figure, left: the probability estimation tree for the page Product_page(mac, in_stock).]
19 RMM - Experimental results
20 Overview
- Motivation: Why the need for new models?
- RMMs (Relational Markov Models)
- Related Work
- Conclusions
- Bibliography
21 Related Work
- LOHMMs: Logical Hidden Markov Models
- The key idea underlying LOHMMs is to employ logical atoms as structured symbols.
22 LOHMMs
- Motivations for using LOHMMs:
- Variables in the nodes allow us to make abstractions of specific symbols, e.g. emacs(X, Gandalf).
23 LOHMMs
- We can have abstract transitions of the form $p : \mathrm{H} \xleftarrow{\mathrm{O}} \mathrm{B}$,
- where p ∈ [0, 1] and H, B, and O are logical atoms, e.g. atoms over emacs as on the previous slide.
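A minimal sketch of how such transitions might be represented and used to sample a successor state; the tuple encoding, the example transitions, and their probabilities are my own illustration (matching here is by functor only; full unification is sketched after the next slide):

```python
import random

# An abstract transition, encoded as (p, H, B, O); atoms are (functor, *args)
# tuples. This encoding is an assumption; the papers write p : H <-O- B.
TRANSITIONS = [
    (0.7, ("emacs", "F2", "U"), ("emacs", "F", "U"), ("obs_emacs",)),
    (0.3, ("ls", "U"),          ("emacs", "F", "U"), ("obs_ls",)),
]

def successors(state_functor):
    """Transitions whose body B could match the current abstract state."""
    return [t for t in TRANSITIONS if t[2][0] == state_functor]

def step(state_functor):
    """Pick the next abstract state H according to the probabilities p."""
    cands = successors(state_functor)
    heads = [h for _, h, _, _ in cands]
    probs = [p for p, _, _, _ in cands]
    return random.choices(heads, weights=probs)[0]

print(step("emacs"))  # ('emacs', 'F2', 'U') w.p. 0.7, ('ls', 'U') w.p. 0.3
```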
24 LOHMMs
- Unification is also allowed in LOHMMs.
- Unification lets us share information among hidden states and observations (a small sketch follows below).
- Note:
- Unification is not allowed in RMMs.
- RMMs do not define hidden states.
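To illustrate what unification buys here, a minimal Python sketch of syntactic unification over atoms encoded as tuples, using the Prolog convention that uppercase names are variables; the emacs/gandalf atoms echo the slide's example, and the occurs check is omitted for brevity:

```python
def is_var(t):
    # Convention: variables start with an uppercase letter, constants do not.
    return isinstance(t, str) and t[0].isupper()

def unify(x, y, subst=None):
    """Syntactic unification; returns a substitution dict or None on failure."""
    subst = dict(subst or {})
    stack = [(x, y)]
    while stack:
        a, b = stack.pop()
        a, b = subst.get(a, a), subst.get(b, b)   # dereference bound variables
        if a == b:
            continue
        if is_var(a):
            subst[a] = b
        elif is_var(b):
            subst[b] = a
        elif isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
            stack.extend(zip(a, b))               # unify argument by argument
        else:
            return None                           # constant clash
    return subst

# Matching the abstract state emacs(X, gandalf) against a ground observation:
print(unify(("emacs", "X", "gandalf"), ("emacs", "lohmm.tex", "gandalf")))
# -> {'X': 'lohmm.tex'}
```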
25 Overview
- Motivation: Why the need for new models?
- RMMs (Relational Markov Models)
- Related Work
- Conclusions
- Bibliography
26 Conclusions
- Today we saw how we can overcome some limitations of some of the models we reviewed in the seminar. These new models take advantage of the relational structure of the training data to improve prediction.
27 Conclusions
[Figure: the model landscape of slide 5 revisited: Bayes Net; DBN; (Hidden) Markov Model; PRM, BLPs, SLPs; MRDM, ILP, with the relational-sequential corner now occupied by RMMs, LOHMMs, and DPRMs.]
28 Bibliography
- Corin R. Anderson, Pedro Domingos, Daniel S. Weld. Relational Markov Models and their Application to Adaptive Web Navigation.
- Kristian Kersting, Tapani Raiko, Stefan Kramer, Luc De Raedt. Towards Discovering Structural Signatures of Protein Folds based on Logical Hidden Markov Models.
- Kristian Kersting, Tapani Raiko, Luc De Raedt. A Structural GEM for Learning Logical Hidden Markov Models.