Title: Bayes Nets
1. Bayes Nets
2. Hidden Markov Model
- Inferring hidden variables (q_i) from observations (o_i)
- This is a general framework for representing and reasoning about uncertainty
- Representing uncertain information with random variables (nodes)
- Representing the relationships between pieces of information with conditional probability distributions (directed arcs)
- Inferring from the observations (shaded nodes) to the hidden variables (circled nodes)
3. An Example of a Bayes Network
- S: It is sunny
- L: Ali arrives slightly late
- O: Slides are put on the web late
4. Bayes Network Example
Absence of an arrow: the random variables S and O are independent; knowing S will not help predict O.
Two arrows into L: L depends on S and O; knowing S and O will help predict L.
5. Inference in a Bayes Network
- S = 1, O = 0: P(L) = ?
- S = 1: P(O) = ?, P(L) = ?
- L = 1: P(S) = ?, P(O) = ?
- L = 1, S = 1: P(O) = ?
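The slides keep the probability tables on the diagram, but as a rough sketch of how such queries can be answered by enumeration, here is a minimal example for the S → L ← O network; all CPT numbers below (P_S, P_O, P_L) are made up for illustration and are not the lecture's values.

```python
# Sketch: answering the queries above by enumeration on S -> L <- O.
# All CPT numbers are made up for illustration.
P_S = {1: 0.3, 0: 0.7}                       # P(S)
P_O = {1: 0.2, 0: 0.8}                       # P(O)
P_L = {(1, 1): 0.9, (1, 0): 0.6,             # P(L = 1 | S, O)
       (0, 1): 0.7, (0, 0): 0.1}

def joint(s, o, l):
    """P(S=s, O=o, L=l) via the factorization P(S) P(O) P(L | S, O)."""
    p_l = P_L[(s, o)] if l == 1 else 1.0 - P_L[(s, o)]
    return P_S[s] * P_O[o] * p_l

# P(L = 1 | S = 1, O = 0): read directly from the CPT for L.
print(P_L[(1, 0)])

# P(S = 1 | L = 1): Bayes' rule, summing out O.
num = sum(joint(1, o, 1) for o in (0, 1))
den = sum(joint(s, o, 1) for s in (0, 1) for o in (0, 1))
print(num / den)
```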
6. Conditional Independence
- Formal definition
- A and B are conditionally independent given C iff P(A, B | C) = P(A | C) P(B | C)
- Different from (unconditional) independence
- Example
- A: shoe size
- B: glove size
- C: height
- Shoe size is not independent of glove size, but it is conditionally independent of glove size given height
(Diagram: C is a parent of both A and B.)
7. Distinguish Two Cases
- A: shoe size
- B: glove size
- C: height
(Diagram: A ← C → B, with C a common cause of A and B.)
Given C, A and B are independent. Without C, A and B can be dependent.
- S: It is sunny
- L: Ali arrives slightly late
- O: Slides are put on the web late
Without L, S and O are independent. Given L, S and O can be dependent.
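In symbols, using the definition from the previous slide, the two cases can be summarized as follows (a sketch, not taken from the slides):

```latex
\text{Common cause } (A \leftarrow C \rightarrow B):\quad
  P(A, B \mid C) = P(A \mid C)\,P(B \mid C),\qquad
  P(A, B) \neq P(A)\,P(B)\ \text{in general}
\\[4pt]
\text{Common effect } (S \rightarrow L \leftarrow O):\quad
  P(S, O) = P(S)\,P(O),\qquad
  P(S, O \mid L) \neq P(S \mid L)\,P(O \mid L)\ \text{in general}
```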
8. Another Example for Bayes Nets
- Inference questions
- W = 1: P(R) = ?
- W = 1: P(C) = ?
- W = 1, C = 1: P(S) = ?, P(C) = ?, P(S, R) = ?
9. Bayes Nets Formalized
- A Bayes net (also called a belief network) is an augmented directed acyclic graph, represented by the pair (V, E), where
- V is a set of vertices.
- E is a set of directed edges joining vertices. No loops of any length are allowed.
- Each vertex in V contains the following information:
- The name of a random variable
- A probability distribution table indicating how the probability of this variable's values depends on all possible combinations of parental values.
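A minimal way to hold this per-vertex information in code (an illustrative sketch, not from the lecture; the class and field names are made up):

```python
from dataclasses import dataclass

@dataclass
class Node:
    """One vertex of a Bayes net: a named binary random variable,
    its parents, and a CPT mapping each combination of parent
    values to P(variable = 1)."""
    name: str
    parents: list[str]   # parent variable names, in a fixed order
    cpt: dict            # tuple of parent values -> P(name = 1)

# Example: the L node from the earlier slides, with made-up numbers.
L = Node(name="L", parents=["S", "O"],
         cpt={(1, 1): 0.9, (1, 0): 0.6, (0, 1): 0.7, (0, 0): 0.1})
```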
10. Building a Bayes Net
- Choose a set of relevant variables.
- Choose an ordering for them.
- Assume they're called X1 .. Xm (where X1 is the first in the ordering, X2 is the second, etc.).
- For i = 1 to m:
- Add the Xi node to the network.
- Set Parents(Xi) to be a minimal subset of {X1 .. Xi-1} such that Xi is conditionally independent of all other members of {X1 .. Xi-1} given Parents(Xi).
- Define the probability table P(Xi = k | assignments of Parents(Xi)).
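For the earlier sunny/late example with ordering S, O, L, this procedure gives Parents(S) = {}, Parents(O) = {} (since O is independent of S), and Parents(L) = {S, O}, so the joint distribution factors as:

```latex
P(S, O, L) \;=\; P(S)\,P(O)\,P(L \mid S, O)
```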
11. Example of Building Bayes Nets
- Suppose we're building a nuclear power station.
- There are the following random variables:
- GRL: Gauge Reads Low.
- CTL: Core Temperature is Low.
- FG: Gauge is Faulty.
- FA: Alarm is Faulty.
- AS: Alarm Sounds.
- If the alarm is working properly, the alarm is meant to sound if the gauge stops reading a low temperature.
- If the gauge is working properly, the gauge is meant to read the temperature of the core.
12Bayes Net for Power Station
GRL Gauge Reads Low. CTL Core temperature is
low. FG Gauge is faulty. FA Alarm is
faulty AS Alarm sounds
13. Inference with Bayes Nets
- Key issue: computing the joint probability P(X1 = x1, X2 = x2, ..., Xn-1 = xn-1, Xn = xn)
- Using the conditional independence relations to simplify the computation
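With the conditional independences encoded by the graph, the joint probability decomposes into a product of the per-node tables (the standard Bayes-net factorization):

```latex
P(X_1 = x_1, \ldots, X_n = x_n) \;=\; \prod_{i=1}^{n} P\bigl(X_i = x_i \,\big|\, \mathrm{Parents}(X_i)\bigr)
```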
14. Example for Inference
- Inference questions
- W = 1: P(R) = ?
- W = 1: P(C) = ?
- W = 1, C = 1: P(S) = ?, P(C) = ?, P(S, R) = ?
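The network structure and tables live on the slide's diagram; as a sketch only, the query P(R = 1 | W = 1) can be answered by enumeration, assuming the usual cloudy/sprinkler/rain/wet-grass structure (C → S, C → R, S → W ← R) and made-up CPT values:

```python
# Sketch of P(R = 1 | W = 1) by enumeration; structure and numbers
# are assumptions for illustration, not the lecture's diagram.
from itertools import product

P_C = 0.5
P_S = {1: 0.1, 0: 0.5}                     # P(S = 1 | C)
P_R = {1: 0.8, 0: 0.2}                     # P(R = 1 | C)
P_W = {(1, 1): 0.99, (1, 0): 0.9,          # P(W = 1 | S, R)
       (0, 1): 0.9,  (0, 0): 0.0}

def joint(c, s, r, w):
    """P(C, S, R, W) via the factorization over the graph."""
    p = P_C if c else 1 - P_C
    p *= P_S[c] if s else 1 - P_S[c]
    p *= P_R[c] if r else 1 - P_R[c]
    p *= P_W[(s, r)] if w else 1 - P_W[(s, r)]
    return p

# P(R = 1 | W = 1): sum the joint over the hidden variables C and S.
num = sum(joint(c, s, 1, 1) for c, s in product((0, 1), repeat=2))
den = sum(joint(c, s, r, 1) for c, s, r in product((0, 1), repeat=3))
print(num / den)
```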
15. Problem with Inference using Bayes Nets
- Inference
- Infer from the observed variables E_o to the unknown variables E_u
Suppose you have m binary-valued variables in your Bayes net and the expression E_o mentions k variables. How much work is the above computation?
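A rough accounting (a sketch, assuming brute-force enumeration): answering the query requires summing the joint over every assignment of the m − k variables not mentioned in E_o:

```latex
P(E_o) \;=\; \sum_{\text{assignments of the other } m-k \text{ variables}} P(\text{joint assignment}),
\qquad \text{i.e. on the order of } 2^{\,m-k} \text{ joint-probability evaluations.}
```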
16. Problem with Inference using Bayes Nets
- General querying of Bayes nets is NP-complete.
- Some solutions
- Belief propagation
- Take advantage of the structure of Bayes nets
- Stochastic simulation
- Similar to the sampling approaches for Bayesian averaging
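As a flavor of the stochastic-simulation idea, here is a rejection-sampling sketch on the same made-up sprinkler-style network as above; this is an illustration, not the specific algorithm covered in the lecture.

```python
import random

# Made-up CPTs for the assumed structure C -> S, C -> R, S -> W <- R.
P_C = 0.5
P_S = {1: 0.1, 0: 0.5}
P_R = {1: 0.8, 0: 0.2}
P_W = {(1, 1): 0.99, (1, 0): 0.9, (0, 1): 0.9, (0, 0): 0.0}

def sample_once():
    """Ancestral sampling: draw each node given its sampled parents."""
    c = int(random.random() < P_C)
    s = int(random.random() < P_S[c])
    r = int(random.random() < P_R[c])
    w = int(random.random() < P_W[(s, r)])
    return c, s, r, w

# Rejection sampling for P(R = 1 | W = 1): keep only samples with W = 1.
kept = rainy = 0
for _ in range(100_000):
    c, s, r, w = sample_once()
    if w:
        kept += 1
        rainy += r
print(rainy / kept)
```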
17. More Interesting Questions
- Learning Bayes nets
- Given the topological structure of a Bayes net, learn all the conditional probability tables from examples
- Example: hierarchical mixture model
- Learning the topological structure of a Bayes net
- A very, very hard question
- Unfortunately, the lecturer does not have enough knowledge to teach it, even if he wants to!
18. Learning Cond. Probabilities in Bayes Nets
- Three types of training examples:
- (C, S, R, W)
- (C, R, W)
- (S, C, W)
- Maximum likelihood approach for estimating the conditional probabilities
- EM algorithm for optimization
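For the fully observed case (the first type of example), maximum likelihood reduces to counting; here is a minimal sketch with made-up data and the earlier C, S, R, W variables. For the other two types, where a variable is missing, EM alternates between filling in expected counts and re-estimating the tables.

```python
from collections import Counter

# Toy fully observed training examples (C, S, R, W); values are made up.
data = [(1, 0, 1, 1), (1, 0, 1, 1), (0, 1, 0, 1),
        (0, 0, 0, 0), (1, 0, 0, 0), (0, 1, 0, 1)]

# Maximum-likelihood estimate of P(W = 1 | S, R) by counting.
counts = Counter()      # (s, r) -> number of examples with that parent setting
ones = Counter()        # (s, r) -> number of those examples with W = 1
for c, s, r, w in data:
    counts[(s, r)] += 1
    ones[(s, r)] += w

cpt_W = {sr: ones[sr] / counts[sr] for sr in counts}
print(cpt_W)
```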