Bayesian Networks
Transcript of a PowerPoint presentation (13 slides; provided by ics5, via https://ics.uci.edu)
1
Bayesian Networks
  • What is the likelihood of X given evidence E?
    i.e. P(X|E)?

2
Issues
  • Representational Power
  • allows for unknown, uncertain information
  • Inference
  • Question: what is the probability of X, given that E is true?
  • Processing: in general, exponential
  • Acquisition or Learning
  • network: from human input
  • probabilities: from data (learning)

3
Bayesian Network
  • Directed Acyclic Graph
  • Nodes are RVs
  • Edges denote dependencies
  • Root nodes (nodes without predecessors)
  • prior probability table
  • Non-root nodes
  • conditional probability tables given all predecessors

4
Pearl Alarm Example
  • B → A, E → A, A → JC, A → MC
  • P(B) = .001, P(¬B) = .999
  • P(E) = .002, P(¬E) = .998, etc.
  • P(A|B,E) = .95, P(A|B,¬E) = .94
  • P(A|¬B,E) = .29, P(A|¬B,¬E) = .001
  • P(JC|A) = .90, P(JC|¬A) = .05
  • P(MC|A) = .70, P(MC|¬A) = .01

5
Joint Probability yields all
  • Event: fully specified values for all RVs.
  • Prob of event: P(x1, x2, ..., xn) = P(x1 | Parents(X1)) · ... · P(xn | Parents(Xn))
  • E.g. P(j, m, a, ¬b, ¬e)
  •  = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
  •  = .9 × .7 × .001 × .999 × .998 ≈ .00063
  • Do this for all events, then sum as needed.
  • Yields the exact probability.
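The factorization above can be sketched directly in Python, using the CPT numbers from the Pearl alarm slide (the function and variable names are illustrative, not from the original presentation):

```python
# Joint probability for the Pearl alarm network via the factorization
# P(x1..xn) = prod_i P(xi | Parents(Xi)). CPT values are from the slides.

P_B = 0.001                                          # P(B)
P_E = 0.002                                          # P(E)
P_A = {(True, True): 0.95, (True, False): 0.94,      # P(A | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_JC = {True: 0.90, False: 0.05}                     # P(JC | A)
P_MC = {True: 0.70, False: 0.01}                     # P(MC | A)

def joint(b, e, a, jc, mc):
    """P(b, e, a, jc, mc) as a product of each node given its parents."""
    p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_JC[a] if jc else 1 - P_JC[a]
    p *= P_MC[a] if mc else 1 - P_MC[a]
    return p

# The slide's event P(j, m, a, ¬b, ¬e):
print(joint(False, False, True, True, True))  # ≈ 0.000628
```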

6
Summing out
  • P(b) from the full joint: sum over all events with b true (marginalization)
  • Silly: the answer must be .001 (the prior).
  • P(MC|JC): MC and JC are linked through the alarm; the sum over the joint still works, but the waste is just not so apparent.
  • Need to answer: which RVs are independent of which others, depending on the evidence?
  • We're skipping Markov Blankets.
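Summing out can be checked on the small alarm network: enumerate all 32 events of the full joint and add up the relevant ones. A sketch, repeating the slide's CPTs (names are illustrative):

```python
from itertools import product

# Marginalization over the full joint of the alarm network.
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_JC = {True: 0.90, False: 0.05}
P_MC = {True: 0.70, False: 0.01}

def joint(b, e, a, jc, mc):
    p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_JC[a] if jc else 1 - P_JC[a]
    p *= P_MC[a] if mc else 1 - P_MC[a]
    return p

TF = (True, False)
# P(b): sum over all events with b true -- recovers the prior .001
p_b = sum(joint(True, e, a, jc, mc) for e, a, jc, mc in product(TF, repeat=4))

# P(MC | JC): sum the joint two ways and divide
p_jc_mc = sum(joint(b, e, a, True, True) for b, e, a in product(TF, repeat=3))
p_jc = sum(joint(b, e, a, True, mc) for b, e, a, mc in product(TF, repeat=4))
print(p_b, p_jc_mc / p_jc)   # ≈ 0.001 and ≈ 0.040
```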

7
Probability Calculation Cost
  • For example, a joint P over 5 boolean variables requires 2^5 = 32 entries; in general, 2^n.
  • For a Bayes net, we only need tables for all conditional probabilities and priors.
  • If each node has at most k inputs and there are n RVs, then we need at most n·2^k table entries.
  • Computation can be reduced further, but that is difficult.
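The counting argument is easy to check numerically; a sketch for the alarm network, assuming one stored entry per CPT row as on the slides:

```python
n, k = 5, 2                  # 5 boolean RVs, at most 2 parents per node
full_joint = 2 ** n          # 32 entries for the full joint table
bn_bound = n * 2 ** k        # <= 20 entries for the Bayes net tables
# Exact tally for the alarm network: B and E priors (1 entry each),
# A given B,E (4 rows), JC and MC given A (2 rows each):
alarm_entries = 1 + 1 + 4 + 2 + 2
print(full_joint, bn_bound, alarm_entries)   # 32 20 10
```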

8
Approximate Inference
  • Simple Sampling
  • Use the Bayes network as a generative model
  • E.g. generate 10,000 examples, sampling nodes in topological order.
  • This generates examples with the appropriate distribution.
  • Now use the examples to estimate probabilities.
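A minimal sketch of simple (prior) sampling for the alarm network, drawing each node in topological order from the slide's CPTs (the names, seed, and sample count are illustrative):

```python
import random

P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}

def sample_event(rng):
    """Sample (b, e, a, jc, mc) in topological order: roots first."""
    b = rng.random() < 0.001                    # P(B)
    e = rng.random() < 0.002                    # P(E)
    a = rng.random() < P_A[(b, e)]              # P(A | B, E)
    jc = rng.random() < (0.90 if a else 0.05)   # P(JC | A)
    mc = rng.random() < (0.70 if a else 0.01)   # P(MC | A)
    return b, e, a, jc, mc

rng = random.Random(0)
samples = [sample_event(rng) for _ in range(10_000)]
# Counting now estimates any marginal, e.g. P(JC) (exact value ≈ .052):
p_jc_hat = sum(s[3] for s in samples) / len(samples)
print(p_jc_hat)
```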

9
Sampling → probabilities
  • Generate examples with proper probability
    density.
  • Use the topological ordering of the nodes to construct events.
  • Finally, use counting to yield an estimate of the exact probability.

10
Estimate P(JC|MC)
  1. Do the following a large number of times:
  2. Use the prior tables to sample the root nodes
  3. Say b false and e false
  4. Use the conditional tables to sample the internal nodes' values (in topological order)
  5. Say a false
  6. Say JC false and MC true
  7. Count the number of appropriate events
  8. Note many samples are irrelevant: here an event only counts if MC is true. Markov Chain Monte Carlo will construct only appropriate events.
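The steps above amount to rejection sampling; a sketch, with CPTs from the slides (the seed and sample count are illustrative):

```python
import random

P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}

def sample_event(rng):
    """One prior sample of the alarm network, in topological order."""
    b = rng.random() < 0.001
    e = rng.random() < 0.002
    a = rng.random() < P_A[(b, e)]
    jc = rng.random() < (0.90 if a else 0.05)
    mc = rng.random() < (0.70 if a else 0.01)
    return jc, mc

rng = random.Random(1)
n_mc = n_jc_and_mc = 0
for _ in range(200_000):
    jc, mc = sample_event(rng)
    if mc:                       # reject every sample where MC is false
        n_mc += 1
        n_jc_and_mc += jc
p_hat = n_jc_and_mc / n_mc       # estimates P(JC | MC); exact ≈ .178
print(p_hat)
```

Most samples are rejected (MC is true only about 1% of the time), which is exactly the waste that Markov Chain Monte Carlo avoids.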

11
Confidence of Estimate
  • Given n examples, of which k are heads.
  • How many examples are needed to be 99% certain that k/n is within .01 of the true p?
  • From statistics: mean = np, variance = npq
  • For confidence of .99, t = 3.25 (from a table)
  • 3.25·sqrt(pq/N) < .01 ⇒ N > 3.25²·pq/.01²; with worst-case pq = 1/4, N > ~26,400.
  • Correct probabilities are not needed, just the correct ordering.
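The bound can be solved for N directly; a sketch of the arithmetic, assuming the worst case pq = 1/4:

```python
from math import ceil, sqrt

t = 3.25        # from the slide's table, for .99 confidence
eps = 0.01      # required accuracy of k/n
pq = 0.25       # worst case: p = q = 0.5

# Smallest N with t * sqrt(pq / N) < eps:
N = ceil(t**2 * pq / eps**2)
print(N)        # 26407
assert t * sqrt(pq / N) < eps
```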

12
Applications
  • Bayesian Networks have been extended to decision theory.
  • decision nodes have actions attached
  • value nodes indicate expected utility
  • Pathfinder (Heckerman): medical diagnosis
  • adds utility theory (decision theory)
  • some actions are specific tests
  • 60 diseases, 130 features
  • Research Arena