Probabilistic Reasoning and Bayesian Networks

1
Probabilistic Reasoning and Bayesian Networks
  • Lecture Prepared For
  • COMP 790-058
  • Yue-Ling Wong

2
Probabilistic Robotics
  • A relatively new approach to robotics
  • Deals with uncertainty in robot perception and
    action
  • The key idea is to represent uncertainty
    explicitly using the calculus of probability
    theory
  • i.e. represent information by probability
    distributions over a whole space of guesses,
    instead of relying on a single "best guess"

3
3 Parts of this Lecture
  • Part 1. Acting Under Uncertainty
  • Part 2. Bayesian Networks
  • Part 3. Probabilistic Reasoning in Robotics

4
Reference for Part 3
  • Sebastian Thrun, et al. (2005) Probabilistic
    Robotics
  • The book covers major techniques and algorithms
    in localization, mapping, planning, and control
  • All algorithms in the book are based on a single
    overarching mathematical foundation
  • Bayes' rule
  • its temporal extension, known as Bayes filters

5
Goals of this lecture
  • To introduce this overarching mathematical
    foundation: Bayes' rule and its temporal
    extension, known as Bayes filters
  • To show how Bayes' rule and Bayes filters are used
    in robotics

6
Preliminaries
  • Part 1
  • Probability theory
  • Bayes rule
  • Part 2
  • Bayesian Networks
  • Dynamic Bayesian Networks

7
Outline of this lecture
  • Part 1. Acting Under Uncertainty (October 20)
  • To go over the fundamentals of probability theory
    that are necessary to understand the material on
    Bayesian reasoning
  • Start with the AI perspective, without adding the
    temporal aspect of robotics
  • Part 2. Bayesian Networks (October 22)
  • DAG representation of random variables
  • Dynamic Bayesian Networks (DBN) to handle
    uncertainty and changes over time
  • Part 3. Probabilistic Reasoning in Robotics
    (October 22)
  • To give you a general idea of how DBNs are used in
    robotics to handle the changes of sensor and
    control data over time in making inferences
  • Demonstrate the use of Bayes' rule and the Bayes
    filter in a simple example of a mobile robot
    monitoring the status (open or closed) of a door

8
Historical Background and Applications of
Bayesian Probabilistic Reasoning
  • Bayesian probabilistic reasoning has been used in
    AI since the 1960s, especially in medical diagnosis
  • One system outperformed human experts in the
    diagnosis of acute abdominal illness (de Dombal
    et al., British Medical Journal, 1974)

9
Historical Background and Applications of
Bayesian Probabilistic Reasoning
  • Directed Acyclic Graph (DAG) representation for
    Bayesian reasoning started in the 1980s
  • Example systems using Bayesian networks
    (1980s-1990s)
  • MUNIN system: diagnosis of neuromuscular
    disorders
  • PATHFINDER system: pathology

10
Historical Background and Applications of
Bayesian Probabilistic Reasoning
  • NASA AutoClass for data analysis
    (http://ti.arc.nasa.gov/project/autoclass/autoclass-c/):
    finds the set of classes that is maximally probable
    with respect to the data and model
  • Bayesian techniques are utilized to calculate the
    probability of a call being fraudulent at AT&T

11
Historical Background and Applications of
Bayesian Probabilistic Reasoning
  • By far the most widely used Bayesian network
    systems:
  • The diagnosis-and-repair modules (e.g. Printer
    Wizard) in Microsoft Windows (Breese and
    Heckerman (1996). Decision-theoretic
    troubleshooting: A framework for repair and
    experiment. In Uncertainty in Artificial
    Intelligence: Proceedings of the Twelfth
    Conference, pp. 124-132)
  • Office Assistant in Microsoft Office (Horvitz,
    Breese, Heckerman, and Hovel (1998). The Lumiere
    project: Bayesian user modeling for inferring the
    goals and needs of software users. In Uncertainty
    in Artificial Intelligence: Proceedings of the
    Fourteenth Conference, pp. 256-265.
    http://research.microsoft.com/horvitz/lumiere.htm)
  • Bayesian inference for e-mail spam filtering

12
Historical Background and Applications of
Bayesian Probabilistic Reasoning
  • An important application of temporal probability
    models: speech recognition

13
References and Sources of Figures
  • Part 1: Stuart Russell and Peter Norvig,
    Artificial Intelligence: A Modern Approach, 2nd
    ed., Prentice Hall, Chapter 13
  • Part 2: Stuart Russell and Peter Norvig,
    Artificial Intelligence: A Modern Approach, 2nd
    ed., Prentice Hall, Chapters 14 and 15
  • Part 3: Sebastian Thrun, Wolfram Burgard, and
    Dieter Fox, Probabilistic Robotics, Chapter 2

14
Part 1 of 3: Acting Under Uncertainty
15
Uncertainty Arises
  • The agent's sensors give only partial, local
    information about the world
  • Noise in sensor data
  • Uncertainty in manipulators
  • Dynamic aspects of situations (e.g. changes over
    time)

16
Degree of Belief
  • An agent's knowledge can at best provide only a
    degree of belief in the relevant sentences.
  • One of the main tools to deal with degrees of
    belief will be probability theory.

17
Probability Theory
  • Assigns to each sentence a numerical degree of
    belief between 0 and 1.

18
In Probability Theory
  • You may assign 0.8 to the sentence: "The
    patient has a cavity."
  • This means you believe: "The probability that the
    patient has a cavity is 0.8."
  • It depends on the percepts that the agent has
    received to date.
  • The percepts constitute the evidence on which
    probability assessments are based.

19
Versus In Logic
  • You assign true or false to the same
    sentence. True or false depends on the
    interpretation and the world.

20
Terminology
  • Prior or unconditional probability
  • The probability before the evidence is obtained.
  • Posterior or conditional probability
  • The probability after the evidence is obtained.

21
Example
  • Suppose the agent has drawn a card from a
    shuffled deck of cards.
  • Before looking at the card, the agent might
    assign a probability of 1/52 to its being the ace
    of spades.
  • After looking at the card, the agent has obtained
    new evidence. The probability for the same
    proposition (the card being the ace of spades)
    would be 0 or 1.

22
Terminology and Basic Probability Notation
23
Terminology and Basic Probability Notation
  • Proposition: Asserts that such-and-such is the
    case.

24
Terminology and Basic Probability Notation
  • Random variable: Refers to a "part" of the world
    whose "status" is initially unknown.
    Example: Cavity might refer to whether the
    patient's lower left wisdom tooth has a cavity.
    Convention used here: Capitalize the names of
    random variables.

25
Terminology and Basic Probability Notation
  • Domain of a random variable: The collection of
    values that a random variable can take on.
    Example: The domain of Cavity might be
    ⟨true, false⟩. The domain of Weather might be
    ⟨sunny, rainy, cloudy, snow⟩.

26
Terminology and Basic Probability Notation
  • Abbreviations used here
  • cavity to represent Cavity = true
  • ¬cavity to represent Cavity = false
  • snow to represent Weather = snow
  • cavity ∧ ¬toothache to represent Cavity = true ∧
    Toothache = false

27
Terminology and Basic Probability Notation
  • cavity ∧ ¬toothache
  • or
  • Cavity = true ∧ Toothache = false
  • is a proposition that may be assigned a
    degree of belief

28
Terminology and Basic Probability Notation
  • Prior or unconditional probability: The degree of
    belief associated with a proposition in the
    absence of any other information.
    Example: p(Cavity = true) = 0.1 or p(cavity) = 0.1

29
Terminology and Basic Probability Notation
  • p(Weather = sunny) = 0.7
  • p(Weather = rain) = 0.2
  • p(Weather = cloudy) = 0.08
  • p(Weather = snow) = 0.02
  • or we may simply write
  • P(Weather) = ⟨0.7, 0.2, 0.08, 0.02⟩

30
Terminology and Basic Probability Notation
  • Prior probability distribution: A vector of values
    for the probabilities of each individual state of
    a random variable.
    Example: This denotes a prior probability
    distribution for the random variable Weather:
    P(Weather) = ⟨0.7, 0.2, 0.08, 0.02⟩
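As a minimal illustrative sketch (not part of the original slides), a prior
distribution like the one above can be stored as a simple mapping from values
to probabilities:

```python
# Prior distribution P(Weather) from the slide, as a value -> probability mapping.
P_weather = {"sunny": 0.7, "rain": 0.2, "cloudy": 0.08, "snow": 0.02}

# A proper distribution must sum to 1 over the whole domain.
assert abs(sum(P_weather.values()) - 1.0) < 1e-9

print(P_weather["cloudy"])  # p(Weather = cloudy) = 0.08
```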

31
Terminology and Basic Probability Notation
  • Joint probability distribution: The probabilities
    of all combinations of the values of a set of
    random variables.
  • P(Weather, Cavity)
    denotes the probabilities of all combinations of
    the values of the set of random variables Weather
    and Cavity.

32
Terminology and Basic Probability Notation
  • P(Weather, Cavity)
  • can be represented by a 4×2 table of
    probabilities.

33
Terminology and Basic Probability Notation
  • Full joint probability distribution: The
    probabilities of all combinations of the values
    of the complete set of random variables.

34
Terminology and Basic Probability Notation
  • Example: Suppose the world consists of just the
    variables Cavity, Toothache, and Weather.
  • P(Cavity, Toothache, Weather)
    denotes the full joint probability distribution,
    which can be represented as a 2×2×4 table with 16
    entries.

35
Terminology and Basic Probability Notation
  • Posterior or conditional probability: Notation
    p(a | b). Read as "The probability of proposition
    a, given that all we know is proposition b."

36
Terminology and Basic Probability Notation
  • Example: p(cavity | toothache) = 0.8. Read as: "If
    a patient is observed to have a toothache and no
    other information is yet available, then the
    probability of the patient's having a cavity will
    be 0.8."

37
Terminology and Basic Probability Notation
  • Equation:
    p(a | b) = p(a ∧ b) / p(b)
  • where p(b) > 0

38
Terminology and Basic Probability Notation
  • Product rule:
    p(a ∧ b) = p(a | b) p(b)
  • which is rewritten from the previous equation

39
Terminology and Basic Probability Notation
  • Product rule:
    p(a ∧ b) = p(b | a) p(a)
  • can also be written the other way around
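A minimal numerical sketch of the definition of conditional probability and
both forms of the product rule (the joint values below are invented purely
for illustration):

```python
# Toy joint distribution over two Boolean propositions a and b (illustrative numbers).
joint = {
    (True, True): 0.08,
    (True, False): 0.02,
    (False, True): 0.32,
    (False, False): 0.58,
}

def p_a(a):
    # Marginal p(a): sum over both values of b.
    return sum(pr for (av, bv), pr in joint.items() if av == a)

def p_b(b):
    # Marginal p(b): sum over both values of a.
    return sum(pr for (av, bv), pr in joint.items() if bv == b)

def p_a_given_b(a, b):
    # Definition: p(a | b) = p(a ∧ b) / p(b), for p(b) > 0.
    return joint[(a, b)] / p_b(b)

def p_b_given_a(b, a):
    # Definition: p(b | a) = p(a ∧ b) / p(a), for p(a) > 0.
    return joint[(a, b)] / p_a(a)

# Product rule, both ways around: p(a ∧ b) = p(a | b) p(b) = p(b | a) p(a).
assert abs(joint[(True, True)] - p_a_given_b(True, True) * p_b(True)) < 1e-12
assert abs(joint[(True, True)] - p_b_given_a(True, True) * p_a(True)) < 1e-12
```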

40
Intuition
Terminology and Basic Probability Notation
[Figure: diagram of the events cavity, toothache, and their conjunction
cavity ∧ toothache]
42
Derivation of Bayes' Rule
From the two forms of the product rule, p(a ∧ b) = p(a | b) p(b) = p(b | a) p(a).
Dividing both sides by p(a) gives p(b | a) = p(a | b) p(b) / p(a).
43
Terminology and Basic Probability Notation
  • Bayes' rule, Bayes' law, or Bayes' theorem:
    p(b | a) = p(a | b) p(b) / p(a)

44
Bayesian Spam Filtering
  • Given that an email contains certain words, the
    probability that the email is spam is equal to
    the probability of finding those words in
    spam email, times the probability that any email
    is spam, divided by the probability of finding
    those words in any email
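A hedged sketch of this calculation; the numbers and the single aggregated
"certain words" event are invented for illustration (a real filter would
combine many word features, e.g. with naive Bayes and smoothing):

```python
# Made-up numbers for illustration only.
p_spam = 0.4                    # p(spam): prior probability that any email is spam
p_words_given_spam = 0.05       # p(words | spam): chance of seeing these words in spam
p_words_given_ham = 0.001       # p(words | ¬spam): chance of seeing them in legitimate email

# p(words): probability of finding those words in any email (total probability).
p_words = p_words_given_spam * p_spam + p_words_given_ham * (1 - p_spam)

# Bayes' rule: p(spam | words) = p(words | spam) * p(spam) / p(words)
p_spam_given_words = p_words_given_spam * p_spam / p_words
print(round(p_spam_given_words, 3))  # about 0.971 with these made-up numbers
```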

45
Speech Recognition
  • Given the acoustic signal, the probability that
    the signal corresponds to particular words is
    equal to the probability of getting that signal
    from those words, times the probability of finding
    those words in any speech, times a normalization
    coefficient

46
Terminology and Basic Probability Notation
  • Conditional distribution: Notation P(X | Y). It
    gives the values of p(X = xi | Y = yj) for each
    possible i, j.

47
Terminology and Basic Probability Notation
  • Conditional distribution example: P(X, Y) =
    P(X | Y) P(Y) denotes a set of equations:
    p(X = x1 ∧ Y = y1) = p(X = x1 | Y = y1) p(Y = y1)
    p(X = x1 ∧ Y = y2) = p(X = x1 | Y = y2) p(Y = y2)
    ...
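A minimal sketch (illustrative numbers) of building the joint P(X, Y)
elementwise from a conditional distribution P(X | Y) and a prior P(Y):

```python
# Prior P(Y) and conditional P(X | Y), with invented numbers.
P_Y = {"y1": 0.3, "y2": 0.7}
P_X_given_Y = {
    "y1": {"x1": 0.9, "x2": 0.1},
    "y2": {"x1": 0.2, "x2": 0.8},
}

# P(X, Y) = P(X | Y) P(Y): one multiplication per (x, y) pair.
P_XY = {(x, y): P_X_given_Y[y][x] * P_Y[y]
        for y in P_Y for x in P_X_given_Y[y]}

print(P_XY[("x1", "y1")])  # p(X = x1 ∧ Y = y1) = 0.9 * 0.3 = 0.27
assert abs(sum(P_XY.values()) - 1.0) < 1e-9
```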

48
Probabilistic Inference Using Full Joint
Distributions
49
Terminology and Basic Probability Notation
  • Simple dentist diagnosis example.
  • 3 Boolean variables
  • Toothache
  • Cavity
  • Catch (the dentist's steel probe catches in the
    patient's tooth)

50
A full joint distribution for the Toothache,
Cavity, Catch world
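The table itself did not survive in this transcript. The sketch below
reconstructs it as a Python dict from the entries quoted on the following
slides; the two (¬cavity, ¬toothache) entries are not quoted anywhere in this
transcript and are taken from the standard Russell & Norvig example, so treat
them as an assumption:

```python
# Full joint distribution P(Cavity, Toothache, Catch).
# Keys are (cavity, toothache, catch) truth assignments.
joint = {
    (True,  True,  True):  0.108,
    (True,  True,  False): 0.012,
    (True,  False, True):  0.072,
    (True,  False, False): 0.008,
    (False, True,  True):  0.016,
    (False, True,  False): 0.064,
    (False, False, True):  0.144,   # assumed, from the textbook example
    (False, False, False): 0.576,   # assumed, from the textbook example
}

# Entries of a full joint distribution must sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-9
```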
51
Getting information from the full joint
distribution
p(cavity ∨ toothache) = 0.108 + 0.012 + 0.072
+ 0.008 + 0.016 + 0.064 = 0.28
52
Getting information from the full joint
distribution
p(cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2
(unconditional or marginal probability)
53
Marginalization, Summing Out, Theorem of Total
Probability, and Conditioning
54
Getting information from the full joint
distribution
p(cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2
p(cavity) = p(cavity, catch, toothache) + p(cavity, ¬catch,
toothache) + p(cavity, catch, ¬toothache) +
p(cavity, ¬catch, ¬toothache)
55
Marginalization Rule
  • Marginalization rule
  • For any sets of variables Y and Z,
    P(Y) = Σz P(Y, z)
  • A distribution over Y can be obtained by summing
    out all the other variables from any joint
    distribution containing Y.
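A small sketch of summing out over the reconstructed joint (the dict is
repeated here so the snippet stands on its own; the helper name marginal is
my own):

```python
# Full joint P(Cavity, Toothache, Catch); keys are (cavity, toothache, catch).
joint = {
    (True, True, True): 0.108,   (True, True, False): 0.012,
    (True, False, True): 0.072,  (True, False, False): 0.008,
    (False, True, True): 0.016,  (False, True, False): 0.064,
    (False, False, True): 0.144, (False, False, False): 0.576,
}

def marginal(var_index, value):
    # Summing out: add every joint entry in which the chosen variable has the chosen value.
    return sum(pr for assignment, pr in joint.items() if assignment[var_index] == value)

# p(cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2   (index 0 is Cavity)
print(marginal(0, True))
# p(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2   (index 1 is Toothache)
print(marginal(1, True))
```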

56
A variant of the rule after applying the product
rule
  • Conditioning
  • For any sets of variables Y and Z,
    P(Y) = Σz P(Y | z) P(z)
  • Read as: Y is conditioned on the variable Z.
  • Often referred to as the theorem of total probability.
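A tiny numeric sketch of conditioning (the theorem of total probability),
with invented numbers:

```python
# P(Z) and p(y | z) for each value z, invented numbers.
P_Z = {"z1": 0.5, "z2": 0.3, "z3": 0.2}
p_y_given_Z = {"z1": 0.9, "z2": 0.4, "z3": 0.1}

# Conditioning: p(y) = sum over z of p(y | z) p(z).
p_y = sum(p_y_given_Z[z] * P_Z[z] for z in P_Z)
print(p_y)  # 0.9*0.5 + 0.4*0.3 + 0.1*0.2 = 0.59
```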

57
Getting information from the full joint
distribution
The probability of a cavity, given evidence of a
toothache (conditional probability):
p(cavity | toothache) = p(cavity ∧ toothache) / p(toothache)
= (0.108 + 0.012) / (0.108 + 0.012 + 0.016 + 0.064)
= 0.12 / 0.2 = 0.6
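The same query computed in code from the reconstructed joint (a sketch; the
dict repeats the earlier values so it runs on its own):

```python
joint = {
    (True, True, True): 0.108,   (True, True, False): 0.012,
    (True, False, True): 0.072,  (True, False, False): 0.008,
    (False, True, True): 0.016,  (False, True, False): 0.064,
    (False, False, True): 0.144, (False, False, False): 0.576,
}

# p(cavity ∧ toothache): sum out Catch.
p_cavity_and_toothache = sum(pr for (c, t, _), pr in joint.items() if c and t)
# p(toothache): sum out Cavity and Catch.
p_toothache = sum(pr for (c, t, _), pr in joint.items() if t)

# p(cavity | toothache) = p(cavity ∧ toothache) / p(toothache) = 0.12 / 0.2 = 0.6
print(p_cavity_and_toothache / p_toothache)
```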
61
Independence
62
Independence
  • If the propositions a and b are independent, then
  • p(a | b) = p(a)
  • p(b | a) = p(b)
  • p(a ∧ b) = p(a, b) = p(a) p(b)
  • Think about the coin flipping example.
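A quick sketch of these identities using two independent fair coin flips (the
classic example mentioned above):

```python
# Joint over two independent fair coin flips: every combination has probability 0.25.
joint = {(a, b): 0.25 for a in ("heads", "tails") for b in ("heads", "tails")}

p_a_heads = sum(pr for (a, b), pr in joint.items() if a == "heads")  # 0.5
p_b_heads = sum(pr for (a, b), pr in joint.items() if b == "heads")  # 0.5

# p(a ∧ b) = p(a) p(b): the joint factors into the product of the marginals.
assert abs(joint[("heads", "heads")] - p_a_heads * p_b_heads) < 1e-12

# p(a | b) = p(a): learning the second flip tells us nothing about the first.
assert abs(joint[("heads", "heads")] / p_b_heads - p_a_heads) < 1e-12
```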

63
Independence Example
  • Suppose Weather and Cavity are independent.
  • p(cavity | Weather = cloudy) = p(cavity)
  • p(Weather = cloudy | cavity) = p(Weather = cloudy)
  • p(cavity, Weather = cloudy) =
    p(cavity) p(Weather = cloudy)

64
Similarly
  • If the variables X and Y are independent, then
  • P(X | Y) = P(X)
  • P(Y | X) = P(Y)
  • P(X, Y) = P(X) P(Y)

65
Normalization
66
Previous Example
The probability of a cavity, given evidence of a
toothache:
p(cavity | toothache) = (0.108 + 0.012) / p(toothache)
p(¬cavity | toothache) = (0.016 + 0.064) / p(toothache)
68
Normalization
  • The term 1 / p(toothache)
    remains constant, no matter which value of Cavity
    we calculate.
  • In fact, it can be viewed as a normalization
    constant for the distribution P(Cavity | toothache),
    ensuring that it adds up to 1.

69
Recall this example
The probability of a cavity, given evidence of a
toothache
70
Now, normalization simplifies the calculation
The probability distribution of Cavity, given
evidence of a toothache:
P(Cavity | toothache) = α P(Cavity, toothache)
= α [P(Cavity, toothache, catch) + P(Cavity, toothache, ¬catch)]
= α [⟨0.108, 0.016⟩ + ⟨0.012, 0.064⟩]
= α ⟨0.12, 0.08⟩ = ⟨0.6, 0.4⟩
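A short sketch of the same normalization step in code: compute the two
unnormalized entries and let α be whatever constant makes them sum to 1, so
p(toothache) never has to be computed explicitly:

```python
# Entries proportional to P(Cavity | toothache), read off the full joint
# with Catch summed out: p(Cavity, toothache).
unnormalized = {
    True:  0.108 + 0.012,   # cavity ∧ toothache
    False: 0.016 + 0.064,   # ¬cavity ∧ toothache
}

# alpha = 1 / p(toothache), obtained purely by normalization.
alpha = 1.0 / sum(unnormalized.values())
P_cavity_given_toothache = {value: alpha * p for value, p in unnormalized.items()}

print(P_cavity_given_toothache)  # approximately {True: 0.6, False: 0.4}
```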
72
Example of Probabilistic Inference: Wumpus World
73-80
[Figures: step-by-step exploration of the Wumpus world. Squares known to be
safe are marked OK; unvisited neighbouring squares are marked as possibly
containing a pit (Pit?) or the Wumpus. After several steps, three candidate
squares remain and the agent must decide where to go next ("Now what??").]
81
By applying Bayes' rule, you can calculate the
probabilities of these cells having a pit, based
on the known information.
[Figure: the three candidate squares, with pit
probabilities 0.31, 0.86, and 0.31.]
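A brute-force sketch of where the 0.31 / 0.86 / 0.31 come from, assuming the
standard textbook setup (pit prior 0.2, breezes perceived in squares [1,2]
and [2,1], candidate pit squares [1,3], [2,2], and [3,1]); squares further
away cancel out of the normalization, so only these three need to be
enumerated. The code and names below are my own sketch, not the slides':

```python
from itertools import product

P_PIT = 0.2  # prior probability that any unexplored square contains a pit

# The three candidate squares whose pit status affects the observed breezes.
frontier = [(1, 3), (2, 2), (3, 1)]

def consistent(pits):
    # Evidence: a breeze in [1,2] and in [2,1], so each must have a pit in
    # at least one unexplored neighbouring square.
    breeze_1_2 = pits[(1, 3)] or pits[(2, 2)]
    breeze_2_1 = pits[(2, 2)] or pits[(3, 1)]
    return breeze_1_2 and breeze_2_1

def prior(pits):
    # Prior probability of this pit configuration (pits are independent a priori).
    p = 1.0
    for square in frontier:
        p *= P_PIT if pits[square] else (1.0 - P_PIT)
    return p

# Enumerate every pit configuration, keep those consistent with the evidence,
# and normalize (Bayes' rule with a normalization constant).
posterior_mass = {square: 0.0 for square in frontier}
total = 0.0
for assignment in product([True, False], repeat=len(frontier)):
    pits = dict(zip(frontier, assignment))
    if not consistent(pits):
        continue
    weight = prior(pits)
    total += weight
    for square in frontier:
        if pits[square]:
            posterior_mass[square] += weight

for square in frontier:
    print(square, round(posterior_mass[square] / total, 2))
# Prints approximately: (1, 3) 0.31   (2, 2) 0.86   (3, 1) 0.31
```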
82
To Calculate the Probability Distribution for the
Wumpus Example
83
Let unknown be a composite variable consisting of
the P_i,j variables for squares other than the known
squares and the query square [1,3]