Title: Reasoning Under Uncertainty
1. Reasoning Under Uncertainty
- Artificial Intelligence
- CMSC 25000
- February 19, 2008
2. Agenda
- Motivation
- Reasoning with uncertainty
- Medical Informatics
- Probability and Bayes Rule
- Bayesian Networks
- Noisy-Or
- Decision Trees and Rationality
- Conclusions
3. Uncertainty
- Search and planning agents
  - Assume fully observable, deterministic, static environments
- Real world
  - Partially observable, stochastic, extremely complex
  - Can't be sure of success; the agent aims to maximize its expected outcome
- Probabilities capture ignorance and laziness
  - Ignorance: lack of relevant facts, conditions
  - Laziness: failure to enumerate all conditions, exceptions
- Bayesian (subjective) probabilities relate to knowledge
4. Motivation
- Uncertainty in medical diagnosis
- Diseases produce symptoms
- In diagnosis, observed symptoms → disease ID
- Uncertainties
- Symptoms may not occur
- Symptoms may not be reported
- Diagnostic tests not perfect
- False positive, false negative
- How do we estimate confidence?
5. Motivation II
- Uncertainty in medical decision-making
- Physicians, patients must decide on treatments
- Treatments may not be successful
- Treatments may have unpleasant side effects
- Choosing treatments
- Weigh risks of adverse outcomes
- People are BAD at reasoning intuitively about probabilities
- Provide systematic analysis
6. Probability Basics
- The sample space
  - A set Ω = {ω1, ω2, ω3, ..., ωn}
  - E.g., the 6 possible rolls of a die
  - ωi is a sample point / atomic event
- A probability space/model is a sample space with an assignment P(ω) for every ω in Ω s.t.
  - 0 ≤ P(ω) ≤ 1
  - Σ_ω P(ω) = 1
  - E.g., P(die roll < 4) = 1/6 + 1/6 + 1/6 = 1/2
7. Random Variables
- A random variable is a function from sample points to some range (e.g., reals, booleans)
  - E.g., Odd(1) = true
- P induces a probability distribution for any r.v. X
  - P(X = xi) = Σ_{ω : X(ω) = xi} P(ω)
  - E.g., P(Odd = true) = 1/6 + 1/6 + 1/6 = 1/2
- A proposition is the event (set of sample points) in which the proposition is true
  - E.g., event a = {ω : A(ω) = true}
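The definitions on the two slides above can be checked mechanically; a minimal Python sketch (the helper names `prob` and `Odd` are just for illustration):

```python
from fractions import Fraction

# Sample space for one fair die roll: each atomic event gets P(w) = 1/6.
P = {w: Fraction(1, 6) for w in range(1, 7)}

def prob(event):
    """Probability of an event = sum of P(w) over the sample points in it."""
    return sum(P[w] for w in P if event(w))

print(prob(lambda w: w < 4))   # P(die roll < 4) = 1/2

# The random variable Odd maps sample points to booleans and induces
# a distribution: P(Odd = true) = 1/6 + 1/6 + 1/6 = 1/2.
def Odd(w):
    return w % 2 == 1

print(prob(Odd))               # 1/2
```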
8. Why probabilities?
- The definitions imply that logically related events have related probabilities
- In AI applications, sample points are defined by a set of random variables
- Random variables may be boolean, discrete, or continuous
9. Prior Probabilities
- Prior probabilities: belief prior to evidence
  - E.g., P(cavity = t) = 0.2, P(weather = sunny) = 0.6
- A distribution gives values for all assignments
- A joint distribution on a set of r.v.s gives the probability of every atomic event over those r.v.s
  - E.g., P(weather, cavity) = a 4x2 matrix of values
- Every question about a domain can be answered with the joint, because every event is a sum of sample points
10. Conditional Probabilities
- Conditional (posterior) probabilities
  - E.g., P(cavity | toothache) = 0.8, given that toothache is all we know
  - P(Cavity | Toothache) = a 2-element vector of 2-element vectors
- Can add new evidence, possibly irrelevant
- Definition: P(a | b) = P(a ∧ b) / P(b), where P(b) ≠ 0
- Product rule: P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
- The product rule generalizes to chaining (the chain rule)
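A quick numeric check of the definition and the product rule, on a small made-up joint chosen so that P(a | b) = 0.8 (echoing the cavity/toothache figure above); the numbers are illustrative only:

```python
# Assumed joint over two boolean events a and b.
p = {("a", "b"): 0.08, ("a", "not_b"): 0.12,
     ("not_a", "b"): 0.02, ("not_a", "not_b"): 0.78}

p_b = p[("a", "b")] + p[("not_a", "b")]   # P(b) = 0.10
p_a_and_b = p[("a", "b")]                 # P(a ∧ b) = 0.08
p_a_given_b = p_a_and_b / p_b             # P(a | b) = 0.8

# Product rule: P(a ∧ b) = P(a | b) P(b)
assert abs(p_a_given_b * p_b - p_a_and_b) < 1e-12
```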
11. Inference by Enumeration
12. Inference by Enumeration
13. Inference by Enumeration
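A minimal sketch of inference by enumeration over a full joint distribution: sum the joint over the hidden variables, then normalize. The variable names and joint numbers below are assumptions for illustration, not from the slides:

```python
from itertools import product

def enumerate_query(var, evidence, joint, variables):
    """P(var | evidence) by summing the full joint over hidden variables,
    then normalizing. `joint` maps full boolean assignments (tuples ordered
    as in `variables`) to probabilities."""
    dist = {}
    for value in (True, False):
        total = 0.0
        for assignment in product([True, False], repeat=len(variables)):
            world = dict(zip(variables, assignment))
            if world[var] == value and all(world[k] == v for k, v in evidence.items()):
                total += joint[assignment]
        dist[value] = total
    norm = sum(dist.values())
    return {v: p / norm for v, p in dist.items()}

# Illustrative joint over (cavity, toothache), same made-up numbers as above.
variables = ("cavity", "toothache")
joint = {(True, True): 0.08, (True, False): 0.12,
         (False, True): 0.02, (False, False): 0.78}
print(enumerate_query("cavity", {"toothache": True}, joint, variables))
# ≈ {True: 0.8, False: 0.2}
```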
14. Independence
15. Conditional Independence
16. Conditional Independence II
17. Probabilities Model Uncertainty
- The world is described in terms of features
  - Features: random variables
  - Feature values: the values those variables can take
  - States of the world: assignments of values to variables
- The number of possible states is exponential in the number of variables
18. Probabilities of World States
- Joint probability of assignments
  - States are distinct and exhaustive
- Typically we care about a SUBSET of the assignments
  - a.k.a. a circumstance
  - Exponential in the number of "don't cares"
19. A Simpler World
- 2^n world states: maximum entropy
  - Know nothing about the world
- Many variables are independent
  - P(strep, ebola) = P(strep) P(ebola)
- Conditionally independent variables
  - Depend on the same factors, but not on each other
  - P(fever, cough | flu) = P(fever | flu) P(cough | flu)
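A small sketch of what the conditional-independence assumption buys: the three-variable joint can be built from just P(flu), P(fever | flu), and P(cough | flu). All numbers below are made up for illustration:

```python
# Hypothetical parameters; fever and cough are assumed conditionally
# independent given flu.
p_flu = 0.05
p_fever_given_flu = {True: 0.9, False: 0.1}   # keyed by flu
p_cough_given_flu = {True: 0.8, False: 0.2}   # keyed by flu

def joint(fever, cough, flu):
    # P(fever, cough, flu) = P(flu) P(fever | flu) P(cough | flu)
    pf = p_flu if flu else 1 - p_flu
    pfe = p_fever_given_flu[flu] if fever else 1 - p_fever_given_flu[flu]
    pco = p_cough_given_flu[flu] if cough else 1 - p_cough_given_flu[flu]
    return pf * pfe * pco

# P(fever, cough | flu) factors into P(fever | flu) * P(cough | flu).
assert abs(joint(True, True, True) / p_flu - 0.9 * 0.8) < 1e-12
```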
20. Probabilistic Diagnosis
- Question
  - How likely is a patient to have a disease if they have the symptoms?
- Probabilistic model: Bayes' rule
  - P(D | S) = P(S | D) P(D) / P(S)
  - Where
    - P(S | D): probability of the symptom given the disease
    - P(D): prior probability of having the disease
    - P(S): prior probability of having the symptom
21. Diagnosis
- Consider meningitis
  - Disease: meningitis, m
  - Symptom: stiff neck, s
  - P(s | m) = 0.5
  - P(m) = 0.0001
  - P(s) = 0.1
- How likely is it that someone with a stiff neck actually has meningitis?
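Plugging the slide's numbers into Bayes' rule answers the question; a quick check in Python:

```python
# Bayes' rule: P(m | s) = P(s | m) P(m) / P(s)
p_s_given_m, p_m, p_s = 0.5, 0.0001, 0.1
p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)   # 0.0005 -- only about 1 in 2000 stiff necks is meningitis
```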
22. Modeling (In)dependence
- A simple, graphical notation for conditional independence; a compact specification of the joint
- Bayesian network
  - Nodes: variables
  - Directed acyclic graph: a link means "directly influences"
  - Arcs: a child depends on its parent(s)
  - No incoming arcs: independent (a node with 0 incoming arcs needs only a prior)
  - Parents(X): the parents of node X
  - For each X we need P(X | Parents(X))
23. Example I
24. Simple Bayesian Network
Need: P(A), P(B|A), P(C|A), P(D|B,C), P(E|C)
Truth table sizes: 2, 2×2, 2×2, 2×2×2, 2×2
A is only a priori; B depends on A; C depends on A; D depends on B and C; E depends on C
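A minimal sketch of how this factorization specifies the full joint, P(a,b,c,d,e) = P(a) P(b|a) P(c|a) P(d|b,c) P(e|c). The CPT numbers are invented purely for illustration:

```python
from itertools import product

# Hypothetical CPTs; each maps parent value(s) to P(child = true | parents).
P_A_true = 0.2
P_B_true = {True: 0.7, False: 0.1}                       # keyed by A
P_C_true = {True: 0.4, False: 0.5}                       # keyed by A
P_D_true = {(True, True): 0.9, (True, False): 0.6,
            (False, True): 0.3, (False, False): 0.05}    # keyed by (B, C)
P_E_true = {True: 0.8, False: 0.2}                       # keyed by C

def bern(p_true, value):
    """P(X = value) for a boolean X with P(X = true) = p_true."""
    return p_true if value else 1.0 - p_true

def joint(a, b, c, d, e):
    # P(a,b,c,d,e) = P(a) P(b|a) P(c|a) P(d|b,c) P(e|c)
    return (bern(P_A_true, a) * bern(P_B_true[a], b) * bern(P_C_true[a], c)
            * bern(P_D_true[(b, c)], d) * bern(P_E_true[c], e))

# Sanity check: the joint sums to 1 over all 2^5 assignments.
assert abs(sum(joint(*v) for v in product([True, False], repeat=5)) - 1.0) < 1e-9
```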
25. Simplifying with Noisy-OR
- How many computations?
  - p parents, k values per variable: (k-1)·k^p entries
  - Very expensive! 10 binary parents → 2^10 = 1024
- Reduce computation by simplifying the model
  - Treat each parent as a possible independent cause
  - Only 11 computations: 10 causal probabilities + 1 "leak" probability
    - Leak: some other cause
26. Noisy-OR Example
One cause A for effect B, with causal strength c_a and leak probability L = 0.5:
  Pn(b | a)   = 1 - (1 - c_a)(1 - L)
  Pn(¬b | a)  = (1 - c_a)(1 - L)
  Pn(b | ¬a)  = 1 - (1 - L) = L
  Pn(¬b | ¬a) = (1 - L)

Given the table P(B | A):
         b     ¬b
  a     0.6   0.4
  ¬a    0.5   0.5

Solve for c_a:
  Pn(b | a) = 1 - (1 - c_a)(1 - L) = 0.6
  (1 - c_a)(1 - L) = 0.4
  1 - c_a = 0.4 / (1 - L) = 0.4 / 0.5 = 0.8
  c_a = 0.2
27. Noisy-OR Example II
Full model: need P(c | a,b), P(c | a,¬b), P(c | ¬a,b), P(c | ¬a,¬b)
Assume: P(a) = 0.1, P(b) = 0.05, Pn(c | ¬a,¬b) = 0.3, c_a = 0.5, Pn(c | b) = 0.7

Noisy-OR parameters c_a, c_b, L:
  Pn(c | a,b)   = 1 - (1 - c_a)(1 - c_b)(1 - L)
  Pn(c | ¬a,b)  = 1 - (1 - c_b)(1 - L)
  Pn(c | a,¬b)  = 1 - (1 - c_a)(1 - L)
  Pn(c | ¬a,¬b) = 1 - (1 - L)
  So L = 0.3

Solve for c_b:
  Pn(c | b) = Pn(c | a,b) P(a) + Pn(c | ¬a,b) P(¬a)
  1 - 0.7 = (1 - c_a)(1 - c_b)(1 - L)(0.1) + (1 - c_b)(1 - L)(0.9)
  0.3 = 0.5(1 - c_b)(0.07) + (1 - c_b)(0.7)(0.9)
  0.3 = 0.035(1 - c_b) + 0.63(1 - c_b) = 0.665(1 - c_b)
  c_b ≈ 0.55
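A short sketch that builds the noisy-OR entries from the slide's parameters (c_a = 0.5, c_b ≈ 0.55, L = 0.3) and recovers Pn(c | b) ≈ 0.7 by marginalizing over A with P(a) = 0.1:

```python
# Noisy-OR CPT for C with causes A and B, causal strengths c_a, c_b, leak L.
c_a, c_b, L = 0.5, 0.55, 0.3
p_a = 0.1

def noisy_or(a, b):
    """Pn(c = true | A = a, B = b) under the noisy-OR model."""
    prob_all_fail = (1 - L)
    if a:
        prob_all_fail *= (1 - c_a)
    if b:
        prob_all_fail *= (1 - c_b)
    return 1 - prob_all_fail

# Marginalize out A to recover Pn(c | b), which the slide fixes at 0.7.
p_c_given_b = noisy_or(True, True) * p_a + noisy_or(False, True) * (1 - p_a)
print(round(p_c_given_b, 2))   # 0.7
```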
28. Graph Models
- Bipartite graphs
  - E.g., medical reasoning
- Generally, diseases cause symptoms (not the reverse)
29. Topologies
- Generally more complex
  - Polytree: at most one path between any two nodes
  - General Bayes nets
    - Graphs with undirected cycles
    - No directed cycles: a node can't be its own cause
- Issue: automatic network acquisition
  - Update probabilities by observing data
  - Learn topology: use statistical evidence of independence and heuristic search to find the most probable structure
30. Holmes Example (Pearl)
Holmes is worried that his house will be burgled. For the time period of interest, there is a 10^-4 a priori chance of this happening, and Holmes has installed a burglar alarm to try to forestall this event. The alarm is 95% reliable in sounding when a burglary happens, but also has a false positive rate of 1%. Holmes' neighbor, Watson, is 90% sure to call Holmes at his office if the alarm sounds, but he is also a bit of a practical joker and, knowing Holmes' concern, might (30%) call even if the alarm is silent. Holmes' other neighbor, Mrs. Gibbons, is a well-known lush and often befuddled, but Holmes believes that she is four times more likely to call him if there is an alarm than not.
31. Holmes Example Model
There are four binary random variables:
- B: whether Holmes' house has been burgled
- A: whether his alarm sounded
- W: whether Watson called
- G: whether Gibbons called
32. Holmes Example Tables
P(B):
  B=t     B=f
  0.0001  0.9999

P(A | B):
        A=t    A=f
  B=t   0.95   0.05
  B=f   0.01   0.99

P(W | A):
        W=t    W=f
  A=t   0.90   0.10
  A=f   0.30   0.70

P(G | A):
        G=t    G=f
  A=t   0.40   0.60
  A=f   0.10   0.90
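These tables fully specify the network via P(B, A, W, G) = P(B) P(A|B) P(W|A) P(G|A). A minimal sketch that evaluates an illustrative query, P(B = t | W = t), by enumeration (the query itself is an assumption, not taken from the slide):

```python
from itertools import product

# CPTs from the slide (True = t, False = f).
P_B = {True: 0.0001, False: 0.9999}
P_A_given_B = {True: {True: 0.95, False: 0.05},
               False: {True: 0.01, False: 0.99}}   # indexed [B][A]
P_W_given_A = {True: {True: 0.90, False: 0.10},
               False: {True: 0.30, False: 0.70}}   # indexed [A][W]
P_G_given_A = {True: {True: 0.40, False: 0.60},
               False: {True: 0.10, False: 0.90}}   # indexed [A][G]

def joint(b, a, w, g):
    # P(B, A, W, G) = P(B) P(A|B) P(W|A) P(G|A)
    return P_B[b] * P_A_given_B[b][a] * P_W_given_A[a][w] * P_G_given_A[a][g]

num = sum(joint(True, a, True, g) for a, g in product([True, False], repeat=2))
den = sum(joint(b, a, True, g) for b, a, g in product([True, False], repeat=3))
print(num / den)   # ~2.8e-4: a call from Watson barely raises the burglary belief
```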
33. Decision Making
- Design a model of rational decision making
  - Maximize expected value among alternatives
- Uncertainty comes from
  - Outcomes of actions
  - Choices taken
- To maximize the outcome
  - Select the maximum over choices
  - Take the weighted average value over chance outcomes
34. Gangrene Example
Decision: Medicine vs. Amputate foot

Medicine:
  Full recovery  0.7  → 1000
  Worse          0.25 → second decision (Medicine vs. Amputate leg)
    Medicine:      Live 0.6  → 995,  Die 0.4  → 0
    Amputate leg:  Live 0.98 → 700,  Die 0.02 → 0
  Die            0.05 → 0

Amputate foot:
  Live  0.99 → 850
  Die   0.01 → 0
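Evaluating the tree by the rule from the previous slide (max over choices, probability-weighted average over chance outcomes); a minimal sketch assuming the tree structure laid out above:

```python
def chance(*branches):
    """Expected value of a chance node: branches are (probability, value) pairs."""
    return sum(p * v for p, v in branches)

# Second-stage decision if the foot gets worse under medicine.
worse = max(chance((0.6, 995), (0.4, 0)),      # medicine again: 597
            chance((0.98, 700), (0.02, 0)))    # amputate leg:   686

medicine      = chance((0.7, 1000), (0.25, worse), (0.05, 0))  # 871.5
amputate_foot = chance((0.99, 850), (0.01, 0))                 # 841.5

print(medicine, amputate_foot)   # medicine has the higher expected value
```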
35. Decision Tree Issues
- Problem 1: Tree size
  - k activities → 2^k orderings
- Solution 1: Hill-climbing
  - Choose the best apparent choice after one step
  - Use entropy reduction
- Problem 2: Utility values
  - Difficult to estimate; sensitivity; duration
  - Values change depending on the phrasing of the question
- Solution 2c: Model the effect of the outcome over a lifetime
36. Conclusion
- Reasoning with uncertainty
  - Many real systems are uncertain, e.g. medical diagnosis
- Bayes nets
  - Model (in)dependence relations in reasoning
  - Noisy-OR simplifies the model and computation
    - Assumes causes are independent
- Decision trees
  - Model rational decision making
  - Maximize outcome: max over choices, average over chance outcomes