Title: Understanding Confirmation with Probability
1. Understanding Confirmation with Probability
2. Kolmogorov's Axioms
1. P(A) ≥ 0 for all A ∈ S.
2. If T is a logical truth, then P(T) = 1.
3. P(A ∨ B) = P(A) + P(B) for all A, B such that (A ∧ B) is contradictory.
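To make the axioms concrete, here is a minimal Python sketch (mine, not from the slides) that checks a finite probability assignment against the three axioms; the particular numbers and variable names are illustrative assumptions.

```python
# Illustrative only: a finite space whose atoms are the four truth-value
# combinations of two propositions A and B, with an assumed distribution.
atoms = {
    ("A", "B"): 0.2,
    ("A", "~B"): 0.3,
    ("~A", "B"): 0.1,
    ("~A", "~B"): 0.4,
}

def prob(event):
    """Probability of an event, represented as a set of atoms."""
    return sum(atoms[a] for a in event)

# Axiom 1: non-negativity (checking the atoms suffices on a finite space).
assert all(p >= 0 for p in atoms.values())

# Axiom 2: a logical truth (the whole space) has probability 1.
assert abs(prob(set(atoms)) - 1.0) < 1e-9

# Axiom 3: additivity for incompatible propositions, e.g. A and not-A.
A = {a for a in atoms if a[0] == "A"}
not_A = {a for a in atoms if a[0] == "~A"}
assert abs(prob(A | not_A) - (prob(A) + prob(not_A))) < 1e-9
```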
3. Conditional Probability
P(H|E) = the probability of H, given E
P(H|E) = P(H ∧ E) / P(E)
Degree of TOTAL CONFIRMATION: P(H|E)
Degree of INCREMENTAL CONFIRMATION: P(H|E) − P(H)
4. Example
P(H ∧ E) = .01   P(H ∧ ¬E) = .04   P(¬H ∧ E) = .02   P(¬H ∧ ¬E) = .93
The degree of TOTAL confirmation that H has is the probability that H is true, assuming that E is true, where E is all your evidence. If E makes H more likely than H is on its own, then E INCREMENTALLY confirms H.
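A short Python check of the example (my reconstruction: the negation signs were lost above, so it assumes the four entries are, in order, P(H ∧ E), P(H ∧ ¬E), P(¬H ∧ E), and P(¬H ∧ ¬E)).

```python
# Assumed reading of the joint table (negations reconstructed).
p_H_E, p_H_notE, p_notH_E, p_notH_notE = 0.01, 0.04, 0.02, 0.93

p_E = p_H_E + p_notH_E                 # 0.03
p_H = p_H_E + p_H_notE                 # 0.05

total_confirmation = p_H_E / p_E                      # P(H|E) = 1/3 ≈ 0.33
incremental_confirmation = total_confirmation - p_H   # ≈ 0.28 > 0

print(total_confirmation, incremental_confirmation)
# On this reading, E raises H's probability from 0.05 to about 0.33,
# so E incrementally confirms H.
```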
5–7. Bayes' Theorem
P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|¬H) P(¬H)]
This all seems to work well. But where do we get
the probabilities from?
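For reference, here is the standard derivation of the theorem from the definition of conditional probability on slide 3 (a step not spelled out on the slides):

```latex
P(H \mid E) = \frac{P(H \wedge E)}{P(E)} = \frac{P(E \mid H)\,P(H)}{P(E)},
\qquad
P(E) = P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H).
```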
8. Interpretations of Probability
- Logical
- Frequency
- Objective Chance
- Subjective Degrees of Belief
9. Logical Probability
Probabilities are assigned to different
possibilities using some logical method.
10–13. Logical Probability
(Illustrations of the logical assignment of probabilities from these slides are not reproduced here.)
14. H = Everything is F.  E = Fb.
P(H|E) = P(H ∧ E)/P(E) = (1/3)/(1/2) = 2/3 > 1/3 = P(H).
Thus, our evidence confirms our hypothesis.
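The 1/3 and 1/2 above can be reproduced with a small Python sketch. It rests on assumptions of mine not stated above: a language with exactly two individuals, a and b, one predicate F, and a Carnap-style measure that spreads probability evenly over structure descriptions and then evenly over the state descriptions within each.

```python
from itertools import product
from fractions import Fraction
from collections import defaultdict

# State descriptions: every assignment of F / not-F to individuals a and b.
individuals = ["a", "b"]
states = list(product([True, False], repeat=len(individuals)))  # (Fa, Fb)

# Group state descriptions by structure description (how many things are F),
# weight each structure description equally, then split evenly within it.
by_structure = defaultdict(list)
for s in states:
    by_structure[sum(s)].append(s)

weight = {}
for group in by_structure.values():
    for s in group:
        weight[s] = Fraction(1, len(by_structure)) / len(group)

def P(event):
    return sum(weight[s] for s in event)

H = {s for s in states if all(s)}   # "Everything is F"
E = {s for s in states if s[1]}     # "Fb"

print(P(H))             # 1/3
print(P(E))             # 1/2
print(P(H & E) / P(E))  # 2/3, which exceeds P(H) = 1/3
```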
15. Problems for the Logical Interpretation
1. Why this distribution of probability? There are other distributions, so how is this logical?
2. Why this language to describe things? If we describe things in a different language, we can get different results. This is the lesson from Goodman.
3. If the degree of confirmation depends on the probability distribution and the language chosen, then how can this guide science?
16. Frequency and Objective Chance
We don't have time, but these aren't going to be satisfactory. They won't assign probabilities to things that we want probabilities assigned to.
17. Subjective Degree of Belief
Probabilities are identified with degrees of confidence, or degrees of belief, of suitable agents. Suitable agents are those that are rational.
18. What is a degree of belief?
Betting Interpretation: Your degree of belief in some proposition, Q, is equal to X/(X+1), where X:1 are the odds you deem fair for a bet on Q.
Example: Your fair odds for a bet on Seabiscuit winning are 3:1. Let S = Seabiscuit wins.
P_you(S) = 3/(3+1) = ¾ = 0.75
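The betting rule translates directly into code; a one-function Python illustration:

```python
def degree_of_belief(x: float) -> float:
    """Degree of belief from fair betting odds of x:1 on the proposition."""
    return x / (x + 1)

# Seabiscuit example: fair odds of 3:1 give a degree of belief of 0.75.
print(degree_of_belief(3))
```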
19. Understanding Confirmation with Probability
20. Basic Notions
TOTAL CONFIRMATION: P(H|E)
INCREMENTAL CONFIRMATION: P(H|E) > P(H)
The degree of TOTAL confirmation that H has is the probability that H is true, assuming that E is true, where E is all your evidence. If E makes H more likely than H is on its own, then E INCREMENTALLY confirms H.
21. What are these Probabilities?
A: Degrees of belief/degrees of confidence of rational agents.
What is a degree of belief?
A: Degree of belief in P is understood in terms of the odds that you deem fair for a bet on P being true.
P_you(P) = X/(X+1), where X:1 are the odds you deem fair for a bet on P.
22. Example
Let's say that you believe that there's a 50-50 chance of rain (R). Further, you believe that Brian Lapis is a pretty good weatherman: you think that the probability that Lapis says it will rain (L), given that it will rain (R), is fairly high. Finally, since you're such an avid viewer of 22News, you know that Lapis tends to predict rain (L) about 45% of the time. We have:
P(R) = .5   P(L|R) = .8   P(L) = .45
P(R|L) = P(L|R) P(R) / P(L) = (0.8 × 0.5) / 0.45 ≈ .89
P(R|L) > P(R)
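The same calculation in Python, using the three numbers given above:

```python
p_R = 0.5             # prior probability of rain
p_L_given_R = 0.8     # probability Lapis predicts rain, given rain
p_L = 0.45            # probability Lapis predicts rain

p_R_given_L = p_L_given_R * p_R / p_L   # Bayes' theorem
print(round(p_R_given_L, 2))            # 0.89
print(p_R_given_L > p_R)                # True: L incrementally confirms R
```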
23. What are rational degrees of belief?
Rational degrees of belief are those that satisfy the Kolmogorov axioms. For example, rational degrees of belief are such that P(H) + P(¬H) = 1. Why?
24. Dutch Book Argument
- Assume P_jeff(S) = P_jeff(¬S) = ¾.
- Thus, P_jeff(S) + P_jeff(¬S) = 3/2 > 1. (So, I violate Kolmogorov's Axioms.)
- Given my degrees of belief, and the betting interpretation of degrees of belief, I will accept 3:1 odds for a bet on S and 3:1 odds for a bet on ¬S.
- Thus, I will accept the following two bets: (Bet 1) Pay $3 for a chance to win $1 if S is true. (Bet 2) Pay $3 for a chance to win $1 if ¬S is true.
- If S is true, I'll win $1 from Bet 1, but lose $3 on Bet 2, for a net loss of $2.
- If ¬S is true, I'll win $1 from Bet 2, but lose $3 on Bet 1, for a net loss of $2.
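A quick Python check (my illustration, using the $3-to-win-$1 stakes implied by 3:1 odds) that the two bets guarantee a loss however S turns out:

```python
def net_payoff(stake: float, win: float, proposition_true: bool) -> float:
    """Win `win` if the proposition is true; otherwise lose the stake."""
    return win if proposition_true else -stake

for S in (True, False):
    bet1 = net_payoff(3, 1, S)             # bet on S at 3:1
    bet2 = net_payoff(3, 1, not S)         # bet on not-S at 3:1
    print(f"S={S}: net = {bet1 + bet2}")   # -2 either way: a Dutch book
```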
25. Dutch Book Argument
Basically:
1. If you don't meet the Axioms, then you can be Dutch-Booked.
2. If you can be Dutch-Booked, then you aren't rational.
------------------------------------------------------------
3. Thus, if you don't meet the Axioms, then you aren't rational.
Which is equivalent to:
3. If you are rational, then you meet the Axioms.
26. Advantages
- Assigns probabilities to events we care about. (So, better than frequency and objective chance.)
- No problem with how to assign probabilities. (So, better than the logical approach.)
27. Potential Problems
i. Problem of the priors
ii. Problem of the likelihoods
iii. Idealizations in modeling degrees of belief
iv. Language incommensurability not addressed
28. Problem of the Priors
If everyone gives different hypotheses different probabilities, then how is this an advantage over logical probability?
Answer: Convergence Theorems. Given that we agree on the likelihoods (the P(E|H)s) but disagree about how probable different hypotheses are, there is some amount of evidence that will lead us to have arbitrarily close agreement about the probability of hypotheses.
29. Convergence
(Chart showing posterior probabilities after observing 0, 1, 3, 6, 12, and 60 ravens; the plotted values are not reproduced here.)
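A hedged Python sketch of the sort of convergence the chart illustrates. It is not the slides' own model: it assumes two agents who differ only in their priors over three candidate hypotheses about the chance that a raven is black, share the same likelihoods, and update on the same stream of observed black ravens.

```python
# Assumed setup: hypotheses are candidate chances that a random raven is black;
# observing n ravens, all of them black, has likelihood h**n under hypothesis h.
hypotheses = [0.5, 0.9, 1.0]

def posterior(prior, n_black):
    """Posterior over the hypotheses after n black-raven observations."""
    unnorm = [p * (h ** n_black) for p, h in zip(prior, hypotheses)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

agent1 = [0.8, 0.1, 0.1]   # sceptical prior about "all ravens are black" (h = 1.0)
agent2 = [0.1, 0.1, 0.8]   # confident prior about "all ravens are black"

for n in [0, 1, 3, 6, 12, 60]:
    p1 = posterior(agent1, n)[-1]
    p2 = posterior(agent2, n)[-1]
    print(f"{n:3d} ravens: agent1 = {p1:.3f}, agent2 = {p2:.3f}")
# Despite very different priors, both posteriors for h = 1.0 approach 1
# as the shared evidence accumulates.
```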
30. Problem of the Likelihoods
Why think that there will be agreement about likelihoods P(E|H) if there is not agreement about anything else?
Partial Response: Some likelihoods appear to be given by the theory itself.
"…all should agree (at least approximately) on the values of the likelihoods of evidence claims given by specific hypotheses. For, the likelihoods represent the empirical content of a hypothesis - what the hypothesis says about evidence claims." (James Hawthorne, "A Better Bayesian Convergence Theorem")
31Convergence Arguments
Rational agents will converge to the same degrees
of belief.
with positive probability in the same
propositions
in propositions the evidence distinguishes
between
in the long run, believe with probability 1 that
they will
with probability 1
32The Dialectic
Problem of the Priors. Response Convergence
Theorems. Problem of the Likelihoods. Respons
e Just what the theory says. Q But why do
priors matter?
33Goodmans Grue and Priors
E All emeralds observed have been green. G
All emeralds are green. g All emeralds are
grue.
P(EG) gt P(E). why? But P(Eg) gt P(E), too.
why? Further, P(EG) P(Eg). why? From this
it follows that P(EG)/P(E) P(Eg)/P(E)
why? Now, we want to know whether to believe G
or g. So we need to look at P(GE) and P(gE).
P(GE) P(EG)/P(E) P(G) P(gE)
P(Eg)/P(E) P(g)
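Putting the last two lines together (a step left implicit on the slide): because P(E|G) = P(E|g), the likelihood factors cancel, so the evidence alone cannot favor G over g; only the priors can.

```latex
\frac{P(G \mid E)}{P(g \mid E)}
  = \frac{P(E \mid G)\,P(G)/P(E)}{P(E \mid g)\,P(g)/P(E)}
  = \frac{P(G)}{P(g)}
  \quad\text{since } P(E \mid G) = P(E \mid g).
```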
34. Further Problems
iii. Idealizations in the model.
iv. Language incommensurability. If scientists really describe the evidence differently, then much of the Bayesian machinery will not work.