Induction and Fusion - PowerPoint PPT Presentation

Title: Induction and Fusion
Slides: 14
Provided by: jfbal
Transcript and Presenter's Notes

1
Induction and Fusion
  • Professor J. F. Baldwin

2
L Example

Notation: each pattern is described by the variables (A, B, C, D), which take value 1 if the corresponding cell is black and 0 if it is white.

[Figure: the example set of L shapes, grouped as "Good L" and "Bad L" (one pattern in each group occurs twice), together with a set of unknown cases.]

Predict if these unknown cases are good or bad.
3
Results using ID3 Decision Tree

Good L patterns: (1 0 1 1), (0 0 0 1)×2, (0 0 1 0), (0 1 1 0), (1 1 0 0), (0 1 0 0), (1 0 0 0)
Bad L patterns: (0 1 1 1), (0 0 1 0)×2, (0 0 0 1), (1 0 0 1), (1 1 0 0), (1 0 0 0), (0 1 0 0)

The tree gives correct results for the training data, e.g. Pr(good | (0 0 1 0)) = 1/3, so the probabilities are correct. The next slide shows the tree.

Unknown patterns: (1 0 1 0), (0 1 0 1), (1 1 0 1), (1 1 1 0), (1 1 1 1), (0 0 1 1)

The tree gives the following results for the unknown patterns:

(1 0 1 0)  good 1        (1 1 0 1)  good 0
(1 1 1 0)  good 1        (0 1 0 1)  good 2/3
(1 1 1 1)  good 1        (0 0 1 1)  good 0

The underlined entries are not good results.
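The training probabilities quoted above can be checked directly from the pattern counts. A minimal sketch in Python, using the training multiset transcribed from this slide; the function name pr_good is illustrative, not from the slides:

```python
from collections import Counter
from fractions import Fraction

# Training multiset transcribed from the slide: (0 0 0 1) appears twice in
# the good set, (0 0 1 0) twice in the bad set.
good = ["1011", "0001", "0001", "0010", "0110", "1100", "0100", "1000"]
bad  = ["0111", "0010", "0010", "0001", "1001", "1100", "1000", "0100"]

g, b = Counter(good), Counter(bad)

def pr_good(pattern):
    """Pr(good | pattern) estimated from the training counts."""
    total = g[pattern] + b[pattern]
    return Fraction(g[pattern], total) if total else None

print(pr_good("0010"))  # 1/3, matching Pr(good | (0 0 1 0)) on the slide
```

Patterns absent from the training set (the six unknown cases) get no estimate at all, which is why a tree or rule structure is needed to generalise.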
4
ID3 Decision Tree

[Figure: the ID3 decision tree. The root tests A; both branches then test C; the lower levels test D (and, on one branch, B). The leaves are labelled good, good, bad, bad, good 1/2, good 1/2, good 2/3 and good 1/3.]
5
ID3 with cross-product attributes ABC, ABD, ACD, BCD

Entropy for ABC. A prime denotes value 0 (e.g. c' means C = 0); a Pr(good | code) value in parentheses is the prior 1/2, used when the branch contains no training data.

code       example   Pr(good|code)  Entropy  Branch Prob
(1 1 1 _)  a b c     (1/2)          1        0
(1 1 0 _)  a b c'    1/2            1        1/8
(1 0 1 _)  a b' c    1              0        1/16
(1 0 0 _)  a b' c'   1/3            0.9183   3/16
(0 1 1 _)  a' b c    1/2            1        1/8
(0 1 0 _)  a' b c'   1/2            1        1/8
(0 0 1 _)  a' b' c   1/3            0.9183   3/16
(0 0 0 _)  a' b' c'  2/3            0.9183   3/16

Expected entropy: ABC = 0.8915, ABD = 0.8915, ACD = 0.7972, BCD = 0.7972.

We can choose BCD or ACD as our first attribute. If we use attribute BCD then (1 1 1 1) is classified bad; if we use attribute ACD then (1 1 1 1) is classified good. So ID3 with cross-product attributes cannot solve this problem.
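The expected entropies can be reproduced from the training multiset. A sketch in Python; expected_entropy is an illustrative name, and attributes are identified by bit positions in the pattern string:

```python
from collections import Counter
from fractions import Fraction
from math import log2

good = ["1011", "0001", "0001", "0010", "0110", "1100", "0100", "1000"]
bad  = ["0111", "0010", "0010", "0001", "1001", "1100", "1000", "0100"]
ALL  = good + bad  # 16 training patterns in total

def entropy(p):
    """Binary entropy of a class probability p."""
    return 0.0 if p in (0, 1) else -(p * log2(p) + (1 - p) * log2(1 - p))

def expected_entropy(attrs):
    """Expected class entropy after splitting on the cross-product
    attribute given by bit positions attrs, e.g. (0, 1, 2) for ABC."""
    project = lambda patt: tuple(patt[i] for i in attrs)
    branches = Counter(project(p) for p in ALL)   # branch sizes
    goods = Counter(project(p) for p in good)     # good counts per branch
    total = 0.0
    for code, n in branches.items():
        p_good = Fraction(goods[code], n)
        total += Fraction(n, len(ALL)) * entropy(p_good)
    return total

for name, attrs in [("ABC", (0, 1, 2)), ("ABD", (0, 1, 3)),
                    ("ACD", (0, 2, 3)), ("BCD", (1, 2, 3))]:
    print(name, round(expected_entropy(attrs), 4))
```

This prints 0.8915 for ABC and ABD, and 0.7972 for ACD and BCD, matching the slide; the tie between ACD and BCD is exactly what makes the choice of first attribute, and hence the classification of (1 1 1 1), arbitrary.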
6
Alternative Method using Fril Extended Rules

Attribute ABC (a prime denotes value 0; a value in parentheses is the prior 1/2 for a branch with no training data):

code       example   Pr(good|code)
(1 1 1 _)  a b c     (1/2)
(1 1 0 _)  a b c'    1/2
(1 0 1 _)  a b' c    1
(1 0 0 _)  a b' c'   1/3
(0 1 1 _)  a' b c    1/2
(0 1 0 _)  a' b c'   1/2
(0 0 1 _)  a' b' c   1/3
(0 0 0 _)  a' b' c'  2/3

We can calculate Pr(good | ABC) to give the extended Fril rule

good(ABC) IF ( ABC is abc
               ABC is abc'
               ABC is ab'c
               ABC is ab'c'
               ABC is a'bc
               ABC is a'bc'
               ABC is a'b'c
               ABC is a'b'c' )
supports: ((1/2) 1/2 1 1/3 1/2 1/2 1/3 2/3)
7
Other Rules using the same calculation method

good(ABD) IF ( ABD is abd
               ABD is abd'
               ABD is ab'd
               ABD is ab'd'
               ABD is a'bd
               ABD is a'bd'
               ABD is a'b'd
               ABD is a'b'd' )
supports: ((1/2) 1/2 1/2 1/2 0 2/3 2/3 1/3)

good(ACD) IF ( ACD is acd
               ACD is acd'
               ACD is ac'd
               ACD is ac'd'
               ACD is a'cd
               ACD is a'cd'
               ACD is a'c'd
               ACD is a'c'd' )
supports: (1 (1/2) 0 1/2 0 1/2 2/3 1/2)
8
Last Rule and Fusion Rule

good(BCD) IF ( BCD is bcd
               BCD is bcd'
               BCD is bc'd
               BCD is bc'd'
               BCD is b'cd
               BCD is b'cd'
               BCD is b'c'd
               BCD is b'c'd' )
supports: (0 1 (1/2) 1/2 1 1/3 1/2 1/2)

Use the equally weighted evidential logic rule for fusion:

(A B C D) is good L IFF MOST ( good(ABC) with weight 1/4
                               good(ABD) with weight 1/4
                               good(ACD) with weight 1/4
                               good(BCD) with weight 1/4 )
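The per-branch supports and the equally weighted fusion can be sketched in Python. This is not Fril; branch_pr and fused_pr_good are illustrative names, and the prior 1/2 for branches with no training data is taken from the parenthesised entries on the earlier slides:

```python
from collections import Counter
from fractions import Fraction

good = ["1011", "0001", "0001", "0010", "0110", "1100", "0100", "1000"]
bad  = ["0111", "0010", "0010", "0001", "1001", "1100", "1000", "0100"]

TRIPLES = {"ABC": (0, 1, 2), "ABD": (0, 1, 3),
           "ACD": (0, 2, 3), "BCD": (1, 2, 3)}

def branch_pr(attrs):
    """Pr(good | code) for every 3-bit code of the chosen attributes,
    falling back to the prior 1/2 for codes with no training data."""
    proj = lambda p: tuple(p[i] for i in attrs)
    g = Counter(proj(p) for p in good)
    a = Counter(proj(p) for p in good + bad)
    codes = [(x, y, z) for x in "10" for y in "10" for z in "10"]
    return {c: (Fraction(g[c], a[c]) if a[c] else Fraction(1, 2))
            for c in codes}

RULES = {name: branch_pr(attrs) for name, attrs in TRIPLES.items()}

def fused_pr_good(pattern):
    """Equally weighted evidential fusion of the four rules."""
    return sum(RULES[n][tuple(pattern[i] for i in a)]
               for n, a in TRIPLES.items()) / 4

print(fused_pr_good("1011"))  # 7/8, as on the next slide
```

Each rule sees only three of the four variables, so the fusion is a simple average of four conditional probabilities, one per attribute triple.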
9
Results for Training Set

The evidential logic rule gives the following results for the training set:

pattern    rule result   true result
(1 0 1 1)  good 7/8      good
(0 0 0 1)  good 5/8      good
(0 0 1 0)  good 3/8      bad
(0 1 1 0)  good 2/3      good
(1 1 0 0)  good 1/2      uncertain
(0 1 0 0)  good 13/24    uncertain
(1 0 0 0)  good 11/24    uncertain
(0 1 1 1)  good 1/8      bad
(1 0 0 1)  good 1/3      bad

To give the correct decision for all of this set, choose the decision rule:
good if Pr(good) > 13/24; bad if Pr(good) < 11/24; uncertain if 11/24 ≤ Pr(good) ≤ 13/24.
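The decision rule is a pair of thresholds; exact rational arithmetic avoids any boundary ambiguity at 11/24 and 13/24. A sketch, with decide as an illustrative name:

```python
from fractions import Fraction

LO, HI = Fraction(11, 24), Fraction(13, 24)

def decide(pr_good):
    """Decision rule from the slide: good above 13/24, bad below 11/24,
    uncertain in between (boundaries inclusive)."""
    if pr_good > HI:
        return "good"
    if pr_good < LO:
        return "bad"
    return "uncertain"

# Applied to some of the training-set rule results above
for pr in [Fraction(7, 8), Fraction(3, 8), Fraction(13, 24)]:
    print(decide(pr))  # good, bad, uncertain
```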
10
Results for New Cases

pattern    rule result   decision
(1 0 1 0)  good 7/12     good
(0 1 0 1)  good 5/12     bad
(1 1 0 1)  good 3/8      bad
(1 1 1 0)  good 5/8      good
(1 1 1 1)  good 1/2      uncertain
(0 0 1 1)  good 1/2      uncertain

Using the rules plus the decision rule, these are good results.
11
Comments

The fused rule is equivalent to

(x1 x2 x3 x4) -----> good with support
1/4 Pr(good | x1 x2 x3) + 1/4 Pr(good | x1 x2 x4) + 1/4 Pr(good | x1 x3 x4) + 1/4 Pr(good | x2 x3 x4)

This is an induction assumption; classical probability theory cannot help here. Pr(H | AB) can be low even if Pr(H | A) and Pr(H | B) are both high: Pr(H | A) and Pr(H | B) tell us nothing about Pr(H | AB).

The evidential logic rule above simply says that the greater the agreement of the sub-parts of the unknown pattern with the corresponding sub-parts of those patterns which are good, the greater the support that the unknown pattern is good.
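The point that Pr(H | A) and Pr(H | B) constrain Pr(H | AB) not at all can be seen with a small worked example. The joint distribution below is illustrative, not from the slides:

```python
from fractions import Fraction

# A toy joint distribution over worlds (H?, A?, B?) -> probability,
# chosen so that H is likely given A alone or B alone, yet impossible
# given A and B together.
worlds = {
    ("H", "A", "-"): Fraction(2, 5),   # H and A hold, B does not
    ("H", "-", "B"): Fraction(2, 5),   # H and B hold, A does not
    ("-", "A", "B"): Fraction(1, 10),  # A and B hold together, H fails
    ("-", "-", "-"): Fraction(1, 10),
}

def pr(pred):
    return sum(p for w, p in worlds.items() if pred(w))

def pr_given(pred, cond):
    return pr(lambda w: pred(w) and cond(w)) / pr(cond)

H = lambda w: w[0] == "H"
A = lambda w: w[1] == "A"
B = lambda w: w[2] == "B"

print(pr_given(H, A))                        # 4/5
print(pr_given(H, B))                        # 4/5
print(pr_given(H, lambda w: A(w) and B(w)))  # 0
```

Both conditionals are 4/5, yet the joint conditional is 0, so the fusion rule really is an extra inductive assumption rather than a consequence of probability theory.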
12
Bayesian Solution

Pr(good L | vector) = Pr(vector | good L) Pr(good L) / Pr(vector)

patt X     Pr(g|X)    patt X     Pr(g|X)
(1 0 1 1)  1          (0 0 1 0)  1/3
(0 1 1 1)  0          (0 0 0 1)  2/3
(0 1 1 0)  1          (1 1 0 0)  1/2
(1 0 0 1)  0          (0 1 0 0)  1/2
(1 0 0 0)  1/2

For generalisation, use an error model in which one transmission error is allowed: test the new case to see if it is in the example set. If it is, stop; otherwise allow another transmission error and repeat until the new case is in the example set.

Let

patt X     Pr(g|X)    patt X     Pr(g|X)
(1 0 1 0)  x1         (1 1 1 0)  x4
(1 1 1 1)  x2         (1 1 0 1)  x5
(0 1 0 1)  x3         (0 0 1 1)  x6
Let
13
Equations for the xi and results

x1 = Pr(good L | (1 0 1 0))
   = Pr(good | (0 0 1 0)) Pr((0 0 1 0)) + Pr(good | (1 1 1 0)) Pr((1 1 1 0))
     + Pr(good | (1 0 0 0)) Pr((1 0 0 0)) + Pr(good | (1 0 1 1)) Pr((1 0 1 1))
   = (1/3 + x4 + 1/2 + 1) / 4

since Pr((0 0 1 0)) etc. are the probabilities of each pattern arising from a single transmission error, and each such pattern is equally likely.

x2 = Pr(good L | (1 1 1 1)), and similarly for the rest.

We obtain 6 equations for the xi, which can be solved to give

patt X     Pr(g|X)    patt X     Pr(g|X)
(1 0 1 0)  0.6222     (1 1 1 0)  0.6556
(1 1 1 1)  0.5        (1 1 0 1)  0.3444
(0 1 0 1)  0.3778     (0 0 1 1)  0.5

These are similar to the results obtained using our new method.
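One way to solve the six coupled equations (not necessarily how the original was solved) is fixed-point iteration: each equation averages over the four one-bit-flip neighbours, so the map is a contraction. A sketch, with neighbours as an illustrative name:

```python
# Known Pr(good | X) for the nine training patterns, from the previous slide.
known = {"1011": 1, "0010": 1/3, "0111": 0, "0001": 2/3, "0110": 1,
         "1100": 1/2, "1001": 0, "0100": 1/2, "1000": 1/2}
unknown = {"1010", "1111", "0101", "1110", "1101", "0011"}

def neighbours(p):
    """The four patterns reachable by one transmission error (bit flip)."""
    return [p[:i] + ("1" if p[i] == "0" else "0") + p[i + 1:]
            for i in range(4)]

x = {p: 0.5 for p in unknown}  # initial guess
for _ in range(100):           # averaging map contracts, so this converges
    x = {p: sum(known.get(q, x.get(q, 0)) for q in neighbours(p)) / 4
         for p in unknown}

for p in sorted(unknown):
    print(p, round(x[p], 4))   # matches the slide's values
```

The converged values agree with the table above: 0.6222 for (1 0 1 0), 0.6556 for (1 1 1 0), 0.3444 for (1 1 0 1), 0.3778 for (0 1 0 1), and 0.5 for both (1 1 1 1) and (0 0 1 1).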