1
Machine Learning
  • Version Spaces Learning

2
Introduction
  • Very many approaches to machine learning
  • Neural Net approaches
  • Symbolic approaches
  • version spaces
  • decision trees
  • knowledge discovery
  • data mining
  • speed up learning
  • inductive learning

3
Version Spaces
  • A concept learning technique based on refining
    models of the world.

4
Concept Learning
  • Example
  • a student has the following observations about
    having an allergic reaction after meals


Restaurant   Meal        Day       Cost       Reaction
Alma 3       breakfast   Friday    cheap      Yes
De Moete     lunch       Friday    expensive  No
Alma 3       lunch       Saturday  cheap      Yes
Sedes        breakfast   Sunday    cheap      No
Alma 3       breakfast   Sunday    expensive  No
  • Concept to learn: under which circumstances do I get an allergic reaction after meals?

5
In general
  • There is a set of all possible events.
  • Example: all possible meal events <Restaurant, Meal, Day, Cost>.
  • There is a Boolean function (implicitly) defined on this set.
  • Example: Reaction, i.e. does the meal cause an allergic reaction?
  • We have the value of this function for SOME examples only.

6
Pictured
Set of all possible events
7
Non-determinism
  • Many different ways to solve this!
  • How to choose ?

8
An obvious, but bad choice

Restaurant   Meal        Day       Cost       Reaction
Alma 3       breakfast   Friday    cheap      Yes
De Moete     lunch       Friday    expensive  No
Alma 3       lunch       Saturday  cheap      Yes
Sedes        breakfast   Sunday    cheap      No
Alma 3       breakfast   Sunday    expensive  No
  • The concept IS:
  • Alma 3 and breakfast and Friday and cheap, OR
  • Alma 3 and lunch and Saturday and cheap
  • Does NOT generalize the examples at all!

9
Pictured
  • Only the positive examples are positive !

Set of all possible events



10
Equally bad is

Restaurant Meal Day Cost
Reaction Alma 3 breakfast Friday cheap Yes De
Moete lunch Friday expensive No Alma
3 lunch Saturday cheap Yes Sedes breakfast Sun
day cheap No Alma 3 breakfast Sunday expensive N
o
  • The concept is anything EXCEPT
  • De Moete and lunch and Friday and
    expensive AND
  • Sedes and breakfast and Sunday and cheap AND
  • Alma 3 and breakfast and Sunday and expensive

11
Pictured
  • Everything except the negative examples is positive!

Set of all possible events
12
Solution: fix a language of hypotheses
  • We introduce a fixed language of concept descriptions.

13
Reaction - Example

Restaurant   Meal        Day       Cost       Reaction
Alma 3       breakfast   Friday    cheap      Yes
De Moete     lunch       Friday    expensive  No
Alma 3       lunch       Saturday  cheap      Yes
Sedes        breakfast   Sunday    cheap      No
Alma 3       breakfast   Sunday    expensive  No
  • Every hypothesis is a 4-tuple:
  • maximally specific, e.g. <Sedes, lunch, Monday, cheap>
  • combinations of ? and values are allowed, e.g. <De Moete, ?, ?, expensive>
  • or <?, lunch, ?, ?>
  • One more hypothesis: ⊥ (bottom, denoting "no example")

14
Hypotheses relate to sets of possible events
15
Expressive power of this hypothesis language
  • Conjunctions of explicit, individual properties
  • Examples:
  • <?, lunch, ?, cheap>: Meal = lunch AND Cost = cheap
  • <?, lunch, ?, ?>: Meal = lunch
  • In addition, there are the 2 special hypotheses:
  • <?, ?, ?, ?>: anything
  • ⊥: nothing
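As an illustration of these semantics (not part of the original slides), a hypothesis can be sketched in Python as a 4-tuple over <Restaurant, Meal, Day, Cost> whose entries are either a concrete value or the wildcard ?, together with a covers test; the names below are my own.

    # A hypothesis is a 4-tuple over <Restaurant, Meal, Day, Cost>;
    # '?' matches any value, and BOTTOM (the special hypothesis) covers nothing.
    BOTTOM = None

    def covers(h, x):
        """Return True if hypothesis h covers event x."""
        if h is BOTTOM:
            return False
        return all(hv == '?' or hv == xv for hv, xv in zip(h, x))

    # <?, lunch, ?, cheap> means: Meal = lunch AND Cost = cheap
    h = ('?', 'lunch', '?', 'cheap')
    print(covers(h, ('Alma 3', 'lunch', 'Saturday', 'cheap')))    # True
    print(covers(h, ('Alma 3', 'breakfast', 'Friday', 'cheap')))  # False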

16
Other languages of hypotheses are allowed
  • Example: identify the color of a given sequence of colored objects.
  • A useful language of hypotheses:

17
Important about hypothesis languages
  • They should have a specific <-> general ordering, corresponding to the set-inclusion of the events they cover.
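For the 4-tuple language used here, this ordering can be tested attribute-wise; a small illustrative sketch (the function name is mine): h1 is more general than or equal to h2 exactly when h1 covers every event that h2 covers.

    def more_general_or_equal(h1, h2):
        """True if h1 covers at least every event that h2 covers
        ('?'-wildcard tuples; None stands for the bottom hypothesis)."""
        if h2 is None:        # bottom is more specific than everything
            return True
        if h1 is None:        # bottom is only >= bottom
            return False
        return all(a == '?' or a == b for a, b in zip(h1, h2))

    print(more_general_or_equal(('?', 'lunch', '?', '?'),
                                ('?', 'lunch', '?', 'cheap')))   # True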

18
Defining Concept Learning
  • Given:
  • A set X of possible events
  • Ex.: Eat-events <Restaurant, Meal, Day, Cost>
  • An (unknown) target function c: X -> {+, -}
  • Ex.: Reaction: Eat-events -> {+, -}
  • A language of hypotheses H
  • Ex.: conjunctions, e.g. <?, lunch, Monday, ?>
  • A set of training examples D, with their value under c
  • Ex.: (<Alma 3, breakfast, Friday, cheap>, +), ...
  • Find:
  • A hypothesis h in H such that, for all x in D: x is covered by h <=> c(x) = +
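The "Find" condition amounts to a simple consistency check over D; a minimal sketch under the same tuple representation as above (helper names are mine, not from the slides):

    def covers(h, x):
        return h is not None and all(a == '?' or a == v for a, v in zip(h, x))

    def consistent(h, D):
        """h is consistent with D, a list of (event, label) pairs with label in {'+', '-'}:
        h covers an event exactly when its label is '+'."""
        return all(covers(h, x) == (label == '+') for x, label in D)

    D = [(('Alma 3', 'breakfast', 'Friday', 'cheap'), '+'),
         (('De Moete', 'lunch', 'Friday', 'expensive'), '-')]
    print(consistent(('Alma 3', '?', '?', 'cheap'), D))   # True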

19
The inductive learning hypothesis
If a hypothesis approximates the target
function well over a sufficiently large number of
examples, then the hypothesis will also
approximate the target function well on other
unobserved examples.
20
Find-S: a naïve algorithm

Initialize: h := ⊥
For each positive training example x in D do:
  If h does not cover x:
    Replace h by a minimal generalization of h that covers x
Return h
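A minimal Python sketch of Find-S for the 4-tuple language, assuming ⊥ is represented as None and labels are '+' / '-' (the representation and helper names are assumptions, not from the slides):

    def covers(h, x):
        return h is not None and all(a == '?' or a == v for a, v in zip(h, x))

    def minimal_generalization(h, x):
        """Smallest change to h that makes it cover x."""
        if h is None:                 # generalizing bottom yields x itself
            return tuple(x)
        return tuple(a if a == v else '?' for a, v in zip(h, x))

    def find_s(D):
        h = None                      # start with the bottom hypothesis
        for x, label in D:
            if label == '+' and not covers(h, x):
                h = minimal_generalization(h, x)
        return h

    D = [(('Alma 3', 'breakfast', 'Friday', 'cheap'), '+'),
         (('De Moete', 'lunch', 'Friday', 'expensive'), '-'),
         (('Alma 3', 'lunch', 'Saturday', 'cheap'), '+')]
    print(find_s(D))                  # ('Alma 3', '?', '?', 'cheap')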
21
Reaction example
Restaurant   Meal        Day       Cost       Reaction
Alma 3       breakfast   Friday    cheap      Yes
De Moete     lunch       Friday    expensive  No
Alma 3       lunch       Saturday  cheap      Yes
Sedes        breakfast   Sunday    cheap      No
Alma 3       breakfast   Sunday    expensive  No
No more positive examples: return h.
Generalization: replace a value by ?.
Minimal generalizations of ⊥: the individual events themselves.
22
Properties of Find-S
  • Non-deterministic:
  • Depending on H, there may be several minimal generalizations.

23
Properties of Find-S (2)
  • May pick an incorrect hypothesis (w.r.t. the negative examples).

24
Properties of Find-S (3)
  • Cannot detect inconsistency of the training data
  • Nor the inability of the language H to learn the concept

25
Nice about Find-S
  • It doesn't have to remember previous examples!

If h already covered the first 20 examples, then any generalization of h will cover them as well.
26
Dual Find-S
Initialize: h := <?, ?, ..., ?>
For each negative training example x in D do:
  If h does cover x:
    Replace h by a minimal specialization of h that does not cover x
Return h
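A sketch of the specialization step (illustrative only; the attribute domains below are assumptions based on the running example). Since several minimal specializations may exist, the dual algorithm has to pick one of them, which is where the non-determinism comes from:

    # Assumed attribute domains for <Restaurant, Meal, Day, Cost>.
    DOMAINS = [('Alma 3', 'De Moete', 'Sedes'),
               ('breakfast', 'lunch', 'dinner'),
               ('Monday', 'Tuesday', 'Wednesday', 'Thursday',
                'Friday', 'Saturday', 'Sunday'),
               ('cheap', 'expensive')]

    def minimal_specializations(h, x):
        """All minimal specializations of h that no longer cover the negative
        example x: replace one '?' by any value different from x's value."""
        result = []
        for i, (a, v) in enumerate(zip(h, x)):
            if a == '?':
                for value in DOMAINS[i]:
                    if value != v:
                        result.append(h[:i] + (value,) + h[i + 1:])
        return result

    specs = minimal_specializations(('?', '?', '?', '?'),
                                    ('De Moete', 'lunch', 'Friday', 'expensive'))
    print(len(specs))   # 11 candidates that exclude this negative example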
27
Reaction example
28
Version Spaces: the idea
  • BUT do NOT select 1 minimal
    generalization or specialization at each
    step, but keep track of ALL minimal
    generalizations or specializations

29
Version spaces: initialization
  • The sets G and S are initialized to contain only the most general hypothesis (<?, ?, ?, ?>) and the most specific hypothesis (⊥), respectively.

30
Negative examples
  • Replace the top hypothesis by ALL minimal
    specializations that DO NOT cover the negative
    example.

31
Positive examples
  • Replace the bottom hypothesis by ALL minimal
    generalizations that DO cover the positive
    example.

32
Later negative examples
  • Replace all hypotheses in G that cover a next negative example by ALL minimal specializations that DO NOT cover the negative example.

33
Later positive examples
  • Replace all hypotheses in S that do not cover a next positive example by ALL minimal generalizations that DO cover the example.

34
Optimization: negative examples
  • Only consider specializations of elements in G that are still more general than some specific hypothesis (in S).

35
Optimization: positive examples
  • Only consider generalizations of elements in S that are still more specific than some general hypothesis (in G).

36
Pruning: negative examples
  • The new negative example can also be used to prune all the hypotheses in S that cover the negative example.

37
Pruning: positive examples
  • The new positive example can also be used to prune all the hypotheses in G that do not cover the positive example.

38
Eliminate redundant hypotheses
  • If a hypothesis from G is more specific than another hypothesis from G: eliminate it!
  • Obviously also for S: eliminate any hypothesis from S that is more general than another hypothesis from S.

39
Convergence
  • Eventually, G and S MAY come to share a common element: then Version Spaces has converged to a solution.
  • The remaining examples still need to be verified against the solution.

40
Reaction example
  • Initialization

41
<Alma3, breakfast, Friday, cheap> +
  • Positive example: minimal generalization of ⊥

42
<DeMoete, lunch, Friday, expensive> -
  • Negative example: minimal specialization of <?, ?, ?, ?>
  • 15 possible specializations!

43
Result after example 2
44
<Alma3, lunch, Saturday, cheap> +
  • Positive example: minimal generalization of <Alma3, breakfast, Friday, cheap>

45
<Sedes, breakfast, Sunday, cheap> -
  • Negative example: minimal specialization of the general models

The only specialization that is introduced is
pruned, because it is more specific than
another general hypothesis
46
<Alma 3, breakfast, Sunday, expensive> -
  • Negative example: minimal specialization of <Alma3, ?, ?, ?>

Cheap food at Alma 3 produces the allergy!
47
Version Space Algorithm
Initialize: G := { <?, ?, ?, ?> } and S := { ⊥ }
Repeat, for each new training example:
If the example is positive:
  Generalize all hypotheses in S that do not cover the example yet, but ensure the following:
  - Only introduce minimal changes to the hypotheses.
  - Each new specific hypothesis is a specialization of some general hypothesis.
  - No new specific hypothesis is a generalization of some other specific hypothesis.
  Prune away all hypotheses in G that do not cover the example.
48
Version Space Algorithm (2)
If the example is negative:
  Specialize all hypotheses in G that cover the example, but ensure the following:
  - Only introduce minimal changes to the hypotheses.
  - Each new general hypothesis is a generalization of some specific hypothesis.
  - No new general hypothesis is a specialization of some other general hypothesis.
  Prune away all hypotheses in S that cover the example.
Until there are no more examples: report S and G.
OR until S or G becomes empty: report failure.
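Putting the two slides together, here is a compact, simplified Python sketch of the whole loop, under the same assumed representation as the earlier snippets (4-tuple hypotheses with '?' wildcards, None for ⊥, assumed attribute domains). It is a sketch of the idea, not the definitive algorithm from the course:

    DOMAINS = [('Alma 3', 'De Moete', 'Sedes'),
               ('breakfast', 'lunch', 'dinner'),
               ('Monday', 'Tuesday', 'Wednesday', 'Thursday',
                'Friday', 'Saturday', 'Sunday'),
               ('cheap', 'expensive')]

    def covers(h, x):
        return h is not None and all(a == '?' or a == v for a, v in zip(h, x))

    def more_general_or_equal(h1, h2):
        if h2 is None:
            return True
        if h1 is None:
            return False
        return all(a == '?' or a == b for a, b in zip(h1, h2))

    def generalize(h, x):
        if h is None:
            return tuple(x)
        return tuple(a if a == v else '?' for a, v in zip(h, x))

    def specializations(h, x):
        return [h[:i] + (val,) + h[i + 1:]
                for i, (a, v) in enumerate(zip(h, x)) if a == '?'
                for val in DOMAINS[i] if val != v]

    def candidate_elimination(D):
        G = [('?', '?', '?', '?')]     # boundary of most general hypotheses
        S = [None]                     # boundary of most specific hypotheses (bottom)
        for x, label in D:
            if label == '+':
                G = [g for g in G if covers(g, x)]                      # prune G
                S = [generalize(s, x) if not covers(s, x) else s for s in S]
                S = [s for s in S if any(more_general_or_equal(g, s) for g in G)]
            else:
                S = [s for s in S if not covers(s, x)]                  # prune S
                new_G = []
                for g in G:
                    if not covers(g, x):
                        new_G.append(g)
                    else:
                        new_G.extend(sp for sp in specializations(g, x)
                                     if any(more_general_or_equal(sp, s) for s in S))
                # drop G-hypotheses that are more specific than another G-hypothesis
                G = [g for g in new_G
                     if not any(h != g and more_general_or_equal(h, g) for h in new_G)]
        return S, G

    D = [(('Alma 3', 'breakfast', 'Friday', 'cheap'), '+'),
         (('De Moete', 'lunch', 'Friday', 'expensive'), '-'),
         (('Alma 3', 'lunch', 'Saturday', 'cheap'), '+'),
         (('Sedes', 'breakfast', 'Sunday', 'cheap'), '-'),
         (('Alma 3', 'breakfast', 'Sunday', 'expensive'), '-')]
    S, G = candidate_elimination(D)
    print(S, G)   # both converge to [('Alma 3', '?', '?', 'cheap')]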
49
Properties of VS
  • Symmetry
  • positive and negative examples are dealt with in
    a completely dual way.
  • Does not need to remember previous examples.

50
Termination
  • If it terminates because there are no more examples:
  • Example (spaces on termination):

Then all these hypotheses, and all intermediate hypotheses, are still correct descriptions covering the training data.
51
Termination (2)
  • If it terminates because S or G becomes empty:
  • Then either the training data is inconsistent, or the target concept cannot be represented in the hypothesis language H.
52
Which example next?
  • VS can decide itself which example would be most useful next.
  • It can query a user for the most relevant additional classification!
  • Example: <Alma3, lunch, Monday, expensive>
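One simple strategy (an illustrative sketch, not necessarily the one used in the course) is to ask about an event on which the remaining boundary hypotheses disagree, so that either answer eliminates part of the version space:

    from itertools import product

    def covers(h, x):
        return h is not None and all(a == '?' or a == v for a, v in zip(h, x))

    def informative_query(S, G, domains):
        """Return an event that some hypotheses in S u G cover and others do not;
        whatever the user answers, part of the version space is eliminated.
        (Simplified: only the boundary sets are inspected.)"""
        for x in product(*domains):
            votes = [covers(h, x) for h in S + G]
            if any(votes) and not all(votes):
                return x
        return None

    DOMAINS = [('Alma 3', 'De Moete', 'Sedes'),
               ('breakfast', 'lunch', 'dinner'),
               ('Monday', 'Tuesday', 'Wednesday', 'Thursday',
                'Friday', 'Saturday', 'Sunday'),
               ('cheap', 'expensive')]
    S = [('Alma 3', '?', '?', 'cheap')]
    G = [('Alma 3', '?', '?', '?'), ('?', '?', '?', 'cheap')]
    print(informative_query(S, G, DOMAINS))
    # ('Alma 3', 'breakfast', 'Monday', 'expensive'): covered by one hypothesis
    # in G but not by S, so either answer prunes the version space.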
53
Use of partially learned concepts
  • Example: <Alma3, lunch, Monday, cheap>

54
Use of partially learned concepts (2)
  • Example: <Sedes, lunch, Sunday, cheap>

55
Use of partially learned concepts (3)
  • Example: <Alma3, lunch, Monday, expensive>

56
Use of partially learned concepts (4)
  • Example: <Sedes, lunch, Monday, expensive>
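The example queries above can be summarized in a small illustrative sketch: predict + when every hypothesis in S covers the event, - when no hypothesis in G covers it, and "unknown" otherwise. The S and G below are the intermediate boundary sets of the running example as I reconstruct them, so they are an assumption:

    def covers(h, x):
        return h is not None and all(a == '?' or a == v for a, v in zip(h, x))

    def classify(S, G, x):
        """Classify event x with a partially learned version space."""
        if all(covers(s, x) for s in S):
            return '+'            # then every hypothesis in the version space covers x
        if not any(covers(g, x) for g in G):
            return '-'            # then no hypothesis in the version space covers x
        return 'unknown'          # the remaining hypotheses disagree on x

    S = [('Alma 3', '?', '?', 'cheap')]
    G = [('Alma 3', '?', '?', '?'), ('?', '?', '?', 'cheap')]
    print(classify(S, G, ('Alma 3', 'lunch', 'Monday', 'cheap')))       # '+'
    print(classify(S, G, ('Sedes', 'lunch', 'Monday', 'expensive')))    # '-'
    print(classify(S, G, ('Alma 3', 'lunch', 'Monday', 'expensive')))   # 'unknown'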

57
The relevance of inductive BIAS: choosing H
  • Our hypothesis language L fails to learn some concepts.
  • What about choosing a more expressive language H?
  • Assume H:

58
Inductive BIAS (2)
  • This language H allows us to represent ANY subset of the complete set of all events X.
  • But X has 126 elements...
  • ...so we can express 2^126 different hypotheses now!

59
Inductive BIAS (3)
  • Version Spaces using H

60
Inductive BIAS (4)
  • Resulting Version Spaces:
  • We haven't learned anything!
  • We have merely restated our positive and negative examples.

61
Shift of Bias
  • Practical approach to the Bias problem
  • Avoids the choice.
  • Gives the most general concept that can be
    learned.