Learning Classifier Systems

Transcript and Presenter's Notes

1
Learning Classifier Systems
  • Navigating the fitness landscape?
  • Why use evolutionary computation?
  • What's the concept of LCS?
  • Early pioneers
  • Competitive vs Grouped Classifiers
  • Beware the Swampy bits!
  • Niching
  • Selection for mating and effecting
  • Balance exploration with exploitation.
  • Balance the pressures
  • Zeroth level classifier system
  • The X-factor
  • Alphabet soup
  • New Slants - piecewise linear approximators
  • Why don't LCS rule the world?
  • Simplification schemes
  • Cognitive Classifiers
  • Neuroscience Inspirations
  • Application Domains

2
Welcome to LCS
  • Aims
  • To study and design Learning Classifier Systems,
    a modern Genetics-Based Machine Learning
    technique. The overall concept, selection of
    methods and important characteristics will be
    detailed. Insight will be given on how to tailor
    the technique to a wide range of application
    domains, from data mining to artificial cognitive
    systems.
  • Assessable learning outcomes
  • An understanding of the concept and important
    methods in Learning Classifier Systems, so that
    they may be applied to practical problems
    involving feedback from the environment.
  • Additional outcomes
  • During the course each student will have a basic
    LCS, which they will develop as important methods
    and ideas are described. The final exam will be
    computer-based where the developed system will be
    used to solve a practical problem.

3
Evolutionary Algorithms
  • Competition between solutions is used to improve
    future solutions.
  • Random variation is used to search the domain.
  • Successive generations lead to the analogy of
    Evolution and Survival of the Fittest.
  • Biological terms offer illustration, but the
    connection is often not a direct one!
  • Note
  • Parallel processing can be used, but this is not
    unique as Hill climbing methods can be made
    parallel.

4
Biological Illustration
  • Darwinian learning: the fit survive, whether they
    want to or not.
  • Baldwin effect: learning and other lifetime
    adaptations can affect the course of evolution.
  • Lamarckian evolution: lifetime characteristics
    can be transferred to offspring.
  • Therefore, artificial evolution can break these
    natural "laws"!
  • One or multiple parents
  • Age, gender and mating can be controlled.

5
Biological Illustration
  • A solution is stored as a chromosome,
  • with part-solutions known as alleles.
  • s1 = (0,1,0,0,0,1,1,1) → 01000111
  • Phenotype: an individual's expressed behaviour.
  • Genotype: an individual's genetic composition.
  • The same genotype may express different phenotypic
    behaviours depending on the interpretation, e.g.
  • 010011 → 0 in the multiplexer
  • 010011 → 19 in binary
  • Similarly, the same phenotype may arise from
    different genotypes (see the sketch below).
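
A minimal Python sketch of the two interpretations above, assuming the
standard 6-multiplexer (two address bits select one of four data bits);
the helper names are illustrative, not from the course:

def as_multiplexer_output(bits):
    """Interpret a 6-bit string as a 6-multiplexer: the 2 address bits
    select one of the 4 data bits, which becomes the expressed output."""
    address = int(bits[:2], 2)      # '01' -> 1
    data = bits[2:]                 # '0011'
    return int(data[address])       # data[1] -> 0

def as_binary_integer(bits):
    """Interpret the same string as an unsigned binary number."""
    return int(bits, 2)             # '010011' -> 19

genotype = "010011"
print(as_multiplexer_output(genotype))  # 0
print(as_binary_integer(genotype))      # 19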

6
Steps to Evolution
  • Select a problem domain then
  • Create a population of individuals that represent
    potential solutions
  • Evaluate the individuals
  • Introduce some selective pressure to promote
    better individuals (or eliminate lesser quality
    individuals)
  • Apply some variation operators to generate new
    solutions
  • Repeat

7
Steps to Evolution
  Procedure evolutionary algorithm
  Begin
    t := 0
    initialise P(t)
    evaluate P(t)
    while (not termination-condition) do
    begin
      t := t + 1
      select P(t) from P(t - 1)
      alter P(t)
      evaluate P(t)
    end
  End
  (A runnable sketch of this loop follows.)
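
A minimal, runnable Python sketch of the loop above, using a bit-string
representation and a OneMax fitness (count of ones); the parameter values
and operator choices are illustrative assumptions, not the course's settings:

import random

GENOME_LEN, POP_SIZE, GENERATIONS = 20, 30, 50
MUTATION_RATE = 1.0 / GENOME_LEN

def fitness(ind):
    return sum(ind)                                 # OneMax: maximise the number of 1s

def tournament(pop, k=3):
    return max(random.sample(pop, k), key=fitness)  # selective pressure via tournaments

def crossover(a, b):
    point = random.randrange(1, GENOME_LEN)         # one-point recombination
    return a[:point] + b[point:]

def mutate(ind):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in ind]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]                                # initialise P(t)

for generation in range(GENERATIONS):                                 # termination-condition
    parents = [tournament(population) for _ in range(POP_SIZE)]       # select P(t)
    population = [mutate(crossover(random.choice(parents),            # alter P(t)
                                   random.choice(parents)))
                  for _ in range(POP_SIZE)]
    if max(map(fitness, population)) == GENOME_LEN:                   # evaluate P(t)
        break

print("best fitness:", max(map(fitness, population)))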

8
Techniques
  • Genetic Algorithm,
  • Estimation of Distribution Algorithms,
  • Genetic Programming,
  • Evolutionary Programming,
  • Evolutionary Strategies,
  • Learning Classifier Systems
  • Artificial Immune systems
  • Ant algorithms
  • Gene expression programming

9
Theoretical Division of AI
  [Diagram: a taxonomy of AI techniques]
  • Knowledge Based: Expert Systems, Decision Support, Case Based Reasoning
  • Enumeratives (Guided / Non-Guided): Backtracking, Branch & Bound,
    Dynamic Programming
  • Intelligent Agents (inc. Artificial Life): Fuzzy Logic, Immune Systems,
    Cellular Automata, Ant Colony
  • Learning (Guided / Non-Guided): Tabu Search, Las Vegas, Simulated Annealing,
    Neural Networks (Hopfield, Kohonen Maps, Multilayer Perceptrons), LCS,
    Genetic / Evolutionary Computation
  • Genetic / Evolutionary Computation: Genetic Algorithms, Genetic Programming,
    Evolution Strategies, Evolutionary Programming

10
Solution History
  • EC: Evolutionary Computation
  • EP: Evolutionary Programming (Fogel, 1962)
  • ES: Evolutionary Strategies (Rechenberg, 1973)
  • GA: Genetic Algorithms / Broadcast Language (Holland, 1975)
  • LCS: Learning Classifier Systems (Holland and Reitman, 1978)
  • GP: Genetic Programming (Koza, 1992)

11
Problem Solutions
  • EVOLUTION ALGORITHMS ↔ PROBLEM SOLVING
  • Individual ↔ Candidate Solution
  • Fitness ↔ Quality
  • Environment ↔ Problem

12
Evolution Mechanism
  • [Diagram: the evolutionary cycle]
  • Population → (selection) → Parents → (recombination, mutation) →
    Offspring → (replacement) → Population

13
Evolution Mechanism
  • Increasing diversity by genetic operators
  • Recombination
  • Mutation
  • Decreasing diversity by selection
  • Parents
  • Replacement

14
Function Balance
  • GLOBAL v LOCAL
  • EXPLORE v EXPLOIT
  • TRAINING v TEST
  • GENERAL v SPECIFIC
  • ENUMERATED v HIERARCHICAL
  • EFFICIENTLY v EFFECTIVELY

15
Functionality
  • Search & Optimisation
  • Modelling
  • Knowledge-handling
  • Routing & Scheduling
  • Visualisation, Design, Querying, Learning
  • Game-playing, Adaptive-Control
  • Rule-Induction
  • Data-Access, Data-Manipulation
  • Prediction, Diagnosis
  • Classification

16
Domains of Applications
  • Numerical, Combinatorial Optimisation
  • System Modelling and Identification
  • Planning and Control
  • Engineering Design
  • Data Mining
  • Machine Learning
  • Artificial Life

17
Programming Languages
  • Assembler
  • C, C++, C#, Java and FORTRAN
  • Lisp, Smalltalk and PROLOG
  • Shells, e.g., Clementine
  • Toolboxes, e.g., Neural Networks in MATLAB.

18
Building blocks
  • Why do evolutionary algorithms work?
  • They are not proven!
  • theoretical assumptions do not relate to
    practical systems, e.g. infinite populations
  • The building block hypothesis attempts to explain
    their power
  • It relies on the concept of schemata
  • This approach gives useful insights, but it has
    flaws.

19
Building blocks
  • Short, low-order and highly fit schemata are
    sampled, recombined and resampled to form strings
    of potentially higher fitness
  • (Goldberg, 1989)
  • Searching using building blocks is a parallel
    operation, which can be very fast (see the
    schema-matching sketch below).
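
A minimal sketch of what a schema is: a template over the binary alphabet
in which '*' acts as a wildcard; the example schema is illustrative:

def matches(schema, string):
    """True if the string is an instance of the schema ('*' matches either bit)."""
    return all(s == '*' or s == c for s, c in zip(schema, string))

schema = "1**0"                 # order 2 (two fixed positions), defining length 3
print(matches(schema, "1010"))  # True
print(matches(schema, "0110"))  # False
print([f"{i:04b}" for i in range(16) if matches(schema, f"{i:04b}")])
# ['1000', '1010', '1100', '1110'] - the schema names a quarter of the space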

20
Building blocks
  • Schemata divide up the search space.
  • [Figure: example schemata partitioning the search space]

21
Building blocks
  • Schemata divide up the search space.
  • [Figure: example schemata partitioning the search space]

22
Multi-modal
  • The difference between multi-objective and
    multi-modal:
  • Multi-modal is where you have one (or more)
    objectives but many interesting solutions
  • (e.g. Task 2: identify the different bumps in
    the function; a sketch of a multi-modal function
    follows).
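
A minimal sketch of a multi-modal fitness: a single objective with several
bumps of differing height. The function is an illustrative stand-in, not the
course's bump problem:

import math

def multimodal(x):
    # one objective, many local optima ('bumps') of decreasing height
    return math.sin(5 * x) * math.exp(-0.1 * x)

xs = [i / 100 for i in range(1000)]
best = max(xs, key=multimodal)
print(best, multimodal(best))   # a greedy hill climber may settle on a lesser bump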

23
Explore/Exploit
  • The explore/exploit conflict is prevalent in many
    evolutionary algorithms.
  • The more you exploit the information that you
    have learnt, the more likely you are to get stuck
    in local optima and/or to be slow to find the
    global optimum.
  • The more you explore, the more likely you are
    never to make use of the information that you
    have discovered.

24
Elitism
  • 'Elitism' is used where a certain percentage of
    the population (usually the elite 'best'
    solutions) is kept at each generation.
  • Keeping roughly 20% of solutions in each
    generation is a rough guide (a sketch follows
    below).
  • (μ, λ) selection in Evolution Strategies is also
    worth investigating:
  • μ is the total number of parents
  • λ is the total number of offspring
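
A minimal sketch of elitism in a generational loop, using the 20% figure from
the slide; make_offspring stands in for whatever selection-plus-variation step
is used and is a hypothetical helper:

def next_generation(population, fitness, make_offspring, elite_fraction=0.20):
    """Carry the best elite_fraction of the population over unchanged,
    then fill the remainder of the new generation with offspring."""
    ranked = sorted(population, key=fitness, reverse=True)
    n_elite = max(1, int(elite_fraction * len(ranked)))
    elites = ranked[:n_elite]                        # survive untouched
    children = [make_offspring(ranked) for _ in range(len(ranked) - n_elite)]
    return elites + children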

25
Fitness Function
  • Underlying assumption that a fitness landscape
    can be characterized in terms of variables, and
    that there is an optimum solution (or multiple
    such optima) in terms of those variables.

26
Fitness Function
The animation below illustrates three different
search methods applied to the two-dimensional
version of the bump problem. The red trace is a
hill climber based on local linearization
followed by a simplex algorithm. The yellow trace
is the classic Hooke and Jeeves search. The
purple trace is for a GA with 100 member
populations run for 10 generations and the white
points show all 1000 members from this search.
  • Underlying assumption that a fitness landscape
    can be characterized in terms of variables, and
    that there is an optimum solution (or multiple
    such optima) in terms of those variables.

27
Fitness Function
  • For example, if one were trying to find the
    shortest path in a Traveling Salesman Problem,
    each solution would be a path.
  • The length of the path could be expressed as a
    number, which would serve as the solution's
    fitness.
  • The fitness landscape for this problem could be
    characterized as a hypersurface proportional to
    the path lengths in a space of possible paths.
  • The goal would be to find the globally shortest
    path in that space or, more practically, to find
    very short tours very quickly (see the sketch
    below).
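
A minimal sketch of such a fitness: a candidate solution is an ordering of the
cities and its fitness is the (to-be-minimised) length of the closed tour; the
city coordinates are illustrative assumptions:

import math

cities = [(0, 0), (1, 5), (4, 3), (6, 1), (3, 0)]   # hypothetical coordinates

def tour_length(tour):
    """Total length of the closed path visiting the cities in the given order."""
    total = 0.0
    for i in range(len(tour)):
        x1, y1 = cities[tour[i]]
        x2, y2 = cities[tour[(i + 1) % len(tour)]]  # wrap around to the start
        total += math.hypot(x2 - x1, y2 - y1)
    return total

print(tour_length([0, 1, 2, 3, 4]))   # lower is fitter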

28
Distance
  • The size of the search space is not determined by
    the problem alone,
  • but by the representation and how the encoding is
    handled (see the illustration below).
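
A minimal illustration of this point: the same task can induce search spaces of
very different sizes depending on the encoding chosen. The counts below are
generic examples, not tied to a specific course problem:

import math

n_cities = 10
# Permutation encoding: one gene per tour position.
print(math.factorial(n_cities))   # 3,628,800 candidate tours
# Binary encoding: each city index stored in 4 bits, 10 positions per string.
print(2 ** (4 * n_cities))        # 1,099,511,627,776 bit strings, most of them invalid tours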

29
Fitness Function
  • 10 Elements
  • Time to reach reference signal
  • Avoidance of overshoot
  • Settling time
  • Variance in a reference signal
  • Robustness to change in ?
  • Robustness to change in K
  • Response to step input
  • Response to ramp input
  • Stability with extreme spiked signal
  • Frequency response
  • Fitness is sum of elements.

30
Multi-objective
  • Multi-objective is where you have multiple
    objectives, which may have one solution
    (unimodal) or many solutions.
  • When you have multiple objectives and multiple
    interesting solutions, you often find that some
    solutions are dominated
  • (rough definition of dominated: a similar
    solution exists where one variable is altered
    such that at least one objective is improved and
    no other objective is worsened).
  • Non-dominated solutions (where no variable can be
    changed without worsening at least one objective)
    exist and lie on the Pareto front (see the sketch
    after the figure).

[Figure: Pareto front in objective space, with axes Cost 1 and Cost 2]
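
A minimal sketch of the dominance test described above, for two costs to be
minimised; the solution values are illustrative:

def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (minimisation of costs)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

solutions = [(1.0, 5.0), (2.0, 3.0), (3.0, 3.5), (4.0, 1.0)]   # (cost 1, cost 2)
pareto_front = [s for s in solutions
                if not any(dominates(other, s) for other in solutions if other != s)]
print(pareto_front)   # (3.0, 3.5) is dominated by (2.0, 3.0); the rest are non-dominated
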
31
Websites of the week
  • http://www.aaai.org/AITopics/html/genalg.html
  • http://www.ie.ncsu.edu/mirage/GAToolBox/gaot/
  • Don't rely upon Wikipedia, but it is a good
    jumping-off point!

32
Top-10 Evolutionary Computation Books
  • (Like any other chart, please nominate your own
    suggestions and vote for your favourites!)
  • 1. How to Solve It: Modern Heuristics
    Zbigniew Michalewicz, David B. Fogel
    Springer-Verlag Berlin and Heidelberg GmbH & Co. KG,
    Hardcover, August 2004 (a simple introduction to the field)
  • Publisher: Springer-Verlag Berlin and Heidelberg
    GmbH & Co. KG
  • ISBN 3540224947
  • 38.50
  • 2. Genetic Algorithms + Data Structures =
    Evolution Programs (Hardcover), Zbigniew
    Michalewicz (a bit more advanced)
  • Publisher: Springer-Verlag Berlin and Heidelberg
    GmbH & Co. KG
  • ISBN 3540606769
  • 24.50
  • 3. An Introduction to Genetic Algorithms (Complex
    Adaptive Systems), Melanie Mitchell (old but has
    stood the test of time)
  • Publisher: The MIT Press
  • ISBN 0262631857
  • 20.95
  • 4. Genetic Algorithms in Search, Optimization and
    Machine Learning, David E. Goldberg (a much-cited
    volume)

33
Top-10 Evolutionary Computation Books
  • 6. The Computational Beauty of Nature: Computer
    Explorations of Fractals, Chaos, Complex Systems
    and Adaptation (A Bradford Book), Gary William
    Flake (general, but interesting)
  • Publisher: MIT Press
  • ISBN 0262062003, 40.99
  • 7. Foundations of Genetic Programming, W.B.
    Langdon, R. Poli
  • - The LP: Genetic Programming: Routine
    Human-Competitive Machine Intelligence (Genetic
    Programming), John R. Koza
  • - The disco version: Genetic Programming: An
    Introduction (Hardcover), Wolfgang Banzhaf
  • Publisher: Morgan Kaufmann
  • ISBN 155860510X, 31.49
  • 8. The Theory of Evolution Strategies,
    Hans-Georg Beyer
  • (for those interested in evolutionary strategies)
  • Publisher: Springer-Verlag (February 15, 2001)
  • ISBN 3540672974, 51.85
  • 9. Handbook of Genetic Algorithms
  • Lawrence Davis (Editor) (old, but worth finding
    a copy)
  • Publisher: Van Nostrand Reinhold (January 1,
    1991)
  • ISBN 0442001738, 47.60