1
Genetic Algorithms: Big, Bad & Fast
  • David E. Goldberg
  • Illinois Genetic Algorithms Laboratory
  • University of Illinois at Urbana-Champaign
  • Urbana, IL 61801
  • deg@uiuc.edu

2
Evolution: Timeless, GAs: So 90s!!
  • GAs had their Warhol 15 in the 90s.
  • First-generation results were mixed.
  • Sometimes GAs worked and sometimes they didn't.
  • Little rhyme or reason.
  • New generation of GAs can solve large, hard
    problems quickly, reliably, and accurately.
  • New push for solving big hard problems.

3
Roadmap
  • From competence to efficiency.
  • Background: facetwise theory.
  • Competent GA design, then and now.
  • Two recent applications.
  • Simple fitness inheritance and substructural
    inheritance or variable structure endogenous
    fitness.
  • Supermultiplicative speedups through extreme
    integration.
  • Race to a billion.

4
Competent & Efficient GAs
  • Competence: solve hard problems quickly,
    reliably, and accurately (intractable to
    tractable).
  • Efficiency: speedups that move us from tractable
    to practical (parallel, time continuation,
    hybridization, evaluation relaxation).
  • Principled design for competence/efficiency:
  • Use problem decomposition.
  • Facetwise models.
  • Patchquilt integration using dimensional
    analysis.

5
GA Design Decomposition
  • Solutions are possible because of tractable
    design theory:
  • Understand building blocks (BBs).
  • Ensure BB supply.
  • Ensure BB growth.
  • Control BB speed.
  • Ensure good BB decisions.
  • Ensure good BB mixing (exchange).
  • Know BB challengers.
  • No one has ever proven that an airplane can fly.

6
Population Sizing Controls Quality
Harik, Cantú-Paz, Goldberg, Miller, 1997.
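The population-sizing plot from this slide is not reproduced in the transcript. For reference, the gambler's-ruin population-sizing model of Harik, Cantú-Paz, Goldberg & Miller (1997) gives a bound of roughly the following form; the notation below is my paraphrase of the model, not a quotation of the slide.

  % Gambler's-ruin population-sizing bound (paraphrase; notation assumed):
  %   k: building-block (BB) size, alpha: acceptable failure probability,
  %   sigma_BB: BB fitness standard deviation (collateral noise),
  %   m': number of competing BB partitions, d: fitness signal between
  %   the best and second-best BB.
  n \approx -2^{k-1} \ln(\alpha) \, \frac{\sigma_{BB} \sqrt{\pi m'}}{d}

Larger populations buy better building-block decisions in the presence of collateral noise, which is why population size controls solution quality.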
7
Control Maps Guide Parameter Choice
  • Easy problems are no problem.
  • GA has a large sweet spot.
  • A monkey can set crossover probability and
    selection pressure.

Goldberg, Deb, Thierens, 1993
8
Simple GAs Are Mixing Limited
  • With growing difficulty, sweet spot vanishes.
  • Or populations must grow exponentially.

Thierens & Goldberg, 1993
9
Competent GAs Then
  • 1993: the fast messy GA.
  • Original mGA complexity estimated at O(l^5).
  • Compares favorably to hillclimbing, too
    (Mühlenbein, 1992).

Goldberg, Deb, Kargupta, Harik, 1993
10
Competent GAs Now: hBOA
  • Perspective: selection + population + genetic
    operators → a probability distribution over the
    best points.
  • Replace genetics with probabilistic model
    building: PMBGA or EDA (a minimal sketch follows
    after this list).
  • 3 main elements:
  • Decomposition (structural learning)
  • Learn what to mix and what to keep intact.
  • Representation of BBs (chunking)
  • Means of representing alternative solutions.
  • Diversification of BBs (niching)
  • Preserve alternative chunks of solutions.
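Below is a minimal Python sketch of the select-build-sample loop shared by PMBGAs/EDAs. It uses only a univariate (UMDA-style) model, so it is a deliberate caricature: hBOA instead learns a hierarchical Bayesian network with local structures and uses niching. All names in the sketch are illustrative, not from the slide.

  import random

  def umda(fitness, n_bits, pop_size=200, n_gens=50, trunc=0.5):
      # Minimal univariate EDA: select -> build model -> sample.
      pop = [[random.randint(0, 1) for _ in range(n_bits)]
             for _ in range(pop_size)]
      for _ in range(n_gens):
          # Selection: keep the best fraction of the population.
          pop.sort(key=fitness, reverse=True)
          selected = pop[:max(2, int(trunc * pop_size))]
          # Model building: marginal probability of a 1 at each position.
          p = [sum(ind[i] for ind in selected) / len(selected)
               for i in range(n_bits)]
          # Sampling: draw a fresh population from the model.
          pop = [[1 if random.random() < p[i] else 0
                  for i in range(n_bits)]
                 for _ in range(pop_size)]
      return max(pop, key=fitness)

  # Example: maximize OneMax (number of ones in a 50-bit string).
  print(sum(umda(fitness=sum, n_bits=50)))

Because the univariate model assumes the bits are independent, it cannot keep linked building blocks together; learning that structure (decomposition) is exactly what separates competent GAs such as hBOA from this sketch.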

11
Outline of BOA Structure
12
Results on Spin Glasses
  • Testing on adversarially designed test functions.
  • hBOA works as well as tailored heuristics.
  • Polynomial (subcubic) convergence.

Pelikan et al. (2002)
13
Results on Antenna Systems
14
hBOA Beats sGA in Constrained Feed Network Design
Santarelli, Yu, Goldberg, 2005
15
GP in Materials Modeling
  • Cost-effective simulation methods:
  • Simulate from picoseconds to several seconds.
  • Molecular dynamics (MD): nanoseconds.
  • Many realistic processes are inaccessible.
  • Kinetic Monte Carlo (KMC): seconds.
  • Infeasible to compute all jump frequencies a
    priori.
  • Existing methods fall 3-6 orders short.
  • Efficient hybrids of MD and KMC.
  • Effective practical multi-timescale modeling.

16
Genetic Programming (GP)
17
Tailor-made Statistical Mechanics
  • Use PES predicted by GP in kinetic Monte Carlo.
  • Real time in KMC (Fichthorn & Weinberg, 1991).
  • Speed-up over MD:
  • 10^9 at 300 K
  • 10^5 at 550 K
  • 10^3 at 900 K
  • Less CPU time than MD.

18
Surrogate Fitness Models
  • Already taking samples. Why not use them to
    build and fit internal fitness models?
  • Fitness inheritance (Smith et al., 1994).
  • Evaluate the entire initial population.
  • Choose an inheritance proportion, p_i.
  • After that:
  • Estimate the fitness of a p_i proportion of
    offspring during crossover.
  • Each such offspring receives the average (or
    weighted average) of its parents' fitnesses.
  • Evaluate the remaining (1 - p_i) proportion of
    offspring (see the sketch below).
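A minimal sketch of the inheritance step described above, assuming a generational GA in which each offspring knows its two parents' fitnesses; the function and variable names are hypothetical, not from the slide.

  import random

  def inherited_fitness(parent_fits, weights=(0.5, 0.5)):
      # Simple fitness inheritance: (weighted) average of the parents' fitnesses.
      return sum(w * f for w, f in zip(weights, parent_fits))

  def assign_offspring_fitness(offspring, true_fitness, p_i):
      # offspring: list of (child, (parent1_fitness, parent2_fitness)) pairs.
      # With probability p_i a child inherits its fitness; otherwise it is
      # evaluated with the real (expensive) fitness function.
      scored = []
      for child, parent_fits in offspring:
          if random.random() < p_i:
              fit = inherited_fitness(parent_fits)
          else:
              fit = true_fitness(child)
          scored.append((child, fit))
      return scored

  # Example with OneMax: two children, inheritance proportion 0.8.
  kids = [([1, 0, 1, 1], (3, 2)), ([0, 0, 1, 0], (3, 2))]
  print(assign_offspring_fitness(kids, true_fitness=sum, p_i=0.8))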

19
Modeling Simple Fitness Inheritance
  • Optimal inheritance proportion
  • Maximum speed-up
  • Similar results for multiobjective GAs (Chen et
    al., 2002; Bui et al., 2005).

Sastry, Pelikan & Goldberg, 2001
20
Endogenous Substructural Fitness Model
  • Identify key sub-structures of the search
    problem.
  • Estimate the fitness of sub-structure instances.
  • Individual fitness as a function of
    sub-structural fitness values:
  • Sum of fitness estimates of sub-structure
    instances.
  • Can use other, more complex methods (a toy
    sketch follows below).
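A toy Python sketch of the sum-of-substructure-estimates idea above, assuming the decomposition into gene groups is already known; in the actual method the estimates are attached to the substructures of the learned model (e.g., ECGA groups or Bayesian-network conditionals), and every name below is illustrative.

  from collections import defaultdict

  class SubstructuralSurrogate:
      def __init__(self, partitions):
          # partitions: known gene groups, e.g. [(0, 1), (2, 3)].
          self.partitions = partitions
          self.stats = defaultdict(lambda: [0.0, 0])  # instance -> [fitness sum, count]

      def update(self, individual, fitness):
          # Credit each substructure instance with an equal share of the
          # observed fitness (a simplifying assumption in this sketch).
          share = fitness / len(self.partitions)
          for part in self.partitions:
              key = (part, tuple(individual[i] for i in part))
              self.stats[key][0] += share
              self.stats[key][1] += 1

      def estimate(self, individual):
          # Estimated fitness = sum of the per-substructure averages.
          total = 0.0
          for part in self.partitions:
              key = (part, tuple(individual[i] for i in part))
              s, n = self.stats[key]
              total += s / n if n else 0.0
          return total

  # Usage: learn from evaluated solutions, then estimate new ones cheaply.
  surrogate = SubstructuralSurrogate(partitions=[(0, 1), (2, 3)])
  surrogate.update([1, 1, 0, 0], fitness=2.0)
  print(surrogate.estimate([1, 1, 0, 0]))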

21
Extending BNs With Fitness Info
  • Basic idea:
  • Don't work only with conditional probabilities.
  • Add fitness info for fitness estimation.
  • Fitness info attached to p(X|Π_X), denoted by
    f(X|Π_X).
  • Contribution of X restricted by Π_X (a
    reconstructed form of the estimator is sketched
    below).

Avg. fitness of solutions with Π_X = π_X
Avg. fitness of solutions with X = x and Π_X = π_X
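The slide's equation itself is not in the transcript; the LaTeX below is a reconstruction of the estimator this notation suggests, consistent with the reference cited beneath, and should be read as an assumption rather than a verbatim copy.

  % Estimated fitness of a candidate x = (x_1, ..., x_n), where \Pi_{X_i} are
  % the parents of X_i in the Bayesian network, \bar{f} is the average fitness
  % of all evaluated solutions, \bar{f}(x_i \mid \pi_{X_i}) the average fitness
  % of solutions with X_i = x_i and \Pi_{X_i} = \pi_{X_i}, and
  % \bar{f}(\pi_{X_i}) that of solutions with \Pi_{X_i} = \pi_{X_i}.
  f_{\mathrm{est}}(x_1, \dots, x_n) = \bar{f}
    + \sum_{i=1}^{n} \Bigl( \bar{f}(x_i \mid \pi_{X_i}) - \bar{f}(\pi_{X_i}) \Bigr)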
Pelikan, Sastry, Goldberg, 2004
22
Speedups are Significant
  • Speed-up: ratio of the number of function
    evaluations without efficiency enhancement to
    that with it (see the note below).
  • Only 1-15% of individuals need evaluation.
  • Speed-up: 30-53.
  • Have decision-tree and ECGA versions.

Fitness modeling in BOA
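As a rough sanity check on the definition above (my simplification, not from the slide): if the only change is that a fraction p_i of new solutions receive modeled rather than real evaluations, while population size and convergence time stay fixed, then

  \eta = \frac{N_{\mathrm{fe}}(\text{no efficiency enhancement})}
              {N_{\mathrm{fe}}(\text{with efficiency enhancement})}
       \approx \frac{1}{1 - p_i}

In practice the surrogate also changes the required population size and number of generations, so observed speed-ups need not match this idealized ratio.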
23
Extreme Integration of Structural Learning
  • Naïve view:
  • Build a competent GA.
  • Achieve efficiency enhancement.
  • Get multiplicative speedup.
  • Current view:
  • Extreme integration of structural learning
    throughout algorithm.
  • Yields supermultiplicative speedups.
  • Can be extended to parallelism, time
    continuation, and hybrids, too.

24
The Race to a Billion Bits
  • Currently can do hard problems up to 1000-10000
    bits to global optimality.
  • The intelligent design community uses this as
    evidence of the ineffectiveness of evolution.
  • New results should make principled scale up to a
    billion bits straightforward.
  • Need integrated model building, surrogates,
    parallelism, hybrids, and time continuation.
  • The race is on.
  • Can carry over to other problem types, too.

25
Summary & Conclusions
  • Not your grandmother's GA.
  • Increasingly solving large, hard problems of
    practical interest.
  • Extreme integration of model building and
    efficiency enhancement is a key to a world of
    routine billion-bit solutions.

26
More Information
  • Web site: http://www-illigal.ge.uiuc.edu/
  • Goldberg, D. E. (2002). The design of innovation:
    Lessons from and for competent genetic
    algorithms. Boston, MA: Kluwer Academic
    Publishers.
  • http://www-doi.ge.uiuc.edu/
  • Consult book for details.