Evolutionary Computational Intelligence (PowerPoint presentation transcript)
1
Evolutionary Computational Intelligence
  • Lecture 4
  • Real valued GAs and ES

Ferrante Neri, University of Jyväskylä
2
Real valued problems
  • Many problems occur as real valued problems, e.g. continuous parameter optimisation f : ℝⁿ → ℝ
  • Illustration: Ackley's function (often used in EC)

3
Mapping real values on bit strings
  • z ∈ [x, y] ⊆ ℝ represented by {a1,...,aL} ∈ {0,1}^L
  • The coding [x, y] → {0,1}^L must be invertible (one phenotype per genotype)
  • The decoding Γ: {0,1}^L → [x, y] defines the representation
  • Only 2^L values out of infinitely many are represented
  • L determines the maximum possible precision of the solution
  • High precision ⇒ long chromosomes (slow evolution)
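As a rough illustration of this coding, here is a minimal Python sketch (the function name `decode` and the evenly spaced grid are assumptions, not from the slides): it decodes a bit string of length L onto [x, y] and shows how L bounds the attainable precision.

```python
def decode(bits, lo, hi):
    """Map a bit string a1..aL in {0,1}^L onto a value in [lo, hi]."""
    L = len(bits)
    as_int = int("".join(map(str, bits)), 2)     # one of the 2^L representable codes
    return lo + as_int * (hi - lo) / (2**L - 1)  # evenly spaced grid over [lo, hi]

# Only 2^L of the infinitely many reals in [lo, hi] are representable:
# the grid spacing (hi - lo) / (2^L - 1) is the maximum precision.
step_8  = (10 - (-10)) / (2**8 - 1)    # ~0.078 with an 8-bit chromosome
step_16 = (10 - (-10)) / (2**16 - 1)   # ~0.0003 with a 16-bit chromosome
```

Doubling L squares the number of representable points, which is exactly the precision-versus-chromosome-length trade-off noted above.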

4
Floating point mutations 1
  • General scheme of floating point mutations: x = ⟨x1,...,xn⟩ → x' = ⟨x1',...,xn'⟩, with xi, xi' ∈ [Li, Ui]
  • Uniform mutation: each selected xi' is drawn uniformly at random from [Li, Ui]
  • Analogous to bit-flipping (binary) or random resetting (integers)

5
Floating point mutations 2
  • Non-uniform mutations
  • Many methods proposed, such as a time-varying range of change etc.
  • Most schemes are probabilistic but usually only make a small change to the value
  • Most common method: add a random deviate, drawn from a Gaussian distribution N(0, σ), to each variable separately, then curtail to the allowed range
  • The standard deviation σ controls the amount of change (about 2/3 of the deviates lie within (−σ, +σ))
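A minimal sketch of this Gaussian scheme (the function name and the clipping-based "curtail" are illustrative assumptions):

```python
import random

def gaussian_mutate(x, sigma, lo, hi):
    """Add a deviate drawn from N(0, sigma) to each variable separately,
    then curtail the result to the allowed range [lo, hi]."""
    return [min(hi, max(lo, xi + random.gauss(0.0, sigma))) for xi in x]

random.seed(1)
child = gaussian_mutate([0.0, 5.0, -5.0], sigma=0.5, lo=-5.0, hi=5.0)
```

Small σ keeps most offspring near the parent; variables already at a boundary can only move inward.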

6
Crossover operators for real valued GAs
  • Discrete
  • each allele value in offspring z comes from one of its parents (x, y) with equal probability: zi = xi or zi = yi
  • Could use n-point or uniform crossover
  • Intermediate
  • exploits the idea of creating children between the parents (hence a.k.a. arithmetic recombination)
  • zi = α·xi + (1 − α)·yi, where 0 ≤ α ≤ 1
  • The parameter α can be
  • constant: uniform arithmetical crossover
  • variable (e.g. depending on the age of the population)
  • picked at random every time
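The intermediate (arithmetic) rule above can be sketched in a few lines of Python (illustrative, with a constant α):

```python
def arithmetic_crossover(x, y, alpha=0.5):
    """Intermediate recombination: z_i = alpha * x_i + (1 - alpha) * y_i,
    so the child lies between its parents (0 <= alpha <= 1)."""
    return [alpha * xi + (1 - alpha) * yi for xi, yi in zip(x, y)]

child = arithmetic_crossover([0.0, 2.0], [4.0, 6.0], alpha=0.5)  # midpoint of the parents
```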

7
Single arithmetic crossover
  • Parents ⟨x1,...,xn⟩ and ⟨y1,...,yn⟩
  • Pick a single gene k at random
  • child1 is ⟨x1, ..., α·yk + (1 − α)·xk, ..., xn⟩ (only gene k changes)
  • reverse for the other child, e.g. with α = 0.5

8
Simple arithmetic crossover
  • Parents ⟨x1,...,xn⟩ and ⟨y1,...,yn⟩
  • Pick a random gene k; after this point mix the values
  • child1 is ⟨x1, ..., xk, α·y(k+1) + (1 − α)·x(k+1), ..., α·yn + (1 − α)·xn⟩
  • reverse for the other child, e.g. with α = 0.5

9
Whole arithmetic crossover
  • Parents ⟨x1,...,xn⟩ and ⟨y1,...,yn⟩
  • child1 is α·x + (1 − α)·y, the weighted average of the whole parent vectors
  • reverse for the other child, e.g. with α = 0.5

10
Box crossover
  • Parents ⟨x1,...,xn⟩ and ⟨y1,...,yn⟩
  • child1 is ⟨a1·x1 + (1 − a1)·y1, ..., an·xn + (1 − an)·yn⟩
  • where a is a VECTOR of numbers in [0,1]
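A minimal sketch of box crossover (illustrative names; each gene draws its own weight a_i, unlike whole arithmetic crossover, where a single α is shared by all genes):

```python
import random

def box_crossover(x, y):
    """z_i = a_i * x_i + (1 - a_i) * y_i with a fresh a_i in [0,1] per gene,
    so the child can land anywhere in the box spanned by the parents."""
    child = []
    for xi, yi in zip(x, y):
        a = random.random()            # independent weight for this coordinate
        child.append(a * xi + (1 - a) * yi)
    return child

random.seed(2)
child = box_crossover([0.0, 0.0], [1.0, 2.0])
```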

11
Comparison of the crossovers
  • Arithmetic crossover works on the line that connects the two parents
  • Box crossover works in the hyper-rectangle that has the two parents at its vertices

12
Evolution Strategies
Ferrante Neri, University of Jyväskylä
13
ES quick overview
  • Developed in Germany in the 1970s
  • Early names: I. Rechenberg, H.-P. Schwefel
  • Typically applied to
  • numerical optimisation
  • Attributed features
  • fast
  • good optimizer for real-valued optimisation
  • relatively much theory
  • Special
  • self-adaptation of (mutation) parameters is standard

14
ES technical summary tableau
  • Representation: real-valued vectors
  • Recombination: discrete or intermediary
  • Mutation: Gaussian perturbation
  • Parent selection: uniform random
  • Survivor selection: (μ,λ) or (μ+λ)
  • Speciality: self-adaptation of mutation step sizes
15
Representation
  • Chromosomes consist of three parts
  • Object variables: x1,...,xn
  • Strategy parameters
  • Mutation step sizes: σ1,...,σnσ
  • Rotation angles: α1,...,αnα
  • Not every component is always present
  • Full size: ⟨x1,...,xn, σ1,...,σn, α1,...,αk⟩
  • where k = n·(n−1)/2 (the number of (i,j) pairs)

16
Parent selection
  • Parents are selected by uniform random distribution whenever an operator needs one or more
  • Thus ES parent selection is unbiased: every individual has the same probability of being selected
  • Note that in ES "parent" means any population member (in a GA it means a population member selected to undergo variation)

17
Mutation
  • Main mechanism: changing a value by adding random noise drawn from a normal distribution
  • xi' = xi + N(0, σ)
  • Key idea
  • σ is part of the chromosome ⟨x1,...,xn, σ⟩
  • σ is also mutated into σ' (see later how)
  • Thus the mutation step size σ is coevolving with the solution x

18
Mutate σ first
  • Net mutation effect: ⟨x, σ⟩ → ⟨x', σ'⟩
  • Order is important
  • first σ → σ' (see later how)
  • then x → x' = x + N(0, σ')
  • Rationale: the new ⟨x', σ'⟩ is evaluated twice
  • Primary: x' is good if f(x') is good
  • Secondary: σ' is good if the x' it created is good
  • Reversing the mutation order, this would not work

19
Mutation case 0: the 1/5 success rule
  • z values are drawn from a normal distribution N(μ, σ)
  • the mean μ is set to 0
  • the variation σ is called the mutation step size
  • σ is varied on the fly by the 1/5 success rule
  • This rule resets σ after every k iterations by
  • σ = σ / c if ps > 1/5
  • σ = σ · c if ps < 1/5
  • σ = σ if ps = 1/5
  • where ps is the fraction of successful mutations and 0.8 ≤ c < 1
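The rule can be written directly (a sketch; the concrete choice c = 0.9 is an assumption within the stated range for c):

```python
def one_fifth_rule(sigma, ps, c=0.9):
    """Reset the step size from the fraction ps of successful mutations:
    many successes -> steps too timid, enlarge sigma; few -> shrink it."""
    if ps > 1/5:
        return sigma / c   # widen the search
    if ps < 1/5:
        return sigma * c   # narrow the search
    return sigma
```

Applied every k iterations, this drives the success rate of mutation toward the theoretically motivated 1/5 target.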

20
Mutation case 1: Uncorrelated mutation with one σ
  • Chromosomes: ⟨x1,...,xn, σ⟩
  • σ' = σ · exp(τ · N(0,1))
  • xi' = xi + σ' · Ni(0,1)
  • Typically the learning rate τ ∝ 1/n^½
  • And we have a boundary rule: σ' < ε0 ⇒ σ' = ε0
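A sketch of this self-adaptive scheme (the function name and the concrete choice τ = 1/√n are illustrative; the slides only state the proportionality):

```python
import math, random

def mutate_one_sigma(x, sigma, eps0=1e-6):
    """Mutate sigma first, then use the new sigma for the object variables:
    sigma' = sigma * exp(tau * N(0,1));  x_i' = x_i + sigma' * N_i(0,1)."""
    tau = 1.0 / math.sqrt(len(x))                    # learning rate ~ 1/n^(1/2)
    new_sigma = sigma * math.exp(tau * random.gauss(0, 1))
    new_sigma = max(new_sigma, eps0)                 # boundary rule: sigma' < eps0 -> eps0
    return [xi + new_sigma * random.gauss(0, 1) for xi in x], new_sigma

random.seed(3)
child, child_sigma = mutate_one_sigma([0.0, 0.0, 0.0, 0.0], sigma=1.0)
```

Note the order: σ is updated before x, so the fitness of the child also reflects the quality of the new step size.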

21
Mutants with equal likelihood
  • Circle: mutants having the same chance to be created lie on a circle around the parent

22
Mutation case 2: Uncorrelated mutation with n σ's
  • Chromosomes: ⟨x1,...,xn, σ1,...,σn⟩
  • σi' = σi · exp(τ' · N(0,1) + τ · Ni(0,1))
  • xi' = xi + σi' · Ni(0,1)
  • Two learning rate parameters
  • τ': overall learning rate
  • τ: coordinate-wise learning rate
  • τ' ∝ 1/(2·n)^½ and τ ∝ 1/(2·n^½)^½
  • And σi' < ε0 ⇒ σi' = ε0
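The n-step-size variant keeps one σ_i per coordinate and shares the overall τ'·N(0,1) draw across all of them (a sketch; the concrete learning rates follow the stated proportionalities):

```python
import math, random

def mutate_n_sigmas(x, sigmas, eps0=1e-6):
    """sigma_i' = sigma_i * exp(tau' * N(0,1) + tau * N_i(0,1));
       x_i' = x_i + sigma_i' * N_i(0,1)."""
    n = len(x)
    tau_prime = 1.0 / math.sqrt(2 * n)          # overall learning rate
    tau = 1.0 / math.sqrt(2 * math.sqrt(n))     # coordinate-wise learning rate
    common = random.gauss(0, 1)                 # one overall draw, shared by every i
    new_sigmas = [max(eps0, s * math.exp(tau_prime * common + tau * random.gauss(0, 1)))
                  for s in sigmas]
    new_x = [xi + si * random.gauss(0, 1) for xi, si in zip(x, new_sigmas)]
    return new_x, new_sigmas

random.seed(4)
child, child_sigmas = mutate_n_sigmas([0.0, 0.0, 0.0], [1.0, 0.5, 2.0])
```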

23
Mutants with equal likelihood
  • Axis-aligned ellipse: mutants having the same chance to be created lie on an ellipse around the parent

24
Mutation case 3: Correlated mutations
  • Chromosomes: ⟨x1,...,xn, σ1,...,σn, α1,...,αk⟩
  • where k = n·(n−1)/2
  • and the covariance matrix C is defined as
  • cii = σi²
  • cij = 0 if i and j are not correlated
  • cij = ½·(σi² − σj²)·tan(2·αij) if i and j are correlated
  • Note the numbering / indices of the α's

25
Correlated mutations
  • The mutation mechanism is then
  • σi' = σi · exp(τ' · N(0,1) + τ · Ni(0,1))
  • αj' = αj + β · N(0,1)
  • x' = x + N(0, C')
  • x stands for the vector ⟨x1,...,xn⟩
  • C' is the covariance matrix C after mutation of the α values
  • τ' ∝ 1/(2·n)^½, τ ∝ 1/(2·n^½)^½ and β ≈ 5°
  • σi' < ε0 ⇒ σi' = ε0 and
  • |αj'| > π ⇒ αj' = αj' − 2π·sign(αj')
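To show only the N(0, C) draw, here is a two-variable sketch (everything named here is illustrative; the σ and α self-adaptation steps are omitted, and the guard on the Cholesky factor is an ad hoc assumption, since the tan-based off-diagonal entry can make C degenerate):

```python
import math, random

def correlated_mutation_2d(x, sigmas, alpha):
    """Two-variable case: build C from the step sizes and one rotation angle,
    then draw the perturbation from N(0, C) via a hand-rolled Cholesky factor."""
    s1, s2 = sigmas
    c11, c22 = s1 ** 2, s2 ** 2
    c12 = 0.5 * (s1 ** 2 - s2 ** 2) * math.tan(2 * alpha)  # off-diagonal entry of C
    l11 = math.sqrt(c11)                                   # C = L * L^T (Cholesky)
    l21 = c12 / l11
    l22 = math.sqrt(max(c22 - l21 ** 2, 1e-12))            # ad hoc guard: keep C usable
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    return [x[0] + l11 * z1, x[1] + l21 * z1 + l22 * z2]

random.seed(5)
offspring = correlated_mutation_2d([0.0, 0.0], (1.0, 0.5), alpha=0.3)
```

For general n one would assemble the full C from all k angles and factor it (Cholesky or eigendecomposition) before sampling.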

26
Mutants with equal likelihood
  • Rotated ellipse: mutants having the same chance to be created lie on a rotated ellipse around the parent

27
Recombination
  • Creates one child
  • Acts per variable / position by either
  • Averaging parental values (intermediary recombination), or
  • Selecting one of the parental values (discrete recombination)
  • From two or more parents by either
  • Using two selected parents to make the whole child (local), or
  • Selecting two parents anew for each position (global)
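The four combinations can be sketched as one parameterised routine (the labels "discrete"/"intermediary" and "local"/"global" follow standard ES usage; the function itself is illustrative):

```python
import random

def es_recombine(parents, how="discrete", scope="global"):
    """Create one child. 'discrete' selects one parental value per position,
    'intermediary' averages them; 'local' fixes two parents for the whole
    child, 'global' re-selects two parents anew for every position."""
    n = len(parents[0])
    fixed = random.sample(parents, 2)
    child = []
    for i in range(n):
        p, q = fixed if scope == "local" else random.sample(parents, 2)
        child.append(random.choice((p[i], q[i])) if how == "discrete"
                     else (p[i] + q[i]) / 2)
    return child

random.seed(6)
pop = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]
d_child = es_recombine(pop, how="discrete")
i_child = es_recombine(pop, how="intermediary", scope="local")
```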

28
Names of recombinations
  • zi = xi or yi, two fixed parents: local discrete recombination
  • zi = xi or yi, two parents picked per position: global discrete recombination
  • zi = (xi + yi)/2, two fixed parents: local intermediary recombination
  • zi = (xi + yi)/2, two parents picked per position: global intermediary recombination
29
Survivor selection
  • Applied after creating λ children from the μ parents by mutation and recombination
  • Deterministically chops off the bad stuff
  • The basis of selection is either
  • The set of children only: (μ,λ)-selection
  • The set of parents and children: (μ+λ)-selection
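The two schemes can be sketched together (a minimal sketch; treating lower fitness as better is an assumption for illustration):

```python
def survivor_selection(parents, children, fitness, mu, plus=False):
    """(mu,lambda): rank the children only; (mu+lambda): rank parents and
    children together. Deterministically keep the mu best (lower is better)."""
    pool = parents + children if plus else children
    return sorted(pool, key=fitness)[:mu]

comma = survivor_selection([3.0], [1.0, 4.0, 2.0], fitness=abs, mu=2)
plus_ = survivor_selection([0.5], [1.0, 4.0, 2.0], fitness=abs, mu=2, plus=True)
# comma-selection "forgets" the parent; plus-selection is elitist and keeps 0.5
```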

30
Survivor selection
  • (μ+λ)-selection is an elitist strategy
  • (μ,λ)-selection can "forget"
  • Often (μ,λ)-selection is preferred, because it is
  • Better at leaving local optima
  • Better at following moving optima
  • Using the + strategy, bad σ values can survive in ⟨x,σ⟩ too long if their host x is very fit
  • Selection pressure in ES is very high

31
Survivor Selection
  • On the other hand, (?,?)-selection can lead to
    the loss of genotypic information
  • The (??)-selection can be preferred when are
    looking for marginal enhancements