Title: CAP6938 Neuroevolution and Developmental Encoding: Evolutionary Computation Theory
1. CAP6938 Neuroevolution and Developmental Encoding: Evolutionary Computation Theory
- Dr. Kenneth Stanley
- September 13, 2006
2. Schema Theory (Holland 1975)
- A building block is a set of genes with good values
- Schemas are a formalization of building blocks
- Schemas are bit strings with *s (wildcards); see the sketch below
- 1****0 is all 6-bit strings beginning with 1 and ending with 0
- Order 2: 2 defined bits
- A schema defines a hyperplane
- Example: 1****0 is a hyperplane of the 6-dimensional Boolean space
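These definitions are easy to make concrete. Below is a minimal sketch (my own illustration; the function names are not from the lecture):

```python
# A schema is a string over {0, 1, *}, where * is a wildcard.

def matches(schema: str, bits: str) -> bool:
    """True if the bit string is an instance of the schema."""
    return all(s in ('*', b) for s, b in zip(schema, bits))

def order(schema: str) -> int:
    """Number of defined (non-wildcard) positions."""
    return sum(1 for s in schema if s != '*')

def defining_length(schema: str) -> int:
    """Distance between the outermost defined positions."""
    defined = [i for i, s in enumerate(schema) if s != '*']
    return defined[-1] - defined[0] if defined else 0

H = '1****0'
print(matches(H, '101010'))   # True: begins with 1, ends with 0
print(order(H))               # 2
print(defining_length(H))     # 5
```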
3. Schema Fitness
- A GA implicitly evaluates fitness for all its schemas
- The average fitness of a schema is the average fitness of all possible instances of it (see the sketch below)
- A GA behaves as if it were really storing these averages
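For comparison with that implicit evaluation, here is what the explicit average would look like (my own sketch; a real GA never enumerates instances like this, which is exactly the point):

```python
from itertools import product

def schema_fitness(schema: str, f) -> float:
    """Mean of f over every bit string that instantiates the schema
    (exponentially many instances, so only feasible for short strings)."""
    slots = [('0', '1') if s == '*' else (s,) for s in schema]
    total, count = 0.0, 0
    for bits in product(*slots):
        total += f(''.join(bits))
        count += 1
    return total / count

# Example fitness: count of 1s (the "ones" problem)
print(schema_fitness('1***', lambda x: x.count('1')))  # 2.5
```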
4. Schema Theorem on Selection
- Idea: Calculate the approximate dynamics of the increase and decrease of schema instances
- m(H, t): the number of instances of H at time t
- û(H, t): the observed average fitness of H at time t
- Goal: Calculate the expected number of instances at the next step, E[m(H, t+1)]
- Use the fact that the number of offspring is proportional to fitness (the resulting equation is below)
- Thus, increases or decreases in instances depend on the schema's average fitness
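Under fitness-proportionate selection alone, that reasoning yields (a reconstruction in the usual notation, with f̄(t) the mean population fitness at time t):

```latex
E[m(H, t+1)] \;=\; \frac{\hat{u}(H,t)}{\bar{f}(t)} \, m(H,t)
```

A schema observed to be fitter than the population average therefore gains instances in expectation, and one below average loses them.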
5. Schema Theorem with Crossover and Mutation
- The question is the probability that schema H will survive crossover or mutation
- Let d(H) be H's defining length
- The probability that schema H survives crossover is given below
- The equation shows that it is higher for shorter schemas
- The probability of surviving mutation depends on the schema's order o(H) (also below)
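In standard single-point-crossover form, the two survival probabilities referenced above are (a reconstruction; p_c and p_m are the crossover and mutation rates, and l is the string length):

```latex
S_c(H) \;\ge\; 1 - p_c \, \frac{d(H)}{l - 1}, \qquad
S_m(H) \;=\; (1 - p_m)^{\,o(H)}
```

A long defining length d(H) makes it more likely that the crossover point falls between the schema's defined bits, which is why shorter schemas survive more often.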
6. Total Schema Theorem
- The expected number of instances of schema H, taking into account selection, crossover, and mutation, is given below
- Meaning: Low-order schemas whose average fitness remains above the mean will increase exponentially
- Reason: The increase of a non-disrupted schema is proportional to û(H, t) / f̄(t)
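Putting the three factors together gives the schema theorem in its usual form (a reconstruction; the slide's original equation is not preserved in this text):

```latex
E[m(H,t+1)] \;\ge\; \frac{\hat{u}(H,t)}{\bar{f}(t)} \, m(H,t)
\left[ 1 - p_c \, \frac{d(H)}{l-1} \right] (1 - p_m)^{\,o(H)}
```

The inequality appears because crossover and mutation can also create new instances of H rather than only destroying them.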
7. Building Blocks Hypothesis (Goldberg 1989)
- Crossover combines good schemas into equally good or better higher-order schemas (see the sketch below)
- That is, crossover (or mutation) is not just destructive: it is a power behind the GA
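A concrete (hypothetical) illustration of that combination:

```python
def one_point_crossover(a: str, b: str, point: int) -> tuple[str, str]:
    """Single-point crossover at a given cut point."""
    return a[:point] + b[point:], b[:point] + a[point:]

# Parent 1 instantiates the low-order schema 11****; parent 2
# instantiates ****11. A middle cut point combines the two blocks:
child, _ = one_point_crossover('110000', '000011', point=3)
print(child)  # 110011 -- an instance of the higher-order schema 11**11
```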
8. Questioning the BBH
- Why would separately discovered building blocks be compatible?
- What about speciation?
- Hybridization is rare in nature
- Gradual elaboration is safer
- The Schema Theorem and BBH assume fixed-length genomes
9. No Free Lunch Theorem (Wolpert and Macready 1996)
- An attack on GAs and black-box optimization
- Averaged across all possible problems, no optimization method is better than any other
- Elevated performance over one class of problems is exactly paid for in performance over another class
- Implication: Your method is not the best
- Or is it?
10. Hill Climbing vs. Hill Descending
- Isn't hill climbing better overall? No: averaged over all possible problems, it performs exactly as well as hill descending (see the demonstration below)
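This can be checked exhaustively on a toy space (my own construction, not from the lecture): average two opposite query strategies over every function from a 4-point domain to {0, 1, 2} and compare.

```python
from itertools import product
from statistics import mean

def climb(f):
    """'Hill-climbing-ish': after seeing f[0], query a point
    chosen because f[0] looked promising."""
    nxt = 1 if f[0] >= 1 else 2
    return max(f[0], f[nxt])       # best value found in 2 queries

def descend(f):
    """'Hill-descending-ish': make the opposite choice."""
    nxt = 2 if f[0] >= 1 else 1
    return max(f[0], f[nxt])

funcs = list(product(range(3), repeat=4))   # all 81 functions
print(mean(climb(f) for f in funcs))        # 1.444...
print(mean(descend(f) for f in funcs))      # identical average
```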
11. Very Bad News
- If an algorithm performs better than random search on some class of problems, then it must perform worse than random search on the remaining problems.
- One should be wary of trying to generalize previously obtained results to other problems.
- If the practitioner has knowledge of problem characteristics but does not incorporate them into the optimization algorithm, there are no formal assurances that the algorithm chosen will be at all effective.
12. Hope is not Lost
- An algorithm can be better over a class of problems if it exploits a common property of that class
- What is the class of problems known as "the real world"?
- Characterizing a class has become important
13. Function Approximation is a Subclass of Optimization
- A function approximator can be estimated
- Estimation means being described in fewer dimensions (parameters) than the final solution (see the sketch below)
- That is not true of optimization in general
- f(x) can be as simple or as complex as we want
- There may be a limit on the number of bits in f(x), i.e. the size of the memory, but we can use them however we want
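A minimal numerical sketch of "described in fewer dimensions" (my own illustration using a polynomial fit):

```python
import numpy as np

# The "final solution" tabulates the target at 100 points (100
# numbers), but a 6-parameter polynomial describes it almost as well.
x = np.linspace(-1, 1, 100)
target = np.sin(3 * x)                 # 100-dimensional description

coeffs = np.polyfit(x, target, deg=5)  # 6-parameter estimate
approx = np.polyval(coeffs, x)

print(len(coeffs), "parameters, max error:",
      float(np.max(np.abs(approx - target))))
```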
14. Exploiting Approximation
- How can the structure of approximation problems be exploited?
- Start with simple approximations
- Complexify them gradually (see the sketch below)
- Information about a function can be elaborated
- Information is not accumulated in general optimization
- Neural networks are approximators
- Real-world problems are often approximation problems
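A sketch of that strategy in the approximation setting (my own illustration; each step adds one parameter and refits, so earlier information about the target is elaborated rather than discarded):

```python
import numpy as np

x = np.linspace(-1, 1, 200)
target = np.sin(3 * x)

# Start with the simplest approximation and complexify gradually.
for deg in range(6):
    coeffs = np.polyfit(x, target, deg)
    err = float(np.max(np.abs(np.polyval(coeffs, x) - target)))
    print(f"degree {deg}: {deg + 1} parameters, max error {err:.3f}")
```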
15. Next Week: Neuroevolution (NE)
- Combining EC with neural networks
- Fixed-topology NE and TWEANNs
- The Competing Conventions Problem
Readings:
- Whitley, D. (1995). Genetic Algorithms and Neural Networks.
- Yao, X. (1999). Evolving Artificial Neural Networks.
- Radcliffe, N. J. (1993). Genetic set recombination and its application to neural network topology optimisation. (Skim from Section 4 on, except for Section 9.2.)