Title: Global Optimization: Visualizing Heuristic Strategies
1. Global Optimization: Visualizing Heuristic Strategies
- Rob Dimeo
- IDL/DAVE Lunchtime Seminar
- December 14, 2004
2. Conventional Optimization Algorithms
- Minimization methods require you to choose between:
  - one that requires only function evaluations
  - one that requires function evaluations and derivatives
- For functions that have multiple minima, these algorithms can get caught in one of the local minima
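The local-minimum trap can be seen in a few lines of Python. This is a sketch, not from the talk: the double-well function, step size, and iteration count are my own illustrative choices. Plain gradient descent converges to whichever well the starting point sits in.

```python
# Gradient descent on a double-well function: the answer depends
# entirely on the starting point, not on the global landscape.

def f(x):
    return (x**2 - 1)**2 + 0.3 * x   # two wells; the tilt makes x < 0 the global minimum

def df(x):
    return 4 * x * (x**2 - 1) + 0.3  # analytic derivative

def gradient_descent(x, step=0.01, iters=2000):
    for _ in range(iters):
        x -= step * df(x)
    return x

x_left  = gradient_descent(-1.0)     # settles in the deeper (global) well
x_right = gradient_descent(+1.0)     # trapped in the shallower local well
print(x_left, f(x_left))
print(x_right, f(x_right))
```

Started from +1.0 the method never sees the better minimum near x ≈ -1; a derivative-free local method (e.g. plain downhill simplex) fails the same way.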
3. Heuristic Global Optimization Algorithms
- Many algorithms borrow from a natural paradigm:
  - Simulated Annealing
  - Genetic Algorithm
  - Particle Swarm Optimization
  - Ant Colony Optimization
- Some are artificial constructs or ad hoc:
  - Stochastic tree optimization
  - Stochastic downhill simplex
- A common problem with global optimization algorithms: exploration vs. exploitation
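The exploration/exploitation trade-off is easiest to see in simulated annealing, where a single temperature parameter moves the search from one regime to the other. A minimal sketch, with an illustrative multimodal test function and cooling schedule of my own choosing (not from the talk):

```python
import math, random

random.seed(2)

def f(x):
    # 1-d multimodal test function: many local minima, global minimum at x = 0
    return x**2 + 10 * (1 - math.cos(2 * math.pi * x))

def anneal(x, T=10.0, cooling=0.995, steps=4000):
    best = x
    for _ in range(steps):
        cand = x + random.gauss(0, 1)            # exploration: random proposal
        dE = f(cand) - f(x)
        if dE < 0 or random.random() < math.exp(-dE / T):
            x = cand                              # accept uphill moves with prob e^(-dE/T)
        if f(x) < f(best):
            best = x                              # remember the best point seen
        T *= cooling                              # cooling shifts exploration -> exploitation
    return best

result = anneal(5.0)
print(result, f(result))
```

At high temperature almost any move is accepted (exploration); as T falls, only improving moves survive (exploitation). Cool too fast and the search freezes in a local minimum, too slowly and it wanders without converging.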
4. The Simple Genetic Algorithm
- Based on Darwinian survival of the fittest
- Search space encoded as chromosomes made of bits
- Solutions are bred using rules for reproduction, crossover, and mutation
- Population of solutions evolves for some number of generations (undergoing reproduction, crossover, and mutation) and the best-fit solution is determined in the last generation
- Example: solution of the 1-d Ising model

[Flowchart: creation of initial population → fitness (function) evaluation → termination criteria met? (yes → done; no → selection → crossover → mutation → determination of new population → back to fitness evaluation)]
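The selection–crossover–mutation loop can be sketched in a few lines of Python. For brevity this uses the toy "OneMax" fitness (count of 1-bits in the chromosome) rather than the talk's 1-d Ising example; the population size, mutation rate, and tournament selection are illustrative choices of my own:

```python
import random

random.seed(1)

BITS, POP, GENS = 20, 30, 60

def fitness(chrom):
    # toy fitness: number of 1-bits (stand-in for a real objective)
    return sum(chrom)

def select(pop):
    # tournament selection: the fitter of two random individuals reproduces
    a, b = random.choice(pop), random.choice(pop)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # single-point crossover of two bit-string chromosomes
    cut = random.randrange(1, BITS)
    return p1[:cut] + p2[cut:]

def mutate(chrom, rate=0.01):
    # flip each bit with small probability
    return [b ^ 1 if random.random() < rate else b for b in chrom]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP)]

best = max(pop, key=fitness)       # best-fit solution in the last generation
print(fitness(best), best)
```

Selection exploits what the population already knows; crossover and mutation keep exploring new bit patterns.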
5. Stochastic Downhill Simplex
- Standard downhill simplex (reflections, expansions, and contractions) augmented with a random restart
- At the conclusion of a series of downhill simplex moves, the simplex is "exploded" and another series of downhill simplex moves proceeds
- Example: function minimization
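A minimal sketch of the explode-and-restart idea. To stay short, a crude 1-d step-halving descent stands in for the full simplex moves (the test function, restart count, and search interval are my own illustrative assumptions):

```python
import random

random.seed(7)

def f(x):
    # double-well test function; global minimum near x = -1
    return (x**2 - 1)**2 + 0.3 * x

def local_descent(x, step=0.1):
    # crude stand-in for the downhill-simplex moves: step toward the better
    # neighbour, and contract the step when neither neighbour improves
    while step > 1e-6:
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
        else:
            step *= 0.5
    return x

def stochastic_restart(restarts=20, spread=3.0):
    # "explode" to a random starting point, run a local descent,
    # and keep the best minimum found over all restarts
    best = None
    for _ in range(restarts):
        x = local_descent(random.uniform(-spread, spread))
        if best is None or f(x) < f(best):
            best = x
    return best

best_x = stochastic_restart()
print(best_x, f(best_x))
```

Each restart is pure exploitation (a downhill run), while the random explosions supply the exploration that plain downhill simplex lacks.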
6. Stochastic Tree Search
- Start at the original node
- Create branches to 2 new nodes
- Evaluate the function at each new node (save the minimum); assign branch probabilities based on the function evaluation
- In the next iteration, begin at the original node
- Choose a branch based on the branch probability until you reach a terminal node
- At the terminal node, create 2 new nodes
- Repeat until some depth has been reached
- Enhancement: specify a branch decay that reduces the probability based on the number of times that node has been visited
- One point in search space defines a node
- From the starting node, branches are constructed to two new nodes
- Function (χ²) evaluated and stored at each new node
- Thickness of a branch is related to the probability of taking that branch in subsequent iterations
- Example: function minimization
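The branching procedure above can be sketched as follows. The talk does not give formulas, so the branch-probability weight (a Boltzmann-like factor in the function value), the visit-count decay, and the node-displacement rule are all illustrative assumptions of my own:

```python
import math, random

random.seed(3)

def f(x):
    # double-well test function; global minimum near x = -1
    return (x**2 - 1)**2 + 0.3 * x

class Node:
    def __init__(self, x):
        self.x, self.val = x, f(x)     # one point in search space defines a node
        self.children = []
        self.visits = 0

def branch_prob(node):
    # lower function value -> higher weight ("thicker branch");
    # the visit count implements the branch-decay enhancement
    return (math.exp(-node.val) + 1e-12) / (1 + node.visits)

def descend(root):
    # from the original node, follow branches chosen by probability
    # until a terminal node is reached
    node = root
    while node.children:
        weights = [branch_prob(c) for c in node.children]
        node = random.choices(node.children, weights=weights)[0]
        node.visits += 1
    return node

root = Node(2.0)
best = root
for _ in range(200):                   # repeat until some budget is exhausted
    leaf = descend(root)
    for _ in range(2):                 # create branches to 2 new nodes
        leaf.children.append(Node(leaf.x + random.uniform(-0.5, 0.5)))
    for child in leaf.children:
        if child.val < best.val:       # save the minimum found so far
            best = child

print(best.x, best.val)
```

Fat (high-probability) branches are revisited and deepened, while the visit decay gradually pushes the walk onto unexplored branches.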