Title: Stochastic Local Search Variants
1. Stochastic Local Search Variants
Computer Science, CPSC 322, Lecture 16 (Textbook Chpt. 4.8)
February 11, 2008
2. Lecture Overview
3. Stochastic Local Search
- Idea: combine hill climbing (advantage: finds a local maximum) with randomization (advantage: doesn't get stuck).
- As well as uphill steps, we can allow a small probability of:
- Random steps: move to a random neighbor.
- Random restart: reassign random values to all variables.
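The combination above can be sketched on a toy maximization problem. The scoring function, neighborhood, and step probabilities below are illustrative assumptions, not from the lecture:

```python
import random

def stochastic_hill_climb(score, neighbors, domain, steps=1000,
                          p_restart=0.01, p_random_step=0.1, rng=None):
    """Hill climbing plus random steps and random restarts (a sketch)."""
    rng = rng or random.Random(0)
    current = rng.choice(domain)
    best = current
    for _ in range(steps):
        r = rng.random()
        if r < p_restart:
            current = rng.choice(domain)              # random restart
        elif r < p_restart + p_random_step:
            current = rng.choice(neighbors(current))  # random step
        else:
            current = max(neighbors(current), key=score)  # uphill step
        if score(current) > score(best):
            best = current
    return best

# Toy problem: maximize -(x - 3)^2 over the integers 0..10 (peak at x = 3).
domain = list(range(11))
score = lambda x: -(x - 3) ** 2
neighbors = lambda x: [n for n in (x - 1, x + 1) if 0 <= n <= 10]
print(stochastic_hill_climb(score, neighbors, domain))  # -> 3
```

Tracking the best node seen so far means the occasional downhill or random move costs nothing except time.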
4. Runtime Distribution
- Plots runtime (or number of steps) against the proportion (or number) of the runs that are solved within that runtime.
- Note the use of a log scale on the x axis.
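One way to produce the data for such a plot is to run the randomized solver many times, record each run's step count, and compute, for each runtime t, the fraction of runs solved within t steps. The sketch below stands in for a real solver with a geometric step count, an assumption purely for illustration:

```python
import random

rng = random.Random(0)

def steps_to_solve():
    """Stand-in for one randomized run: each step "solves" with prob. 0.1."""
    steps = 1
    while rng.random() >= 0.1:
        steps += 1
    return steps

# One simulated run per entry; each value is that run's runtime in steps.
runs = [steps_to_solve() for _ in range(1000)]

def solved_within(t):
    """Proportion of runs solved within t steps: one point of the plot."""
    return sum(s <= t for s in runs) / len(runs)

print(solved_within(10), solved_within(100))
```

Evaluating `solved_within` over a log-spaced grid of t values reproduces the kind of curve shown on the slide.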
5. Tabu Lists
- SLS algorithms can get stuck in plateaus.
- To prevent cycling, we can maintain a tabu list of the k last nodes visited.
- Don't visit a node that is already on the tabu list.
- If k = 1, we don't allow the search to visit the same assignment twice in a row.
- This method can be expensive if k is large.
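A tabu list can be sketched as a bounded queue of recently visited nodes. The toy scoring function and neighborhood below are illustrative assumptions:

```python
from collections import deque

def tabu_step(current, neighbors, score, tabu, k):
    """Move to the best neighbor not on the tabu list (a sketch)."""
    candidates = [n for n in neighbors(current) if n not in tabu]
    if not candidates:              # everything is tabu: allow any neighbor
        candidates = neighbors(current)
    nxt = max(candidates, key=score)
    tabu.append(nxt)                # remember where we went...
    if len(tabu) > k:
        tabu.popleft()              # ...and forget the oldest entry
    return nxt

# Toy walk: score peaks at 3; with k = 1 the list holds only the last node.
score = lambda v: -abs(v - 3)
neighbors = lambda v: [v - 1, v + 1]
tabu = deque()
x, path = 5, []
for _ in range(4):
    x = tabu_step(x, neighbors, score, tabu, k=1)
    path.append(x)
print(path)  # -> [4, 3, 2, 3]
```

Membership tests on a deque are O(k), which is one concrete sense in which the method gets expensive for large k; a paired set would make them O(1) at the cost of extra bookkeeping.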
6. Lecture Overview
- Recap SLS
- SLS variants
- Simulated Annealing
- Beam search
- Genetic Algorithms
7. Simulated Annealing
- Annealing: a metallurgical process where metals are hardened by being slowly cooled.
- Analogy: start with a high "temperature": a high tendency to take random steps.
- Over time, cool down: more likely to follow the scoring function.
- Temperature reduces over time, according to an annealing schedule.
8. Simulated Annealing Algorithm
- Here's how it works (for maximizing):
- You are in node n. Pick a variable at random and a new value at random; this generates a neighbor n'.
- If it is an improvement, i.e., h(n') > h(n), adopt it.
- If it isn't an improvement, adopt it probabilistically, depending on the difference and a temperature parameter T:
- we move to n' with probability e^((h(n') - h(n))/T)
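The loop above can be sketched directly. The toy scoring function, neighborhood, and geometric cooling parameters below are illustrative assumptions:

```python
import math
import random

def simulated_annealing(score, neighbor, start, t0=10.0, cooling=0.99,
                        steps=2000, rng=None):
    """Simulated annealing for maximization (a sketch)."""
    rng = rng or random.Random(0)
    current, t = start, t0
    best = current
    for _ in range(steps):
        nxt = neighbor(current, rng)
        delta = score(nxt) - score(current)
        # Improvements are always adopted; worse moves are adopted with
        # probability e^((h(n') - h(n)) / T), which shrinks as T cools.
        if delta > 0 or rng.random() < math.exp(delta / t):
            current = nxt
        if score(current) > score(best):
            best = current
        t *= cooling            # annealing schedule: T shrinks geometrically
    return best

# Toy problem: a bumpy 1-D function over 0..100 with its peak near x = 42.
score = lambda x: -(x - 42) ** 2 + 5 * math.cos(x)
neighbor = lambda x, rng: min(100, max(0, x + rng.choice([-1, 1])))
result = simulated_annealing(score, neighbor, start=0)
print(result)
```

At high T the exponent is close to 0, so almost any move is accepted (random walk); as T approaches 0 the acceptance probability for downhill moves vanishes and the search behaves like hill climbing.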
9. Properties of Simulated Annealing Search
- One can prove: if T decreases slowly enough, then simulated annealing search will find a global optimum with probability approaching 1.
- Widely used in VLSI layout, airline scheduling, etc.
10. Lecture Overview
- Recap SLS
- SLS variants
- Simulated Annealing
- Population Based
- Beam search
- Genetic Algorithms
11. Population Based SLS
- Often we have more memory than is required for the current node (+ tabu list).
- Key Idea: maintain a population of k nodes.
- At every stage, update your population.
- Whenever one node is a solution, report it.
- Simplest strategy: Parallel Search
- All searches are independent.
- Like k restarts, but uses k times the minimum number of steps.
12. Population Based SLS: Beam Search
- Non-stochastic.
- Like parallel search with k nodes, but you choose the k best out of all of the neighbors.
- Useful information is passed among the k parallel search threads.
- Extreme case: if one node generates several good neighbors while the other k-1 all generate bad successors, the whole next population descends from that single node.
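One step of this scheme can be sketched as follows; the integer states and score are illustrative assumptions. Starting one of the two beam nodes far from the peak shows the extreme case above: its descendants are crowded out almost immediately.

```python
import heapq

def beam_search_step(beam, neighbors, score, k):
    """One deterministic beam search step (a sketch): pool every
    neighbor of every beam node, then keep the k best overall."""
    pool = sorted({n for node in beam for n in neighbors(node)})
    return heapq.nlargest(k, pool, key=score)

# Toy: integer states, score peaks at 7; one beam node starts far away at 20.
score = lambda x: -abs(x - 7)
neighbors = lambda x: [x - 1, x + 1]
beam = [0, 20]
for _ in range(7):
    beam = beam_search_step(beam, neighbors, score, k=2)
print(beam)  # -> [7, 5]; descendants of the node that started at 20 died out
```

Contrast with parallel search, where each of the k threads would keep only its own successor and the node at 20 would keep wandering on its own.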
13. Population Based SLS: Stochastic Beam Search
- Deterministic beam search can suffer from a lack of diversity among the k states (it is just a more expensive hill climbing).
- The stochastic version alleviates this problem:
- it selects the k nodes at random,
- but with the probability of selection proportional to their value.
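The proportional selection step can be sketched with weighted sampling (Python's `random.choices`); scores must be positive to serve as weights. The toy score and neighborhood below are illustrative assumptions:

```python
import random

def stochastic_beam_step(beam, neighbors, score, k, rng):
    """Sample k successors with probability proportional to their score
    (a sketch). Scores must be positive to serve as sampling weights."""
    pool = [n for node in beam for n in neighbors(node)]
    weights = [score(n) for n in pool]
    return rng.choices(pool, weights=weights, k=k)

rng = random.Random(0)
score = lambda x: 1 + max(0, 10 - abs(x - 7))   # positive, peaks at x = 7
neighbors = lambda x: [x - 1, x + 1]
beam = [0, 20]
for _ in range(20):
    beam = stochastic_beam_step(beam, neighbors, score, k=2, rng=rng)
print(beam)
```

Unlike the deterministic version, low-scoring successors still have some chance of surviving, which is exactly what preserves diversity.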
14. Stochastic Beam Search: Advantages
- It maintains diversity amongst the nodes.
- Biological metaphor:
- The scoring function value reflects the fitness of the node.
- Like asexual reproduction:
- each node gives rise to its mutations;
- the higher the fitness, the more likely the individual is to survive.
15. Lecture Overview
- Recap SLS
- SLS variants
- Simulated Annealing
- Population Based
- Beam search
- Genetic Algorithms
16. Population Based SLS: Genetic Algorithms
- Start with k randomly generated states (the population).
- A state is represented as a string over a finite alphabet (often a string of 0s and 1s).
- A successor state is generated by combining two parent states (loosely analogous to how DNA is spliced in sexual reproduction).
- Evaluation function (fitness function): higher values for better states.
- Produce the next generation of states by selection, crossover, and mutation.
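Putting selection, crossover, and mutation together, a minimal GA over bit strings might look like the sketch below; the population size, mutation rate, and one-max fitness function are illustrative assumptions:

```python
import random

def genetic_algorithm(fitness, length, pop_size=20, generations=100,
                      p_mutate=0.1, rng=None):
    """A minimal genetic algorithm over bit strings (a sketch)."""
    rng = rng or random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        weights = [fitness(ind) for ind in pop]
        new_pop = []
        for _ in range(pop_size):
            # Selection: parents chosen with probability proportional to fitness.
            mom, dad = rng.choices(pop, weights=weights, k=2)
            # Crossover: splice the two parents at a random cut point.
            cut = rng.randrange(1, length)
            child = mom[:cut] + dad[cut:]
            # Mutation: occasionally flip one random bit.
            if rng.random() < p_mutate:
                i = rng.randrange(length)
                child[i] = 1 - child[i]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# Toy fitness: count of 1s ("one-max"); the optimum is the all-ones string.
best = genetic_algorithm(fitness=sum, length=10)
print(best, sum(best))
```

With `sum` as the fitness function, selection steadily favors strings with more 1s, and crossover recombines good partial strings from different parents.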
17. Genetic Algorithms: Example
- Representation and fitness function:
- State: a string over a finite alphabet.
- Fitness function: higher values for better states.
18. Genetic Algorithms: Example
- Selection: a common strategy is to make the probability of being chosen for reproduction directly proportional to the fitness score:
- 24/(24 + 23 + 20 + 11) ≈ 31%
- 23/(24 + 23 + 20 + 11) ≈ 29%, etc.
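The selection probabilities for the slide's four fitness scores (24, 23, 20, 11, which sum to 78) can be checked directly:

```python
scores = [24, 23, 20, 11]           # fitness values from the slide
total = sum(scores)                 # 78
probs = [round(100 * s / total) for s in scores]
print(probs)  # -> [31, 29, 26, 14] (percent chance of being chosen)
```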
19. Genetic Algorithms: Example
- Reproduction: cross-over and mutation.
20. Genetic Algorithms: Conclusions
- Their performance is very sensitive to the choice of state representation and fitness function.
- Extremely slow (not surprising, as they are inspired by evolution!)
21. CSPs Summary
- Find a single variable assignment that satisfies all of our constraints (atemporal).
- Systematic Search approach (search space ..?)
- Constraint network support:
- inference, e.g., Arc Consistency (can tell you if a solution does not exist)
- Decomposition
- Heuristic Search (degree, min-remaining)
- (Stochastic) Local Search (search space ..?)
- Huge search spaces and highly connected constraint networks, but solutions densely distributed.
- No guarantee to find a solution (if one exists).
- Unable to show that no solution exists.
22. What Is Coming Next?
- How to select and organize a sequence of actions to achieve a given goal.
- Relying on the powerful representation of states as a set of features (like CSPs).
- Relying on a sophisticated representation of actions (unlike CSPs).
23. Modules We'll Cover in This Course
[Course-map diagram: modules organized by deterministic vs. stochastic settings and by single actions vs. sequences of actions, covering Search, Constraint Satisfaction (CSPs), Logics, Decision, and Planning.]
24. Next Class
- Start Planning (Chpt. 11)