Informed Search Methods
1
Informed Search Methods
  • Read Chapter 4
  • Use the text for more examples
  • Work them out yourself

2
Best First
  • The store is replaced by a sorted data structure
    (a priority queue)
  • Knowledge is added through the sort (evaluation)
    function
  • No guarantees yet; behavior depends on the quality
    of the evaluation function
  • Uniform Cost with a user-supplied evaluation
    function (a minimal sketch follows below)
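A minimal sketch of the idea in Python, assuming hypothetical is_goal, successors, and evaluate callbacks supplied by the user; successors is assumed to yield (state, step_cost) pairs:

```python
import heapq
import itertools

def best_first(start, is_goal, successors, evaluate):
    """Best-first search: the frontier is a sorted structure (priority
    queue) ordered by a user-supplied evaluation function."""
    tie = itertools.count()              # tie-breaker so heapq never compares states
    frontier = [(evaluate(start, 0), next(tie), start, 0)]
    seen = set()
    while frontier:
        _, _, state, g = heapq.heappop(frontier)
        if is_goal(state):
            return state, g
        if state in seen:
            continue
        seen.add(state)
        for nxt, step_cost in successors(state):
            if nxt not in seen:
                g2 = g + step_cost
                heapq.heappush(frontier, (evaluate(nxt, g2), next(tie), nxt, g2))
    return None                          # no goal found
```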

3
Concerns
  • What knowledge is available?
  • How can it be added to the search?
  • What guarantees are there?
  • Time
  • Space

4
Greedy Search
  • Add a heuristic h(n)
  • h(n) = estimated cost of the cheapest solution from
    state n to the goal
  • Require h(goal) = 0.
  • Complete? No: it can be misled.

5
Examples
  • Route finding: get from A to B
  • heuristic: straight-line distance from the current
    location to B
  • 8-tile puzzle
  • heuristics: number of misplaced tiles
  • or: number and distance of misplaced tiles

6
A*
  • Combines Greedy and Uniform Cost
  • f(n) = g(n) + h(n), where
  • g(n) = path cost to node n
  • h(n) = estimated cost from n to the goal
  • If h(n) ≤ true cost to the goal, then h is
    admissible.
  • Best-first using an admissible f is A*.
  • Theorem: A* is optimal and complete (a sketch
    follows below)
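A self-contained sketch on a toy route-finding map; the cities, coordinates, and road costs below are made up for illustration. Straight-line distance is admissible here because no road is shorter than the straight line:

```python
import heapq
import itertools
import math

# Toy route-finding map (hypothetical data): city -> (x, y), plus road costs.
coords = {'A': (0, 0), 'B': (2, 0), 'C': (1, 2), 'D': (3, 2)}
roads = {'A': [('B', 2.0), ('C', 2.4)], 'B': [('D', 2.5)],
         'C': [('D', 2.0)], 'D': []}

def h(n, goal):
    """Straight-line distance: admissible, since no road beats it."""
    (x1, y1), (x2, y2) = coords[n], coords[goal]
    return math.hypot(x1 - x2, y1 - y2)

def a_star(start, goal):
    tie = itertools.count()                              # heap tie-breaker
    frontier = [(h(start, goal), next(tie), start, 0.0)] # (f, tie, node, g)
    best_g = {start: 0.0}
    while frontier:
        _, _, node, g = heapq.heappop(frontier)
        if node == goal:
            return g                                     # optimal: h is admissible
        for nxt, cost in roads[node]:
            g2 = g + cost
            if g2 < best_g.get(nxt, float('inf')):
                best_g[nxt] = g2
                heapq.heappush(frontier, (g2 + h(nxt, goal), next(tie), nxt, g2))
    return None

print(a_star('A', 'D'))    # 4.4, via A -> C -> D
```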

7
A* Optimality Proof
  • Note: along any path from the root, f never
    decreases (this is the definition of monotonicity).
  • Let f* be the cost of the optimal solution.
  • A* expands all nodes with f(n) < f*.
  • A* may expand some nodes for which f(n) = f*.
  • Let G be the optimal goal state and G2 a suboptimal
    one.

8
A* Proof
  • Let n be a leaf node on the path to G.
  • h admissible ⇒ f* ≥ f(n).
  • G2 chosen before n ⇒ f(n) ≥ f(G2).
  • But h(G2) = 0, so g(G2) = f(G2) ≤ f*: then G2 is not
    suboptimal, a contradiction.
  • A* is complete: it searches contours of increasing f.
  • A* is exponential in time and space, generally.

9
A* Properties
  • Dechter and Pearl: A* is optimal among all
    algorithms using h (any such algorithm must search
    at least as many nodes).
  • If 0 ≤ h1 ≤ h2 and h2 is admissible, then h1 is also
    admissible, and h1 will search at least as many
    nodes as h2. So bigger is better.
  • Sub-exponential if the error of the h estimate is
    within (approximately) the log of the true cost.

10
A* Special Cases
  • Suppose h(n) = 0 ⇒ Uniform Cost
  • Suppose every step costs 1 and h(n) = 0 ⇒ Breadth
    First
  • With a non-admissible heuristic:
  • g(n) = 0, h(n) = 1/depth ⇒ Depth First
  • One code, many algorithms (illustrated below)
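The point can be made concrete with evaluation functions plugged into a best-first skeleton like the earlier sketch; g is path cost, h is the heuristic value, and d is depth (these names are illustrative, not from the slides):

```python
# One search skeleton, many algorithms: only the evaluation function changes.
f_uniform_cost  = lambda g, h, d: g               # h(n) = 0        -> Uniform Cost
f_breadth_first = lambda g, h, d: d               # unit steps, h=0 -> Breadth First
f_greedy        = lambda g, h, d: h               # ignore g        -> Greedy
f_a_star        = lambda g, h, d: g + h           # admissible h    -> A*
f_depth_first   = lambda g, h, d: 1.0 / (d + 1)   # non-admissible  -> Depth First
```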

11
Heuristic Generation
  • Relaxation: make the problem simpler
  • Route planning:
  • don't worry about roads; go straight
  • 8-tile puzzle:
  • don't worry about physical constraints: pick up a
    tile and move it to its correct position
  • better: allow sliding over existing tiles
  • Should be easy to compute (see the sketch below)
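A sketch of the two relaxed heuristics for the 8-tile puzzle, assuming a hypothetical encoding where a state is a tuple of nine entries with 0 for the blank:

```python
GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)   # hypothetical goal layout, 0 = blank

def misplaced(state):
    """Relaxation: a tile may jump straight to its slot, so the number of
    misplaced tiles is a lower bound on the moves remaining."""
    return sum(1 for i, t in enumerate(state) if t != 0 and t != GOAL[i])

def manhattan(state):
    """Relaxation: tiles slide over occupied squares, so the summed grid
    distances of misplaced tiles is a tighter (still admissible) bound."""
    total = 0
    for i, t in enumerate(state):
        if t != 0:
            g = GOAL.index(t)
            total += abs(i // 3 - g // 3) + abs(i % 3 - g % 3)
    return total
```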

12
Iterative Deepening A* (IDA*)
  • Like iterative deepening, but
  • replaces the depth limit with an f-cost limit
  • Increase the f-cost limit by the smallest operator
    cost.
  • Complete and optimal (a sketch follows below)
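A minimal sketch, assuming hypothetical is_goal, successors (yielding (state, step_cost) pairs), and h callbacks. This common variant raises the bound to the smallest f-value that exceeded it, which subsumes raising it by the smallest operator cost:

```python
def ida_star(start, is_goal, successors, h):
    """IDA*: depth-first probes limited by f = g + h; each iteration
    re-searches with a larger f-cost limit, so memory stays linear."""
    def probe(state, g, bound, path):
        f = g + h(state)
        if f > bound:
            return f                      # overshoot; candidate for next bound
        if is_goal(state):
            return path                   # success: return the solution path
        smallest = float('inf')
        for nxt, cost in successors(state):
            if nxt not in path:           # avoid cycles along the current path
                result = probe(nxt, g + cost, bound, path + [nxt])
                if isinstance(result, list):
                    return result
                smallest = min(smallest, result)
        return smallest

    bound = h(start)
    while True:
        result = probe(start, 0, bound, [start])
        if isinstance(result, list):
            return result
        if result == float('inf'):
            return None                   # space exhausted; no solution
        bound = result                    # smallest f that exceeded the old bound
```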

13
SMA*
  • Memory-bounded version of A*, due to the textbook
    authors
  • Beware authors.

14
Hill-climbing
  • Goal: optimize an objective function.
  • Does not require differentiable functions
  • Can be applied to goal-predicate types of problems.
  • e.g. BSAT with the objective function = number of
    clauses satisfied.
  • Intuition: always move to a better state

15
Some Hill-Climbing Algos
  • Start: a random state or a special state.
  • Until (no improvement):
  • steepest ascent: find the best successor,
  • OR (greedy): select the first improving successor
  • Go to that successor
  • Repeat the above process some number of times
    (restarts).
  • Can be done with partial solutions or full solutions
    (see the sketch below).
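A sketch of steepest-ascent hill-climbing with random restarts, assuming hypothetical random_state, successors, and score callbacks; the greedy variant would instead take the first improving successor:

```python
def hill_climb(random_state, successors, score, restarts=10):
    """Steepest-ascent hill-climbing with random restarts."""
    best = None
    for _ in range(restarts):
        current = random_state()
        while True:
            neighbors = successors(current)
            if not neighbors:
                break
            candidate = max(neighbors, key=score)   # steepest ascent
            if score(candidate) <= score(current):
                break                               # no improvement: local max
            current = candidate
        if best is None or score(current) > score(best):
            best = current                          # keep the best restart
    return best
```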

16
Hill-climbing Algorithm
  • In best-first, replace the storage by a single node
  • Works if there is a single hill
  • Use restarts if there are multiple hills
  • Problems:
  • finds a local maximum, not the global one
  • plateaux: large flat regions (happens in BSAT)
  • ridges: fast climbing up to the ridge, slow progress
    along it
  • Not complete, not optimal
  • No memory problems, though

17
Beam
  • A mix of hill-climbing and best-first
  • Storage is a cache of the best K states
  • Solves the storage problem, but
  • not optimal, not complete (a sketch follows below)
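A minimal sketch, assuming hypothetical successors and score callbacks and hashable states; each generation keeps only the k best states:

```python
def beam_search(start_states, successors, score, k=10, generations=100):
    """Beam search: a cache of the k best states replaces the full frontier."""
    beam = sorted(start_states, key=score, reverse=True)[:k]
    for _ in range(generations):
        # Pool the current beam with all successors, then keep the k best.
        pool = set(beam)
        for state in beam:
            pool.update(successors(state))
        beam = sorted(pool, key=score, reverse=True)[:k]
    return beam[0] if beam else None   # best found, not guaranteed optimal
```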

18
Local (Iterative) Improving
  • Initial state: a full candidate solution
  • Greedy hill-climbing:
  • if up, do it
  • if flat, probabilistically decide whether to accept
    the move
  • if down, don't do it
  • We are gradually expanding the possible moves.

19
Local Improving Performance
  • Solves the 1,000,000-queens problem quickly
  • Useful for scheduling
  • Useful for BSAT
  • Solves (sometimes) large problems
  • More time, better answer
  • No memory problems
  • No guarantees of anything

20
Simulated Annealing
  • Like hill-climbing, but probabilistically allows
    down moves, controlled by the current temperature
    and how bad the move is.
  • Let t1, t2, ... be a temperature schedule.
  • Usually t1 is high and tk = 0.9 * tk-1.
  • Let E be a quality measure of a state
  • Goal: maximize E.

21
Simulated Annealing Algorithm
  • Current ← a random state; k ← 1
  • If tk = 0, stop.
  • Next ← a random successor of Current
  • If Next is better than Current, move there.
  • If Next is worse:
  • let Delta = E(Next) - E(Current), so Delta < 0
  • move to Next with probability e^(Delta/tk)
  • k ← k + 1; repeat (a sketch follows below)
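A minimal sketch, assuming hypothetical random_state, random_successor, and energy callbacks, with a geometric schedule that stops once the temperature is effectively zero:

```python
import math
import random

def simulated_annealing(random_state, random_successor, energy,
                        t1=100.0, alpha=0.9, t_min=1e-3):
    """Simulated annealing, maximizing energy, schedule tk = alpha * tk-1."""
    current, t = random_state(), t1
    while t > t_min:                      # treat a tiny temperature as "0"
        nxt = random_successor(current)
        delta = energy(nxt) - energy(current)
        # Always accept an up move; accept a down move with prob e^(delta/t).
        if delta > 0 or random.random() < math.exp(delta / t):
            current = nxt
        t *= alpha                        # cool down
    return current                        # often followed by plain hill-climbing
```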

22
Simulated Annealing Discussion
  • No guarantees
  • When T is large, e^(Delta/T) is close to e^0, or 1.
    So for large T, you go anywhere.
  • When T is small, e^(Delta/T) is close to e^(-inf),
    or 0. So you avoid most bad moves.
  • After T becomes 0, one often does simple
    hill-climbing.
  • Execution time depends on the schedule; memory use
    is trivial.

23
Genetic Algorithm
  • Weakly analogous to evolution
  • No theoretical guarantees
  • Applies to nearly any problem.
  • Population: a set of individuals
  • Fitness: a function on individuals
  • Mutation: an operator producing a new individual
    from an old one
  • Cross-over: new individuals from two parents

24
GA Algorithm (a version)
  • Population ← a random set of n individuals
  • Probabilistically choose n pairs of individuals to
    mate
  • Probabilistically choose n descendants for the next
    generation (may include the parents or not)
  • The probabilities depend on the fitness function, as
    in simulated annealing.
  • How well does it work? Good question (a sketch
    follows below).
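One version of the loop as a sketch, assuming hypothetical random_individual, fitness (non-negative), crossover, and mutate callbacks; here the descendants simply replace the parents:

```python
import random

def genetic_algorithm(random_individual, fitness, crossover, mutate,
                      n=100, generations=200, p_mutate=0.05):
    """A GA: mates are chosen with probability proportional to fitness."""
    population = [random_individual() for _ in range(n)]
    for _ in range(generations):
        weights = [fitness(ind) for ind in population]   # assumed non-negative
        children = []
        for _ in range(n):
            # Pick a pair (possibly the same individual twice) by fitness.
            mom, dad = random.choices(population, weights=weights, k=2)
            child = crossover(mom, dad)
            if random.random() < p_mutate:
                child = mutate(child)
            children.append(child)
        population = children            # descendants replace the parents here
    return max(population, key=fitness)
```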

25
Scores to Probabilities
  • Suppose the scores of the n individuals are
  • a1, a2, ..., an.
  • The probability of choosing the j-th individual is
  • prob_j = aj / (a1 + a2 + ... + an)
    (see the snippet below).
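As a snippet, a direct transcription of the formula:

```python
def selection_probabilities(scores):
    """Fitness-proportional selection: prob_j = a_j / (a_1 + ... + a_n)."""
    total = sum(scores)                  # assumes non-negative, not all zero
    return [a / total for a in scores]

print(selection_probabilities([3, 1, 1]))   # [0.6, 0.2, 0.2]
```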

26
GA Example
  • Problem: Boolean satisfiability (BSAT)
  • Individual: a binding for each variable
  • Mutation: flip a variable
  • Cross-over: for two parents, randomly choose
    positions from one parent. One child takes those
    bindings and uses the other parent for the rest.
  • Fitness: the number of clauses satisfied (sketched
    below)
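A sketch of these operators, assuming individuals are tuples of 0/1 truth values and a clause is a list of signed integers (+v for variable v true, -v for false, variables numbered from 1; this DIMACS-like encoding is an assumption, not from the slides):

```python
import random

def mutate(individual):
    """Mutation: flip one randomly chosen variable."""
    i = random.randrange(len(individual))
    return individual[:i] + (1 - individual[i],) + individual[i + 1:]

def crossover(mom, dad):
    """Cross-over: take randomly chosen positions from one parent and the
    remaining bindings from the other."""
    from_mom = [random.random() < 0.5 for _ in mom]
    return tuple(m if take else d for m, d, take in zip(mom, dad, from_mom))

def fitness(individual, clauses):
    """Fitness: the number of clauses the assignment satisfies."""
    def satisfied(clause):
        return any((individual[abs(lit) - 1] == 1) == (lit > 0) for lit in clause)
    return sum(1 for clause in clauses if satisfied(clause))
```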

27
GA Example
  • N-queens problem
  • Individual: an array whose i-th entry is the column
    where the i-th queen is placed
  • Mating: cross-over
  • Fitness: (minimize) the number of constraint
    violations

28
GA Discussion
  • Reported to work well on some problems.
  • Typically not compared with other approaches, e.g.
    hill-climbing with restarts.
  • Opinion: works if the mating operator captures good
    substructures.
  • Any ideas for a GA on TSP?