ESI 6448 Discrete Optimization Theory
Transcript and Presenter's Notes
1
ESI 6448 Discrete Optimization Theory
  • Lecture 32

2
Last class
  • Strength of LPM
  • IP column generation for 0-1 IP
  • Implicit partitioning/packing problems
  • Partitioning with identical subsets

3
Strength of LPM
  • Choice of approach depends on the relative
    difficulty in solving the two problems and on the
    convergence of the column generation and cutting
    plane algorithms in practice.

4
IP column generation
  • Branch-and-price (IP column generation) algorithm
  • If the optimal solution vector is not integer, IPM is not yet solved.
  • zLPM ≥ z, so it provides an upper bound to be
    used in a branch-and-bound algorithm
  • IP
  • IPM

5
IP column generation for 0-1 IP
  • IPM(Si)

(figure: enumeration tree with root S branching into S0 and S1)
6
IP column generation for 0-1 IP
  • Branching on some fractional λk,t variable
  • On the branch in which λk,t = 0
  • just one column, corresponding to the tth solution of subproblem k, is excluded
  • resulting enumeration tree will be highly unbalanced
  • Often difficult to impose the condition λk,t = 0 and to prevent the same solution being generated again as the optimal solution after branching
  • Potential advantages of column generation
  • optimal solutions to RLPM are often integral or close to integral
  • can provide a feasible solution

7
Implicit partitioning/packing problems
  • partitioning/packing problems
  • Given a finite set M = {1, ..., m}, there are K implicitly described sets of feasible subsets, and the problem is to find a maximum-value packing or partition of M consisting of certain of these subsets.
  • Set xk = (yk, wk) with yk ∈ Bm the incidence vector of subset k of M, ck = (ek, fk), Ak = (I, 0), and b = 1, and formulate
  • IPM

8
Multi-Item Lot-Sizing

9
Clustering
  • Given a graph G = (V, E), edge costs ce for e ∈ E, node weights di for i ∈ V, and a cluster capacity C, partition the node set V into K clusters satisfying
  • the sum of the node weights in each cluster ≤ C
  • the sum of the costs of edges between clusters is minimized, or the sum of the costs of edges within clusters is maximized

10
Capacitated Vehicle Routing
  • Given a graph G = (V, E), a depot node 0, edge costs ce for e ∈ E, K identical vehicles of capacity C, and client orders di for i ∈ V \ {0}, find a set of subtours (cycles), one for each vehicle, satisfying
  • each subtour contains the depot
  • together the subtours contain all the nodes
  • the subtours are disjoint on the node set V \ {0}
  • the total demand on each subtour (total amount delivered by each vehicle) ≤ C

11
Partitioning with identical subsets
  • clustering and vehicle routing
  • clusters or vehicles are interchangeable (independent of k)
  • Xk = X, (ek, fk) = (e, f), Tk = T for all k
  • IPM
  • subproblem
  • branching?

12
Branching rules

(figure: enumeration tree with root S branching into S0 and S1)
13
Heuristics
  • NP-hard problems
  • find a good feasible solution quickly using
    heuristics or approximations
  • a solution is required rapidly
    the instance is large/complicated → hard to
    formulate it as a (M)IP of reasonable size
  • difficult or impossible for branch-and-bound to
    find (good) feasible solutions
  • some combinatorial problems have special
    structure to exploit to find feasible solutions

14
Heuristic design concerns
  • just accept a feasible solution, or ask a posteriori how far it is from optimal?
  • guarantee a priori that the heuristic will produce a solution within ε (or α) of optimal?
  • worst-case bound
  • guarantee a priori that the heuristic will on average produce a solution within ε of optimal?
  • average-case performance

15
Combinatorial problems
  • combinatorial problems
  • 0-1 knapsack problem
  • UFL

16
Greedy heuristic
  • For a combinatorial problem
  • 1. Set S0 = ∅. Set t = 1.
  • 2. Choose jt as the element of N \ St-1 whose addition gives the greatest improvement in the cost function.
  • 3. If the previous solution St-1 is feasible, and the cost has not decreased, stop with SG = St-1.
  • 4. Otherwise set St = St-1 ∪ {jt}. If the solution is now feasible, and the cost function is nondecreasing, or t = n, stop with SG = St.
  • 5. Otherwise if t = n, no feasible solution has been found. Stop.
  • 6. Otherwise set t ← t + 1, and return to 2.
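The steps above can be sketched in Python for the UFL (the example problem of the next slide): a solution S is a set of open facilities, and the cost is the fixed costs plus the cheapest assignment of each client. The function names and the small instance in the usage note are illustrative, not taken from the slides.

```python
def ufl_cost(S, f, c):
    """Total cost of opening facilities in S: fixed costs plus the
    cheapest service cost for each client (infeasible if S is empty)."""
    if not S:
        return float("inf")
    return sum(f[j] for j in S) + sum(min(c[i][j] for j in S) for i in range(len(c)))

def greedy_ufl(f, c):
    """Greedy heuristic: repeatedly open the facility whose addition
    gives the lowest total cost; stop when the cost no longer decreases."""
    n = len(f)
    S, best = set(), float("inf")
    while True:
        # step 2: pick the facility whose addition gives the lowest total cost
        j_t, cost_t = min(
            ((j, ufl_cost(S | {j}, f, c)) for j in range(n) if j not in S),
            key=lambda p: p[1],
        )
        if cost_t >= best:  # steps 3-4: feasible and no improvement -> stop
            return S, best
        S.add(j_t)
        best = cost_t
        if len(S) == n:     # step 5: nothing left to add
            return S, best
```

For instance, with fixed costs [4, 3, 4] and service costs [[1, 3, 5], [5, 3, 1], [3, 1, 3]], the greedy opens only facility 1.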

17
Example (UFL)

18
Greedy heuristics for STSP
  • Nearest neighbor heuristic
  • starting from an arbitrary node, greedily
    construct a path out from that node
  • complete the path to a tour by adding the edge
    between the first node and the last node in the
    path
  • Pure greedy heuristic
  • starting from an empty set S0, choose a
    least-cost edge jt s.t. St is still part of a
    tour (St consists of a set of disjoint paths,
    until the last edge chosen forms a tour)
  • Nearest Insertion
    starting from a subtour, add the node not in the
    subtour that is closest to the subtour, together
    with the corresponding edges
  • drawbacks
  • requires complete graph
  • last edge can have very large weight
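A minimal sketch of the nearest neighbor heuristic, assuming the instance is given as a symmetric distance matrix (i.e. a complete graph, per the drawback noted above):

```python
def nearest_neighbor_tour(dist, start=0):
    """Greedily extend a path from `start` by always visiting the
    nearest unvisited node; closing the cycle back to `start` is implicit."""
    n = len(dist)
    tour = [start]
    unvisited = [j for j in range(n) if j != start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist[last][j])  # nearest neighbor
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour  # the edge tour[-1] -> tour[0] completes the tour

def tour_cost(dist, tour):
    """Cost of the closed tour, including the last edge back to the start."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
```

The last edge, dist[tour[-1]][tour[0]], is exactly the one the slide warns can have very large weight.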

19
Local search
  • combinatorial problems
  • g(S) ≥ 0 represents a measure of the infeasibility of set S
  • e.g. a constraint v(S) ≥ k can be handled by taking g(S) = max{0, k − v(S)}
  • For local search
  • solution S
  • local neighborhood Q(S) for each solution S
  • goal function f (S) = c(S) + αg(S)
  • Local search heuristic
  • Choose an initial solution S.
  • Search for a set S′ ∈ Q(S) with f (S′) < f (S).
  • If none exists, stop with a local optimum SH = S.
  • Otherwise set S = S′, and repeat.
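The heuristic above can be written as a generic first-improvement skeleton; `neighbors` and `f` are placeholders for the problem-specific Q(S) and goal function:

```python
def local_search(S, neighbors, f):
    """Generic local search: move to the first improving neighbor found,
    and stop at a local optimum S^H when no neighbor improves f."""
    while True:
        better = next((T for T in neighbors(S) if f(T) < f(S)), None)
        if better is None:
            return S       # local optimum: no S' in Q(S) with f(S') < f(S)
        S = better         # first-improvement move
```

As a toy usage, minimizing f(x) = x^2 over the integers with neighborhood {x − 1, x + 1} from x = 5 descends to the optimum 0.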

20
Choice of neighborhood
  • depends on the problem structure
  • simple neighborhood (variable-size feasible solutions)
  • just one element is added to or removed from S, i.e. Q(S) = {S′ : S′ = S ∪ {j} for j ∈ N \ S} ∪ {S′ : S′ = S \ {i} for i ∈ S}
  • only O(n) elements in a neighborhood
  • fixed-size feasible solutions
  • one element of S is replaced by another element not in S, i.e. Q(S) = {S′ : S′ = (S ∪ {j}) \ {i} for j ∈ N \ S and i ∈ S}
  • k-exchange: k elements of S are replaced by k other elements not in S, i.e. Q(S) = {S′ : S′ = (S ∪ {j1, ..., jk}) \ {i1, ..., ik} for j1, ..., jk ∈ N \ S and i1, ..., ik ∈ S}
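These neighborhoods can be enumerated directly; the helper names below are illustrative:

```python
from itertools import combinations

def add_drop_neighborhood(S, N):
    """Variable-size neighborhood: every set reachable from S by
    adding one element of N \\ S or removing one element of S."""
    S = set(S)
    return [S | {j} for j in N - S] + [S - {i} for i in S]

def k_exchange_neighborhood(S, N, k):
    """k-exchange neighborhood: replace k elements of S by
    k elements of N \\ S (k = 1 gives the fixed-size swap)."""
    S = set(S)
    return [(S - set(drop)) | set(add)
            for drop in combinations(sorted(S), k)
            for add in combinations(sorted(N - S), k)]
```

For N = {0, 1, 2, 3} and S = {0, 1}, the add/drop neighborhood has 4 sets (2 additions, 2 removals) and the 1-exchange neighborhood has 4 sets, all of size 2.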

21
2-exchange heuristic for STSP
  • For the STSP, there is no tour differing from an initial tour by a single edge.
  • If two edges are removed, there is exactly one other tour containing the remaining edges.
  • 2-exchange heuristic for STSP
  • A set S ⊆ E is a solution if the edges of S form a tour.
  • Q(S) = {S′ : S′ is a solution, |S′ ∩ S| ≥ n − 2}, where n = |V|.
  • f (S) = Σe∈S ce.
  • The resulting local search solution is called a 2-optimal tour.
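A common way to implement the 2-exchange for tours is segment reversal (2-opt): removing edges (a, b) and (c, d) and reconnecting as (a, c), (b, d) reverses the path between b and c. A sketch, assuming the instance is a distance matrix:

```python
def two_opt(tour, dist):
    """Repeat improving 2-exchanges until none exists; the result is
    a 2-optimal tour. `tour` is a list of nodes, closed implicitly."""
    n = len(tour)
    best = tour[:]
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            # j + 1 wraps around; the range skips pairs of adjacent edges
            for j in range(i + 2, n if i > 0 else n - 1):
                a, b = best[i], best[i + 1]
                c, d = best[j], best[(j + 1) % n]
                # replace edges (a,b),(c,d) by (a,c),(b,d) if that is cheaper
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                    best[i + 1 : j + 1] = reversed(best[i + 1 : j + 1])
                    improved = True
    return best
```

On a square with side length 10 and diagonal 14, the crossing tour (0, 1, 2, 3) of cost 48 is repaired to the perimeter tour of cost 40 by a single exchange.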

22
Example (STSP)
  • Nearest neighbor from 1: (1, 8, 9, 4, 10, 6, 2, 5, 7, 3, 1), cost = 318
  • Pure greedy: (1, 8, 9, 4, 10, 6, 2, 3, 7, 5, 1), cost = 323
  • Nearest insertion from (4, 9, 10, 4): (1, 8, 9, 6, 2, 10, 3, 7, 5, 4, 1), cost = 372
  • 2-exchange from the above tour: (1, 8, 9, 2, 6, 10, 3, 4, 7, 5, 1), cost = 325

23
General issues in local search
  • selection of a neighborhood
  • natural perturbation of a feasible solution
  • a perturbation has an order k: k-exchange for STSP generates neighborhoods of size O(n^k)
  • a strong neighborhood produces local optima whose quality is largely independent of that of the starting solutions
  • a weak neighborhood produces local optima whose quality is strongly related to that of the starting solutions
  • how to search the neighborhood
  • first-improvement → local optima found faster
  • steepest descent

24
Improving local search heuristics
  • Tabu search
  • move to the best solution in the neighborhood (a worse solution can be selected)
  • cycling may occur → maintain a tabu list
  • Simulated annealing
  • choose a neighbor randomly, but based on
    probability related to the value of solutions
  • in the long run, should converge to a good local
    minimum
  • Genetic algorithms
  • evolve the population (set of solutions) from
    one generation to the next generation
  • mimic biological operations (crossover, mutation)

25
Tabu search
  • 1. Initialize an empty tabu list.
  • 2. Get an initial solution S.
  • 3. While the stopping criterion is not satisfied:
    3.1. Choose a subset Q′(S) ⊆ Q(S) of non-tabu solutions.
    3.2. Let S′ = arg min{f (T) : T ∈ Q′(S)}.
    3.3. Replace S by S′ and update the tabu list.
  • 4. On termination, the best solution found is the heuristic solution.
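A minimal sketch of steps 1-4, keeping a fixed-length tabu list of recently visited solutions; the list length and the iteration-count stopping criterion are illustrative choices:

```python
from collections import deque

def tabu_search(S, neighbors, f, tabu_len=5, max_iter=50):
    """Tabu search: always move to the best non-tabu neighbor (possibly
    worse than S), and return the best solution seen overall."""
    best = S
    tabu = deque([S], maxlen=tabu_len)  # short-term memory against cycling
    for _ in range(max_iter):
        candidates = [T for T in neighbors(S) if T not in tabu]
        if not candidates:
            break
        S = min(candidates, key=f)      # best neighbor, even if worse
        tabu.append(S)
        if f(S) < f(best):
            best = S
    return best
```

On the toy problem of minimizing x^2 with neighborhood {x − 1, x + 1}, the search descends from 6 to 0 and the tabu list then stops it from cycling back immediately.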

26
Simulated annealing
  • 1. Get an initial solution S.
  • 2. Get an initial temperature T and a reduction factor r with 0 < r < 1.
  • 3. While not yet frozen, do the following:
    3.1. Perform the following loop L times:
      3.1.1. Pick a random neighbor S′ of S.
      3.1.2. Let Δ = f (S′) − f (S).
      3.1.3. If Δ ≤ 0, set S = S′.
      3.1.4. If Δ > 0, set S = S′ with probability e^(−Δ/T).
    3.2. Set T ← rT. (reduce the temperature)
  • 4. Return the best solution found.
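The steps above, sketched in Python; the initial temperature, reduction factor r, loop length L, and the freezing threshold are illustrative settings:

```python
import math
import random

def simulated_annealing(S, random_neighbor, f, T=10.0, r=0.9, L=100, T_min=1e-3):
    """Simulated annealing: accept improving moves always, worsening moves
    with probability exp(-delta/T); cool geometrically until frozen."""
    best = S
    while T > T_min:                      # step 3: not yet frozen
        for _ in range(L):                # step 3.1
            S_new = random_neighbor(S)    # 3.1.1 pick a random neighbor
            delta = f(S_new) - f(S)       # 3.1.2
            # 3.1.3 / 3.1.4: accept if improving, else with prob e^(-delta/T)
            if delta <= 0 or random.random() < math.exp(-delta / T):
                S = S_new
                if f(S) < f(best):
                    best = S
        T *= r                            # 3.2 reduce the temperature
    return best                           # step 4: best solution found
```

As a toy usage, minimizing (x − 3)^2 over the integers with random ±1 moves from x = 20 converges to the minimum at 3 for essentially any random seed.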

27
Genetic algorithms
  • For each iteration (generation),
  • Evaluation: evaluate the fitness of the individuals.
  • Parent Selection: select certain pairs of solutions (parents) based on their fitness.
  • Crossover: combine each pair of parents to produce one or two new solutions (offspring).
  • Mutation: modify some of the offspring randomly.
  • Population Selection: based on the fitness, a new population is selected, replacing some or all of the original population by an identical number of offspring.
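One generation of these five operations can be sketched as follows for bit-string solutions; the specific choices (binary tournament selection, one-point crossover, bit-flip mutation, generational replacement with elitism, and the rates) are illustrative, not prescribed by the slides:

```python
import random

def genetic_algorithm(fitness, n_bits, pop_size=30, generations=60,
                      crossover_rate=0.9, mutation_rate=0.02):
    """GA sketch: evaluation, tournament parent selection, one-point
    crossover, bit-flip mutation, and elitist population replacement."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Evaluation + Parent Selection: binary tournament on fitness
        def select():
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select(), select()
            # Crossover: one-point, producing two offspring
            if random.random() < crossover_rate:
                cut = random.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            # Mutation: flip each bit with a small probability
            for c in (c1, c2):
                for i in range(n_bits):
                    if random.random() < mutation_rate:
                        c[i] ^= 1
                offspring.append(c)
        # Population Selection: replace the population, keeping the elite
        elite = max(pop, key=fitness)
        pop = offspring[:pop_size - 1] + [elite]
    return max(pop, key=fitness)
```

A standard toy usage is "onemax": with fitness = sum, the GA evolves bit strings toward all ones.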

28
Today
  • Greedy heuristics
  • Local search
  • Improving local search
  • Tabu search
  • Simulated annealing
  • Genetic algorithms