1
GRASP/VNS hybrid for the Strip Packing Problem
  • Jesús David Beltrán, José Eduardo Calderón,
    Rayco Jorge Cabrera, José A. Moreno Pérez and J.
    Marcos Moreno-Vega
  • Intelligent Computation Group
  • University of La Laguna.
  • http://webpages.ull.es/users/gci/
  • gci@ull.es
  • TIC2002-04242-C03-01 (70% FEDER)
  • HM 2004 - ECAI 2004, Valencia, August 22-23

2
Abstract
  • We consider a hybrid metaheuristic (HM) to solve
    the SPP.
  • SPP = Strip Packing Problem.
  • The hybrid algorithm uses a VNS to improve the
    quality of part of the solution obtained with a
    GRASP.
  • VNS = Variable Neighbourhood Search.
  • GRASP = Greedy Randomized Adaptive Search
    Procedure.
  • The results of the HM (GRASP/VNS) are compared with
    those existing in the literature for the SPP.

3
The Problem
  • The Strip Packing Problem (SPP) consists of
    packing a set of rectangles in a strip
    minimizing the height of the packing
  • the rectangles can be rotated by 90°,
  • the cuts can be non-guillotine.
  • (a cut is guillotine if it goes from one side
    of the object to the opposite side; in a
    non-guillotine cut this need not hold)

4
Literature
  • Many researchers have tackled these problems.
  • Lodi, Martello and Monaci (2002) discuss
    mathematical models, and survey lower bounds,
    classical approximation algorithms, recent
    heuristic and metaheuristic methods and exact
    enumerative approaches for the two-dimensional
    packing problems.
  • Hopper and Turton (2001) have a review of
    metaheuristic algorithms to solve two-dimensional
    strip packing problems.
  • A Genetic approach was proposed for the oriented
    two-dimensional strip packing problem in Yeung
    and Tang (2004).

5
Packing Problems
  • Packing problems involve constructing an
    arrangement of items that minimizes the total
    space required by the arrangement.
  • These problems are encountered in many areas of
    business and industry.
  • Some applications can be found in wood, glass,
    paper, textile or leather industries.
  • It must be noted that small improvements can be
    very valuable in industrial applications of
    these problems.

6
The Hybrid Metaheuristics
  • The Hybrid
  • Use VNS to improve the solution provided by
    GRASP.
  • VNS operates on a part of the solution given by
    GRASP.
  • VNS attempts to improve the packing of the last
    rectangles introduced into the solution.

7
The Strip Packing Problem (SPP)
  • The Strip Packing Problem (SPP). Consider:
  • a strip of fixed width w and infinite height,
  • a finite set of rectangles, each with at least one
    side not larger than w.
  • The SPP consists of packing the rectangles in the
    strip minimizing the total height h.
  • (The rectangles can be rotated by 90 degrees.)

8
The data of the SPP
  • The elements of an instance of
  • the Strip Packing Problem (SPP) are
  • A strip with fixed width w and infinite height.
  • A set of n rectangles
  • Ω = { R(wi, hi) : i = 1, ..., n }.
  • Each rectangle R(wi, hi) has
  • width wi and height hi, verifying
  • min{wi, hi} ≤ w.

9
The solutions of the SPP
  • The solutions are given by the placements
  • (ri, ai, bi), i = 1, ..., n,
  • of the rectangles R(wi, hi), i = 1, ..., n,
  • where:
  • ri indicates whether R(wi, hi) is rotated or not:
  • if ri = 1 then R(wi, hi) is rotated by 90
    degrees;
  • otherwise ri = 0 and it is not rotated.
  • (ai, bi) are the coordinates of the position of
    the bottom-left corner of the rectangle R(wi, hi)
    with respect to the origin of coordinates (the
    bottom-left corner of the strip).

10
Representing Solutions
  • Representing solutions: (ri, ai, bi), i = 1, ..., n

11
The feasible placings
  • The placing of each rectangle is feasible if
  • the following two conditions are verified:
  • the rectangle is completely included in the strip,
    and
  • it does not overlap with any other rectangle
    already placed.
  • In other words:
  • If ri = 0 then 0 ≤ ai ≤ w − wi, 0 ≤ bi, and
  • the position of no other rectangle R(wj, hj)
    verifies
  • ai < aj < ai + wi and bi < bj < bi + hi.
  • If ri = 1 then 0 ≤ ai ≤ w − hi, 0 ≤ bi, and
  • the position of no other rectangle R(wj, hj)
    verifies
  • ai < aj < ai + hi and bi < bj < bi + wi.
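The conditions above can be sketched in Python as a general axis-aligned feasibility test; the function and tuple layout below are our own illustration, not the paper's code:

```python
def dims(w, h, rotated):
    """Effective (width, height) after an optional 90-degree rotation."""
    return (h, w) if rotated else (w, h)

def feasible(strip_w, placed, w, h, rotated, a, b):
    """Check the two slide conditions: the rectangle lies completely
    inside the strip and overlaps no already-placed rectangle.

    `placed` is a list of tuples (w, h, rotated, a, b), where (a, b)
    is the bottom-left corner of each placed rectangle.
    """
    ew, eh = dims(w, h, rotated)
    if not (0 <= a <= strip_w - ew and 0 <= b):
        return False            # not completely inside the strip
    for (wj, hj, rj, aj, bj) in placed:
        ewj, ehj = dims(wj, hj, rj)
        # two axis-aligned rectangles overlap iff they overlap on both axes
        if a < aj + ewj and aj < a + ew and b < bj + ehj and bj < b + eh:
            return False
    return True
```

Touching edges are allowed (strict inequalities), matching the strict `<` conditions on the slide.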

12
Unfeasible Solutions
  • Unfeasible solutions

13
The feasible placings
  • The placing of each rectangle is feasible if
  • the following two conditions are verified:
  • the rectangle is completely included in the strip,
    and
  • it does not overlap with any other rectangle
    already placed.
  • In other words:
  • If ri = 0 then 0 ≤ ai ≤ w − wi, 0 ≤ bi, and
  • the position of no other rectangle R(wj, hj)
    verifies
  • ai < aj < ai + wi and bi < bj < bi + hi.
  • If ri = 1 then 0 ≤ ai ≤ w − hi, 0 ≤ bi, and
  • the position of no other rectangle R(wj, hj)
    verifies
  • ai < aj < ai + hi and bi < bj < bi + wi.

14
The objective function
  • The objective to be minimized is
  • the maximum height h
  • reached by the rectangles.
  • It is computed by
  • h = max{ max over ri = 0 of (bi + hi),
  •          max over ri = 1 of (bi + wi) }.
  • Equivalently, the objective is
  • to minimize the area of the strip used.
  • It is computed by the product
  • h · w.
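A minimal Python sketch of this objective, reusing the illustrative tuple layout (w, h, rotated, a, b):

```python
def packing_height(placed):
    """Height h of a packing: the maximum of b + effective height.

    A rotated rectangle contributes b + w, an unrotated one b + h,
    exactly as in the slide's formula.
    """
    return max(b + (w if rotated else h)
               for (w, h, rotated, a, b) in placed)
```

The used strip area is then simply `packing_height(placed) * strip_w`.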

15
The Infinite set of Solutions
  • Infinitely many equivalent feasible solutions
  • could be obtained by
  • shifting horizontally or vertically
  • some rectangles.
  • To avoid this,
  • consider only solutions obtained by
  • introducing successively the rectangles
  • following a given placement strategy.

16
The Bottom-left placing
  • We use the usual bottom-left strategy
  • each rectangle is placed
  • at the deepest location
  • and, within it,
  • at the leftmost possible location.
  • Each feasible solution of the problem
  • is determined by
  • the order in which the rectangles
  • are introduced in the strip.

17
The Solution Space
  • Given the bottom-left strategy
  • each rectangle is placed at the deepest location
    and,
  • within it, at the leftmost possible location.
  • The space of solutions consists of
  • the permutations of the numbers 1, 2, ..., n
  • together with a binary vector (r1, ..., rn)
  • that represents if each rectangle is rotated or
    not.
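A random point of this solution space can be sketched in Python; the function name is ours, not the paper's:

```python
import random

def random_solution(n, rng=random):
    """A solution: a permutation of 1..n (the insertion order for the
    bottom-left strategy) plus a binary rotation vector (r1, ..., rn)."""
    perm = list(range(1, n + 1))
    rng.shuffle(perm)                            # random insertion order
    rots = [rng.randint(0, 1) for _ in range(n)] # 1 = rotated 90 degrees
    return perm, rots
```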

18
High Quality Solutions
  • Small waste areas
  • Smooth upper contour

19
The GRASP metaheuristic
  • GRASP = Greedy Randomized Adaptive Search
    Procedure.
  • The GRASP metaheuristic has two phases
  • a constructive phase, that constructs a good
    initial solution x.
  • an improving phase, that is applied to the
    initial solution x.
  • (the improving phase consists of a restricted
    VNS)
  • Both phases are repeated until the stopping
    criterion is met.
  • Resende and Ribeiro, 2003

20
The Constructive Method of GRASP
  • A constructive method:
  • items are iteratively added to an initially empty
  • structure until a solution of the problem is
    obtained.
  • A heuristic constructive procedure:
  • the choice of the item is based on heuristic
    evaluation(s), i.e.,
  • measure(s) of the convenience of considering
  • the item as belonging to the solution.
  • An adaptive heuristic constructive algorithm:
  • the evaluation depends on the items already in
    the solution.
  • The greedy strategy:
  • choose the item that optimizes the evaluation
    (the best item).
  • A randomized rule:
  • use random numbers in the choice.

21
The GRASP rule
  • Choose (at random) one of the best items.
  • The evaluations are based on:
  • the wasted areas,
  • the smoothness of the upper contour.
  • The best items are those that best fit the
    upper contour.
  • Construct a Restricted Candidate List (RCL) of the
    best items, then choose one item from the list.

22
The GRASP pseudocode
  • The constructive phase:
  • Let x(0) = ∅ be the initial empty partial
    solution.
  • Let t ← 0.
  • Repeat the steps:
  • Construct the restricted candidate list RCL(t).
  • Choose at random an element e(t+1) of RCL(t).
  • Update the partial solution x(t+1) ← x(t) ∪ { e(t+1) }.
  • Set t ← t + 1.
  • until RCL(t) is empty.
  • The post-processing phase:
  • Apply a VNS to the last k packed rectangles.
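The constructive phase above can be sketched generically in Python; `build_rcl`, `add`, and `empty` are illustrative placeholders for problem-specific pieces:

```python
import random

def grasp_construct(items, build_rcl, add, empty, rng=random):
    """Generic GRASP constructive phase following the pseudocode:
    repeatedly build RCL(t), pick a random element of it, and extend
    the partial solution, until the RCL is empty.

    `build_rcl(partial, remaining)` returns the candidate list;
    `add(partial, e)` returns the partial solution extended with e.
    """
    partial = empty
    remaining = list(items)
    while True:
        rcl = build_rcl(partial, remaining)
        if not rcl:                 # stop when RCL(t) is empty
            return partial
        e = rng.choice(rcl)         # random choice among the best items
        partial = add(partial, e)
        remaining.remove(e)
```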

23
Representing the upper contour
  • The upper contour is a series of segments from
    left to right.
  • It is represented by the coordinates of the
    segments
  • (the origin is the bottom-left corner of the
    strip):
  • C = { (yi, x1i, x2i) : i = 1, ..., c }, where, for
    the i-th segment,
  • yi is the y-coordinate (height) of its points,
    and
  • [x1i, x2i] is the interval of their
    x-coordinates.
  • C = { (y1, x11, x21), (y2, x12, x22), ..., (yi, x1i,
    x2i), ..., (yc, x1c, x2c) }
  • This sequence verifies:
  • x11 = 0 and x2c = w,
  • x1(i+1) = x2i, for each i = 1, ..., c−1,
  • y(i+1) ≠ yi, for each i = 1, ..., c−1.
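This representation maps directly onto a list of (y, x1, x2) tuples; the two helpers below are our illustration of the invariants and of finding the lowest segment:

```python
def check_contour(contour, w):
    """Verify the slide's invariants for a contour [(y, x1, x2), ...]:
    it spans [0, w], consecutive segments abut, and consecutive
    segments have different heights."""
    assert contour[0][1] == 0 and contour[-1][2] == w
    for (ya, _, x2a), (yb, x1b, _) in zip(contour, contour[1:]):
        assert x1b == x2a and yb != ya

def lowest_segment(contour):
    """Index j of the lowest segment (where the next rectangle is tried)."""
    return min(range(len(contour)), key=lambda i: contour[i][0])
```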

24
Representation of upper contour
(y1,x11,x21)
  • y1
  • y4
  • y3
  • y2

(y4,x14,x24)
(y3,x13,x23)
(y2,x12,x22)
0 x11 x21 x12 x22 x13
x23 x14 x24 0
25
Computing the RCL
  • At any iteration t, the RCL(t) is constructed as
    follows.
  • Let Ω1(t) be the inserted rectangles and Ω2(t) =
    Ω \ Ω1(t).
  • Let C(t) = { (yi, x1i, x2i) : i = 1, ..., c } be the
    contour at iteration t.
  • Let sj be the lowest segment of C(t) (minimum
    height).
  • The rectangles in Ω2(t) are evaluated by their
    adjustment to sj.
  • Let yj = min{ y1, y2, ..., yc } and let l = x2j −
    x1j be its length.
  • Given α ∈ [0, 1], RCL(t) is built as follows:
  • RCL(t) = { R(wi, hi) ∈ Ω2(t) : l − α·l ≤ wi ≤ l or
    l − α·l ≤ hi ≤ l }.
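A short Python sketch of this RCL rule, assuming the reconstructed condition that a side must fit the lowest segment within tolerance α (the function name is ours):

```python
def build_rcl(unpacked, seg_len, alpha):
    """RCL of rectangles (w, h) with a side that fits the lowest
    segment of length seg_len within tolerance alpha:
    (1 - alpha) * seg_len <= side <= seg_len."""
    lo = (1 - alpha) * seg_len
    return [(w, h) for (w, h) in unpacked
            if lo <= w <= seg_len or lo <= h <= seg_len]
```

With α = 0 only rectangles fitting the segment exactly qualify; larger α admits looser fits, trading greediness for diversity.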

26
Updating the upper contour
(figure: contour segments (y(j−1), x1(j−1), x2(j−1)), (yj, x1j, x2j), (y(j+1), x1(j+1), x2(j+1)); the lowest segment [x1j, x2j] is raised and merged with a neighbouring segment)
27
Update the RCL
  • If there is no rectangle to choose (with some
    side not larger than l), then
  • the region [x1j, x2j] × [yj, min{y(j−1), y(j+1)}]
    becomes a wasted area.
  • If y(j+1) < y(j−1) (the other case is similar), then
    replace
  • the segments (yj, x1j, x2j) and
    (y(j+1), x1(j+1), x2(j+1)) by (y(j+1), x1j, x2(j+1)).
  • (The size of the contour decreases, i.e.,
    |C(t+1)| = |C(t)| − 1.)
  • Otherwise,
  • if R(wi, hi) is the selected rectangle of RCL(t),
    then
  • replace (yj, x1j, x2j) by (yj + hi, x1j,
    x1j + wi) and (yj, x1j + wi, x2j).
  • (The size of the contour increases, i.e.,
    |C(t+1)| = |C(t)| + 1.)
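The second case (placing a rectangle and splitting the lowest segment) can be sketched in Python; this is our simplified illustration, omitting the merge of equal-height neighbours:

```python
def place_on_lowest(contour, rect_w, rect_h):
    """Place a rectangle at the left end of the lowest segment of
    `contour` (a list of (y, x1, x2) tuples), splitting the segment
    as on the slide. Assumes rect_w fits in the segment.

    Returns the updated contour and the (a, b) position used.
    """
    j = min(range(len(contour)), key=lambda i: contour[i][0])
    y, x1, x2 = contour[j]
    assert rect_w <= x2 - x1, "rectangle wider than the lowest segment"
    new = [(y + rect_h, x1, x1 + rect_w)]     # raised segment on top
    if x1 + rect_w < x2:
        new.append((y, x1 + rect_w, x2))      # leftover of the old segment
    return contour[:j] + new + contour[j + 1:], (x1, y)
```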

28
The VNS Metaheuristics
  • Variable Neighbourhood Search (VNS) is a recent
    metaheuristic based on systematic change of the
    neighbourhood in a search.
  • Usual heuristic searches are based on
    transformations of solutions that determine a
    neighbourhood structure on the solution space.
  • A neighbourhood structure in a solution space S is
    a mapping
  • N : S → 2^S, x ↦ N(x).
  • The solutions y ∈ N(x) are the neighbour
    solutions of x
  • and constitute its neighbourhood N(x).
  • VNS uses a series of neighbourhoods Nk, k = 1, ...,
    kmax.
  • A local search takes a better neighbor while
    possible.
  • The basic idea of the VNS is to change the
    neighbourhood structure when the local search is
    trapped on a local minimum.

29
VNS Variable Neighbourhood Search
  • Initially take k ← 1 and a random initial
    solution x.
  • A local search from x with the neighbourhood N1
    returns x.
  • These steps are repeated until the stopping
    condition is met:
  • Take k ← k + 1 and a random neighbour y of the local
    optimum x in the neighbourhood Nk, i.e., y
    ∈ Nk(x).
  • Apply the local search from y using the
    neighbourhood N1.
  • If the local minimum y′ found is better than x,
    then the process is iterated with x ← y′ and k ←
    1.
  • Otherwise, iterate the process with the current
    solution x.
  • When k = kmax, generate a new initial solution x
    and iterate the process with k ← 1.
  • (Hansen and Mladenović, 2001, 2003, 2004)
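The loop above can be sketched generically in Python; the helper names are illustrative, and the restart at k = kmax is simplified to wrapping k back to 1 while keeping the best solution found:

```python
import random

def basic_vns(x0, shake, local_search, cost, k_max, iters, rng=random):
    """Basic VNS sketch: shake the incumbent with a size-k move,
    descend with local search, reset k on improvement, otherwise
    grow k (wrapping at k_max)."""
    best = x = local_search(x0)
    k = 1
    for _ in range(iters):
        y = local_search(shake(x, k, rng))   # shake, then descend
        if cost(y) < cost(x):
            x, k = y, 1                      # improvement: recentre, reset k
        else:
            k = k + 1 if k < k_max else 1    # change neighbourhood size
        if cost(x) < cost(best):
            best = x
    return best
```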

30
VND Variable Neighbourhood descent
  • Given a neighbourhood structure, an improving
    local search iteratively seeks a better
    solution in the neighbourhood of the current
    solution; therefore it is trapped in a local
    optimum with respect to the current neighbourhood
    structure.
  • The VND (Variable Neighborhood Descent) method
    changes the neighbourhoods Nk each time a local
    optimum is reached.
  • It ends when there is no improvement with any of
    the neighbourhoods.
  • The final solution provided by the algorithm
    should be a local optimum with respect to all
    kmax neighbourhoods.

31
A simple VNS
  • A VNS can be implemented by the combination of
    series of random and improving (local) searches.
  • When the improving local search stops at a local
    minimum, a shake procedure performs a random
    search for a new starting point for a new local
    search.
  • The improving local search and the random shake
    procedure are usually based on standard moves
    that determine neighbourhood structures.
  • A basic local search consists of applying an
    improving move until no such move exists.
  • A simple shake procedure consists of applying a
    number of random moves.
  • The stopping condition in a VNS may be maximum
    CPU time allowed, maximum number of iterations,
    or maximum number of iterations between two
    improvements.

32
The General VNS
  • The GVNS (General Variable Neighborhood Search)
    method applies two (possibly different) series of
    neighbourhoods
  • one for the shaking and one for the descent.
  • Several applications use the same set of
    neighbourhood structures for shaking and descent.
  • The neighbourhoods are often nested and based on
    a single standard move.
  • Given a base neighbourhood structure N : S → 2^S,
    the series of nested neighbourhood structures is
    defined as follows.
  • The first neighbourhoods are the basic ones:
    N1(x) = N(x),
  • and the next neighbourhoods are defined
    recursively by N(k+1)(x) = N(Nk(x)).
  • Nk(x) consists of the solutions that can be
    obtained from x by k base moves.

33
The Nested VNS
  • Initialization:
  • Find an initial solution x.
  • Set x* ← x and k ← 1.
  • Iterations:
  • Repeat the following until the stopping
    condition is met:
  • Shake:
  • apply k random moves to the solution x* to get
    x′.
  • Local Search:
  • apply an improving move to the solution x′
    until a local minimum x″ is found.
  • Improve or not:
  • if x″ is better than x*, do x* ← x″ and k ← 1;
  • otherwise do k ← k + 1.

34
Further details
  • For the local search we use the greedy strategy:
    at the Local Search step an iterative procedure
    tests all the base moves, and the one providing
    the best solution is applied, until no improving
    move exists.
  • The shake consists of applying k random base
    moves.
  • This number of random moves is the size of the
    shake.
  • The base move for problems with solutions
    represented by permutations is the interchange
    move, which consists in exchanging the positions
    of two elements of the permutation.
  • The random generation of solutions is performed
    by successively selecting one of the remaining
    elements with equal probability.
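The interchange move, shake, and greedy best-improvement descent on permutations described above can be sketched in Python (function names are ours):

```python
import random

def interchange(perm, i, j):
    """Base move: swap the elements at positions i and j."""
    p = list(perm)
    p[i], p[j] = p[j], p[i]
    return p

def shake(perm, k, rng=random):
    """Apply k random interchange moves (k is the size of the shake)."""
    for _ in range(k):
        i, j = rng.sample(range(len(perm)), 2)
        perm = interchange(perm, i, j)
    return perm

def local_search(perm, cost):
    """Greedy descent: test all interchange moves, apply the best
    improving one, repeat until no improving move exists."""
    improved = True
    while improved:
        improved = False
        best, best_c = perm, cost(perm)
        for i in range(len(perm)):
            for j in range(i + 1, len(perm)):
                cand = interchange(perm, i, j)
                c = cost(cand)
                if c < best_c:
                    best, best_c, improved = cand, c, True
        perm = best
    return perm
```

For the SPP, `cost` would decode the permutation with the bottom-left rule and return the packing height.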

35
The Hybrid approach
  • The hybrid approach proposed is obtained by using
    a nested Variable Neighborhood Search in the
    post-processing phase of the above GRASP.
  • The GRASP constructs a solution by inserting at
    random a rectangle of the RCL; it gets small
    interior waste.
  • The VNS gets a better packing of the last packed
    rectangles; it increases the smoothness of the
    upper contour.

36
The Hybrid approach
  • The last packed rectangle(s) by GRASP

37
The VNS/GRASP hybrid
  • Apply the constructive phase of the GRASP.
  • Let R1, R2, ..., Rn be the solution provided by
    the GRASP.
  • Let R′1, R′2, ..., R′k be the last k rectangles
    of the solution.
  • Apply the nested VNS on the solution space
    consisting of
  • all the permutations of the rectangles R′1, R′2,
    ..., R′k
  • to be packed in the free strip.
  • Let R″1, R″2, ..., R″k be the best permutation
    obtained by the nested VNS.
  • Return the solution R1, R2, ..., R(n−k), R″1, R″2,
    ..., R″k.
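The hybrid composition reduces to a few lines; the sketch below uses illustrative placeholder callables, not the paper's API:

```python
def grasp_vns_hybrid(grasp_construct, vns_improve, k):
    """Hybrid sketch: GRASP builds the full packing order; the nested
    VNS then re-optimizes only the order of the last k rectangles."""
    order = grasp_construct()              # R1, ..., Rn
    head, tail = order[:-k], order[-k:]    # keep R1..R(n-k) fixed
    return head + vns_improve(tail)        # repack the last k rectangles
```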

38
Computational Experiments
  • Tuning the parameter α for the constructive phase
    of the GRASP:
  • the parameter that controls the fit of the
    rectangles to the lowest segment of the upper
    contour when constructing the RCL.
  • Randomly generated packing instances were solved.
  • Comparison of GRASP/VNS with SABLF:
  • SABLF is a Simulated Annealing with the
    Bottom-Left rule proposed by Hopper and Turton
    (2001).
  • GRASP(/VNS) is very much faster.
  • Set of instances from Hopper and Turton (2001).
  • Comparison of GRASP with GRASP/VNS:
  • does VNS significantly improve GRASP?
  • GRASP/VNS improves GRASP for large instances.
  • Set of instances from Hopper and Turton (2001).

39
Tuning the parameter α
  • We fixed several values for α: 0, 0.1, 0.5.
  • Each method was run 5 times on the randomly
    generated instances.
  • The output variable was the average objective
    value over the niter = 40 executions of the
    constructive phase.
  • Therefore we considered three treatments: T(0),
    T(0.1) and T(0.5).
  • The previous normality and variance-equality
    tests were negative.
  • We applied the Friedman nonparametric test (Daniel,
    1990).
  • When the null hypothesis of equality among
    treatments was rejected,
  • we applied the Friedman multiple comparison
    tests (Daniel, 1990, page 274)
  • to obtain the significance of the differences
    among treatments.

40
Results for tuning α
41
Tuning α with big instances
42
Test instances for the comparative analysis
  • The instances used in
  • Hopper and Turton
  • (2002).
  • Data are available in
  • the OR-Library at
  • http://mscmga.ms.ic.ac.uk/jeb/orlib/stripinfo.html

43
Comparison with SABLF
44
Comparison for hybridization
45
Conclusions
  • GRASP and GRASP/VNS have better performance than
    SABLF:
  • they provide a great decrease in running time
    (for instances in C7 it goes from 4181
    minutes to 1.37 seconds or less).
  • GRASP and GRASP/VNS have comparable or better
    efficacy than SABLF.
  • GRASP/VNS has a slightly better minimum objective
    value than GRASP:
  • in the majority of the problems it gives a better
    minimum value than GRASP.
  • GRASP/VNS is more robust than GRASP:
  • the difference between the maximum and minimum
    objective values is smaller in GRASP/VNS than in
    GRASP.
  • GRASP/VNS is more useful when the number of
    rectangles increases (the number of improvements
    increases with the size of the problem).
  • (This trend must be studied with new
    computational experiments.)

46
GRASP/VNS hybrid for the Strip Packing Problem
THANKS
  • Jesús David Beltrán, José Eduardo Calderón,
    Rayco Jorge Cabrera, José A. Moreno Pérez and J.
    Marcos Moreno-Vega
  • Intelligent Computation Group
  • University of La Laguna.
  • http://webpages.ull.es/users/gci/
  • gci@ull.es
  • Project TIC2002-04242-C03-01 (70% FEDER)