Convex and Nonconvex Relaxation Approaches

Transcript and Presenter's Notes
1
Convex and Nonconvex Relaxation Approaches
2
Topology Optimization: State of the Art
  • Various approaches have been proposed for solving
    topology optimization problems in imaging and
    engineering applications
  • Particular success has been achieved with
  • Level Set Methods
  • Topological Asymptotics
  • Ad-hoc approximations (e.g. SIMP / RAMP)

3
Topology Optimization: State of the Art
  • In particular combinations of level set methods
    and topological asymptotics have become a
    powerful tool, which can be tuned to fast local
    convergence in many applications
  • mb-Hackl-Ring 04, Allaire et al 05-08,
    Amstutz-Andrä 05, Wang et al 05-08, Hackl 07, He
    et al 06, Fulmanski et al 08
  • In some respects, there are still drawbacks

4
Topology Optimization: State of the Art
  • Disadvantages
  • Volume constraints are difficult (level set methods
    approximate signed distance functions, no
    continuous dependence of the volume)
  • Linear constraints and simple relations for the
    indicator function are destroyed (significant
    nonlinearity in level set methods)
  • Methods converge only locally even for simple
    objectives (possible global convergence by
    topological derivatives, but difficult to check
    and with potentially high effort)

5
Indicator Function
  • In some (few) cases it may be desirable to have
    different approaches based on (approximations of)
    the indicator function (sketched below)
  • Non-convex constraint, since the indicator function
    only takes the values zero and one
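For reference, a standard way to write this, with D denoting the material / object region inside the domain Omega (notation assumed here), is

    u(x) = \chi_D(x) = \begin{cases} 1, & x \in D, \\ 0, & x \in \Omega \setminus D, \end{cases}
    \qquad u(x) \in \{0,1\} \quad \text{a.e. in } \Omega .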

6
Example I: Imaging
  • In segmentation, difficulties often appear if only
    local minimizers are computed (objects not
    found, not characterized well)
  • Example: piecewise constant Mumford-Shah model
    (Chan-Vese model), sketched below; J is the
    perimeter regularization (total variation)
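A hedged reconstruction of the model meant here, in its standard form (f the image, c1 and c2 the gray values, lambda the regularization weight; notation assumed):

    \min_{u \in \{0,1\},\; c_1, c_2}\;
        \int_\Omega u\,(f - c_1)^2 + (1 - u)\,(f - c_2)^2 \, dx
        \;+\; \lambda\, J(u) ,
    \qquad J(u) = |Du|(\Omega) .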

7
Example I: Imaging
  • Numerical evidence: local minimizers
  • Chan-Vese algorithm seems to compute global
    minimizers
  • (exact minimization w.r.t. the constants alternated
    with a gradient step), but only with a globally
    smeared Dirac delta, e.g. the one recalled below
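One globally supported smeared delta of the kind referred to here is the choice from the original Chan-Vese paper (epsilon a smoothing parameter),

    \delta_\varepsilon(s) = \frac{\varepsilon}{\pi\,(\varepsilon^2 + s^2)} ,
    \qquad
    H_\varepsilon(s) = \frac{1}{2}\Bigl( 1 + \frac{2}{\pi} \arctan\frac{s}{\varepsilon} \Bigr) ,

i.e. the derivative of a regularized Heaviside function whose support is the whole real line.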

8
Example II: Local Stress Constraints
  • Structural topology optimization with local
    stress constraints
  • Subject to pointwise stress constraints in the
    material region; similar for the von Mises stress

9
Example II Local Stress Constraints
  • The bilinear constraint creates enormous trouble
  • In particular, no constraint qualification (e.g.
    Slater) holds!
  • Linear reformulation (Stolpe-Svanberg 02)

10
Relaxation
  • It is attractive to introduce a relaxation of
    this constraint, i.e. to look for a function u with
    0 ≤ u(x) ≤ 1 instead of u(x) ∈ {0,1}.
    How can we maintain the connection with the
    original optimization problem?

11
Relaxation
  • Basically there are two options
  • Convex relaxation: obtain some reformulation of the
    original optimization problem and try to relate its
    solutions to those of the original problem
  • Nonconvex relaxation: penalize deviations from the
    desired values, e.g. by a double-well term as
    sketched below
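A typical penalty of this kind, consistent with the Modica-Mortola approximation used later, is a double-well term that vanishes exactly at the desired values 0 and 1 (epsilon a penalty parameter; the specific choice is an assumption):

    \frac{1}{\varepsilon} \int_\Omega u(x)^2\,\bigl(1 - u(x)\bigr)^2 \, dx .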

12
Relaxation
  • Convex relaxation needs special structures and
    special investigations in order to be applied. If
    it can be applied, it is possible to compute
    global minima
  • Nonconvex relaxation can be applied in a
    universal way. It does not in general help with
    global minimization, but at least continuation in
    the penalization parameter is possible

13
Nonconvex Relaxation
  • Replace minimization of H subject to the 0-1
    constraint by a penalized, unconstrained problem.
    Note: this is interesting only if H penalizes
    oscillations, otherwise the complexity explodes
    (infinite-dimensional combinatorial optimization)
  • Typically via a perimeter constraint

14
Nonconvex Relaxation
  • A further approximation of the perimeter term is
    possible
  • For an appropriate scaling of the penalty term P,
    this approach yields Gamma-convergence to the
    original variational problem as epsilon tends to
    zero (cf. Modica-Mortola 87); see the sketch below
  • The resulting problems have a simple structure
    (quadratic regularization), but are
    parameter-dependent
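A hedged reconstruction of the approximating term meant here, in the classical Modica-Mortola form (the scaling and the choice of double well are assumptions):

    P_\varepsilon(u) = \int_\Omega \Bigl( \varepsilon\,|\nabla u|^2
        + \frac{1}{\varepsilon}\, u^2 (1 - u)^2 \Bigr)\, dx ,

which Gamma-converges, as epsilon tends to zero, to c_0 Per({u = 1}; Omega) on 0-1 valued functions, with a constant c_0 > 0 depending only on the double-well potential.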

15
Nonconvex Relaxation
  • The parameter dependence is exploited in two
    instances
  • Discretization: adaptivity is needed to resolve the
    arising interfaces, which have a width of order
    epsilon
  • Continuation: convex problems for large epsilon;
    decrease epsilon to obtain optimal topologies

16
Nonconvex Relaxation
  • Structural topology with local stress
    constraints

17
Nonconvex Relaxation
  • Structural topology with local stress
    constraints
  • Quadratic objective functional, linear inequality
    and equality constraints (huge number)
  • Numerical solution (mb-Stainko 06, Stainko 06)
  • FE discretization in NGSolve; minimization with the
    interior point code IPOPT
  • Parameter-robust multigrid preconditioning and
    iterative solution of the linear systems in each
    iteration step of IPOPT
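The slide names the actual toolchain (NGSolve and IPOPT with multigrid preconditioning). Purely as an illustration of the stated problem structure, here is a toy sketch that solves a small quadratic objective with linear inequality constraints and box constraints, using SciPy's interior-point-style trust-constr method as a stand-in for IPOPT; all sizes and data are made up and this is not the authors' code.

import numpy as np
from scipy.optimize import minimize, LinearConstraint, Bounds

rng = np.random.default_rng(0)
n, m = 20, 5                             # made-up problem sizes
M = rng.standard_normal((n, n))
Q = M.T @ M + np.eye(n)                  # symmetric positive definite -> convex quadratic
c = rng.standard_normal(n)
A = rng.standard_normal((m, n))          # toy inequality constraints A x <= 1

def objective(x):
    return 0.5 * x @ Q @ x + c @ x

def gradient(x):
    return Q @ x + c

result = minimize(objective, np.full(n, 0.5), jac=gradient,
                  method="trust-constr",
                  constraints=[LinearConstraint(A, -np.inf, np.ones(m))],
                  bounds=Bounds(np.zeros(n), np.ones(n)))
print(result.status, objective(result.x))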

18
Nonconvex Relaxation
  • Long beam, load on bottom, constrained von Mises
    stress

19
Nonconvex Relaxation
  • Short beam, load on top, constrained total stress

20
Nonconvex Relaxation
  • Short beam, load on top, constrained total stress

21
Nonconvex Relaxation
  • Observations
  • Starting with a small epsilon is basically equivalent
    to level set based local optimization
    (phase-field method, ask Charlie Elliott)
  • Additional freedom of continuation in epsilon:
    a sufficiently small decrease usually yields global
    (topological) optima (toy sketch below)
  • As epsilon gets small, adaptive refinement is
    necessary to resolve the diffuse interface.
    Adaptation is a potential danger for global
    optimization, but again ok with some care.
  • A good solver for concave minimization is needed
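A toy sketch of the continuation idea (my own construction, not the solver from the talk): a smooth quadratic data term plus the double-well penalty, minimized by projected gradient on [0,1]^n while epsilon is decreased; the operator A, the sizes and all parameters below are made up.

import numpy as np

rng = np.random.default_rng(0)
n = 60
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))          # made-up forward operator
u_true = ((np.arange(n) >= 20) & (np.arange(n) < 40)).astype(float)
g = A @ u_true + 0.01 * rng.standard_normal(n)             # synthetic data

def grad(u, eps):
    # gradient of 0.5*||A u - g||^2 + (1/eps) * sum(u^2 (1 - u)^2)
    return A.T @ (A @ u - g) + (2.0 / eps) * (u * (1 - u) ** 2 - u ** 2 * (1 - u))

u = np.full(n, 0.5)                                        # "undecided" initial design
for eps in [1.0, 0.3, 0.1, 0.03]:                          # continuation: shrink eps
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 2.0 / eps)   # rough Lipschitz-based step size
    for _ in range(500):
        u = np.clip(u - step * grad(u, eps), 0.0, 1.0)     # projected gradient step

print(np.round(u[::5], 2))                                 # nearly 0-1 valued for small eps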

22
Convex Relaxation
  • The convex relaxation approach can be considered at
    least for the following structure, including a
    state variable v and the design variable u.
    Minimization with respect to u and v can be
    done alternately; we start with u. J is the total
    variation (perimeter for 0-1 functions)

23
Convex Relaxation
  • Minimization with respect to u is of the form of a
    linear functional of u plus total variation, under
    the 0-1 constraint (reconstruction below)
  • Obvious relaxation: replace the 0-1 constraint by
    0 ≤ u ≤ 1
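A hedged reconstruction of the two problems on this slide, consistent with the Chan-Esedoglu-Nikolova setting cited later (g collects the v-dependent terms, lambda the regularization weight; notation assumed):

    \min_{u \in BV(\Omega),\; u(x) \in \{0,1\}}\;
        \int_\Omega u\, g \, dx + \lambda\, |Du|(\Omega)
    \qquad \leadsto \qquad
    \min_{u \in BV(\Omega),\; 0 \le u(x) \le 1}\;
        \int_\Omega u\, g \, dx + \lambda\, |Du|(\Omega) .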

24
Convex Relaxation
  • Minimization with respect to u is of the form above
  • Is an indicator function a solution of the relaxed
    problem?
  • If yes, how can we compute such special
    solutions?
  • If yes, are the solutions stable with respect to g?
  • (Needed due to data noise in imaging and in order
    to understand alternating minimization)

25
Convex Relaxation
  • Layer-cake representation
  • Co-area formula
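The standard statements behind these two keywords are, for 0 ≤ u ≤ 1,

    u(x) = \int_0^1 \chi_{\{u > t\}}(x)\, dt
    \qquad \text{(layer-cake representation)} ,

and, for u of bounded variation,

    |Du|(\Omega) = \int_{-\infty}^{\infty} \mathrm{Per}\bigl(\{u > t\};\, \Omega\bigr)\, dt
                 = \int_0^1 \mathrm{Per}\bigl(\{u > t\};\, \Omega\bigr)\, dt
    \qquad \text{(co-area formula)} ,

where the last equality uses 0 ≤ u ≤ 1.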

26
Convex Relaxation
  • Hence the functional can be decomposed into its
    level sets (see below)
  • Compare a solution of the original problem with a
    solution of the relaxed problem
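With the layer-cake representation and the co-area formula, the relaxed functional F(u) = ∫ u g dx + λ |Du|(Ω) decomposes as

    F(u) = \int_0^1 \Bigl( \int_{\{u > t\}} g \, dx
           + \lambda\, \mathrm{Per}\bigl(\{u > t\};\, \Omega\bigr) \Bigr)\, dt
         = \int_0^1 F\bigl(\chi_{\{u > t\}}\bigr)\, dt ,

so the relaxed value is an average over values of 0-1 functions; comparing this with the value of a minimizer of the original 0-1 problem gives the statements on the next slide.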

27
Convex Relaxation
  • This implies (for almost every t )
  • Indicator function is also solution of relaxed
    problem
  • Almost every level set of relaxed solution is a
    solution of the original problem
  • Chan-Esedoglu 04, Chan-Esedoglu-Nikolova 04

28
Convex Relaxation
  • Solve the relaxed problem and take level sets to
    solve the original problem (toy sketch below)
  • Since the relaxed problem is convex, we can
    guarantee finding a global optimum
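A rough 1D numerical sketch of this recipe (my own construction, not code from the talk): minimize a discretized version of ∫ u g dx + λ TV(u) over 0 ≤ u ≤ 1 with a smoothed total variation and projected gradient descent, then threshold; all parameters are made up.

import numpy as np

n, h, lam, beta = 200, 1.0 / 200, 0.05, 0.01
x = np.linspace(0.0, 1.0, n)
g = np.where((x > 0.35) & (x < 0.65), -1.0, 1.0)       # object favoured where g < 0

def grad_energy(u):
    # gradient of h*sum(u*g) + lam*sum(sqrt((Du)^2 + beta^2)), D = forward differences
    d = np.diff(u)
    w = d / np.sqrt(d ** 2 + beta ** 2)
    dtw = np.zeros(n)
    dtw[:-1] -= w
    dtw[1:] += w                                        # D^T w
    return h * g + lam * dtw

u = np.full(n, 0.5)
for _ in range(20000):
    u = np.clip(u - 0.02 * grad_energy(u), 0.0, 1.0)    # projected gradient step on [0,1]

u01 = (u > 0.5).astype(float)                           # threshold: take a level set
print(u01[::10])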

29
Stability
  • In general, level sets are not stable with
    respect to (weak) BV convergence
  • Use u² = u (valid for 0-1 functions) to write the
    linear term as a quadratic one

30
Stability
  • Hence, a solution of the original problem also solves
    a quadratic problem with the 0-1 constraint
  • Stability for the quadratic problem can be used to
    obtain weak stability for the original and relaxed
    problems with respect to g
  • The standard proof implies, for minimizers, set-valued
    weak convergence to minimizers of the original
    problem

31
Quantitative Stability
  • Relaxation can also be used to obtain
    quantitative stability estimates
  • Let ĝ = 1/2 − g; using u² = u, the problem becomes
    (up to an additive constant) the equivalent
    minimization of 1/2 ∫ (u − ĝ)² dx + λ J(u) over
    0-1 functions u

32
Quantitative Stability
  • Optimality condition
  • Difference of optimality conditions for different
    data yields

33
Quantitative Stability
  • With upper and lower bounds on u one can estimate a
    generalized Bregman distance
  • Direct estimate for the subgradients. Detailed
    interpretation of the subgradients of the total
    variation can yield information about closeness
    of contours
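A formal sketch of the estimates meant on the last two slides, assuming the quadratic reformulation above and ignoring the multipliers for the box constraint (my reconstruction, not necessarily the slides' exact computation): minimizers u_i for data \hat g_i satisfy

    u_i - \hat g_i + \lambda\, p_i = 0 , \qquad p_i \in \partial J(u_i), \quad i = 1, 2 .

Subtracting the two optimality conditions and testing with u_1 - u_2 gives

    \|u_1 - u_2\|^2 + \lambda\, \langle p_1 - p_2,\, u_1 - u_2 \rangle
        = \langle \hat g_1 - \hat g_2,\, u_1 - u_2 \rangle ,

and since the symmetric Bregman distance D_J(u_1, u_2) = \langle p_1 - p_2, u_1 - u_2 \rangle is nonnegative,

    \|u_1 - u_2\| \le \|\hat g_1 - \hat g_2\| , \qquad
    \lambda\, D_J(u_1, u_2) \le \|\hat g_1 - \hat g_2\|^2 , \qquad
    \lambda\, \|p_1 - p_2\| \le 2\, \|\hat g_1 - \hat g_2\| .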

34
Applications
  • Several applications can be put in the general
    form by appropriate choice of v

35
Applications
  • Minimal compliance in structural optimization:
    v is the displacement, G the energy density
  • Optimal design of composite membranes /
    photonic crystals: minimal eigenvalue for the
    Helmholtz equation
  • Not directly applicable to eigenvalue
    maximization and band gaps

36
Applications
  • Chan-Vese segmentation model
  • Try to find a partition of the domain into two
    regions with typical mean gray values (Chan-Vese 99)

37
Applications
  • Chan-Vese segmentation model
  • After rescaling, the Chan-Vese algorithm with a
    smoothed delta can be interpreted as an (almost)
    projected gradient descent on the relaxed
    functional (see below); P damps updates out of the
    feasible set
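One way to make this concrete (my notation: tau the step size, P_{[0,1]} the pointwise projection onto [0,1], p^k a subgradient of J at u^k):

    u^{k+1} = P_{[0,1]}\Bigl( u^k - \tau\,\bigl( (f - c_1^k)^2 - (f - c_2^k)^2
              + \lambda\, p^k \bigr) \Bigr) ,
    \qquad
    c_1^k = \frac{\int_\Omega u^k f\, dx}{\int_\Omega u^k\, dx} , \quad
    c_2^k = \frac{\int_\Omega (1 - u^k) f\, dx}{\int_\Omega (1 - u^k)\, dx} ,

i.e. exact minimization over the constants alternated with a projected gradient step in u.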

38
Applications
  • Chan-Vese segmentation model

Data courtesy of Institute for Physiology, WWU
39
Applications
  • Similar for other priors in the regions
  • Region-based Mumford-Shah: piecewise smooth in the
    subregions
  • Histogram-based: maximally different histograms
    in the subregions
  • Esedoglu et al 07

40
Applications
  • Adaptive Priors
  • Example: parametrized anisotropic perimeter
  • Figure: MR-T1 image and its Chan-Vese segmentation

41
Adaptive Priors
  • Problem: the perimeter constraint leads to a cut-off
    of small elongated structures (sulci in the brain)

42
Adaptive Priors for Brain Imaging
  • Sulci are important for applications to EEG/ MEG
    inversion
  • Brain activity is modeled by dipoles at the sulci
    boundaries pointing in normal direction

Baillet et al, 2001
43
EEG/MEG Inversion
  • Dipole activity creates electric and magnetic
    fields on / outside the human skull, measured by
    EEG / MEG

Figure: simulation of the quasistatic Maxwell equations
44
EEG/MEG Inversion
  • Inversion tries to find the dipole locations from
    measured EEG / MEG data
  • Highly underdetermined, needs strong prior
    knowledge on possible dipole source locations and
    orientations
  • Obtained from segmentation and classification of MR
    images

45
Brain segmentation
  • Usual prior in terms of the signed distance function
    to the surface. The isotropic perimeter is a curve
    integral with a direction-independent weight and
    favours rounded structures (circle-like)
  • Can be understood from optimality conditions and
    subgradients (related to mean curvature), alternatively
    from isoperimetric problems, minimizing perimeter at
    fixed volume (Wulff shape)

46
Brain segmentation
  • In order to allow elongated structures we use an
    anisotropic perimeter, a curve integral with a
    direction-dependent weight of the normal (sketched
    below); this favours elongated structures (ellipses)
  • The elongation becomes more and more pronounced as a
    tends to 0 or 1
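The precise weight is not recoverable here; a plausible form consistent with the description (an assumption on my part, not taken from the slides) is

    P_a(D) = \int_{\partial D} \sqrt{ a\, n_1^2 + (1 - a)\, n_2^2 }\; ds ,
    \qquad 0 < a < 1 ,

where n = (n_1, n_2) is the unit normal; a = 1/2 reproduces a multiple of the isotropic perimeter, while a tending to 0 or 1 makes one normal direction cheap and thus favours structures elongated along the corresponding coordinate axis.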

47
Brain segmentation
  • In the above definition, main axes of the
    ellipses lie in coordinate directions. We still
    need to introduce a rotation (angle b )

48
Brain segmentation
  • Isoperimetric problems at fixed a and b. Can be
    seen from the corresponding anisotropic mean
    curvature flow (descent flow for the regularization
    functional)

49
Brain segmentation
  • Segmentation result
  • at fixed a and b

50
Brain segmentation
  • Local definition of a and in particular b is
    needed
  • Iterate segmentation with update in each pixel

51
Brain segmentation
  • Extension to three dimensions,
  • two angles needed

52
Brain segmentation
  • Adaptive prior in segmentation and
    classification of normal directions

53
Generalization
  • Generalization to a more general structure

54
Generalization
  • Simple idea: add a new variable w = u and penalize
    the constraint
  • (Moreau-Yosida regularization of the first part)
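A hedged reconstruction of the resulting splitting (the placement of w and the penalty parameter gamma are assumptions based on the surrounding slides):

    \min_{u \in \{0,1\},\, v,\, w}\;
        \int_\Omega G(v, w)\, dx
        \;+\; \frac{1}{2\gamma} \int_\Omega (u - w)^2\, dx
        \;+\; \lambda\, J(u) ,

i.e. w takes the place of u in the first (state-coupled) part and the quadratic term penalizes the constraint w = u.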

55
Generalization
  • Use again u² = u before relaxing (see below)
  • Relaxation exact, still a convex problem for u at
    fixed v and w
  • Convex for w at fixed v and u; the overall
    functional is nonconvex
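With the splitting sketched above, the identity u² = u makes the u-subproblem linear in u again for 0-1 valued u:

    \frac{1}{2\gamma}\,(u - w)^2
      = \frac{1}{2\gamma}\,\bigl( u^2 - 2 u w + w^2 \bigr)
      = \frac{1}{2\gamma}\,\bigl( u\,(1 - 2 w) + w^2 \bigr) ,

so at fixed v and w the minimization over u is again of the linear-plus-total-variation type whose convex relaxation is exact.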

56
Generalization
  • Test example: the simplest inverse obstacle problem

57
Generalization
  • Evolution of alternating minimization approach,
    no data noise

58
Generalization
  • Evolution of alternating minimization approach,
    3% data noise

59
Generalization
  • Test example: the simplest inverse obstacle problem

60
Mathematical Imaging @ WWU
  • Christoph Brune, Alex Sawatzky, Frank Wübbeling,
    Thomas Kösters, Martin Benning, Marzena Franek,
    Bärbel Schlake, Christina Stöcker, Mary Wolfram,
    Thomas Grosser, Jahn Müller

61
Based on further collaborations with
  • Michael Hintermüller (Graz)
  • Roman Stainko (Linz / DTU Lyngby)
  • Denis Neiter (Ecole Polytechnique, Internship at
    WWU)
  • Carsten Wolters (WWU, University Hospital)
  • Funding: Regularization with Singular Energies
    (DFG), SFB 656 (DFG), Cartoon-Reconstruction and
    Segmentation in Nanoscopy (BMBF), European Institute
    for Molecular Imaging (WWU / SIEMENS Medical
    Solutions)