Base Paper Discussion: A Hopfield Neural Network for Image Change Detection

Transcript and Presenter's Notes

1
Base Paper Discussion: A Hopfield Neural Network
for Image Change Detection
  • Phillip Anderson
  • ENEE434, Spring 2007
  • Professor Newcomb

2
Image Change Detection
  • Goal: identify the set of pixels that are
    significantly different between the current image
    and a previous reference image
  • Examples:
  • Video frames (surveillance footage)
  • Satellite images (environmental, political)
  • Industrial automation (quality control)

3
Image Change Detection: Historical Perspective
  • Temporal Difference Models: simple and sometimes
    effective, but sensitive to noise and lighting
  • Significance and Hypothesis Test Models: less
    sensitive to variations in lighting and image
    noise, but complicated and very computationally
    intensive
  • Vector and Shading Models: decreased
    noise/lighting sensitivity, but difficult to
    automate (empirical)
  • Clustering Models: increased noise rejection and
    accuracy, but complicated and require a longer
    series of images

4
Motivation for Neural Network Implementation
  • Problems with previous systems:
  • May require empirical adjustments
  • Limited by inherent assumptions
  • Some require a larger image sequence
  • Noise rejection is generally sub-optimal
  • Neural network implementation combines strong
    points of several approaches

5
Benefits of Neural Network Implementation
  • Based on a probabilistic approach
  • Decision about each pixel is based on:
  • data consistency from the difference image
  • contextual consistency from neighboring pixels
  • self-data about the individual pixel
  • Output: strength of change at each pixel

6
Hopfield Network
  • Analog Hopfield (more resistant to local minima)
  • One node for each pixel in the difference image
  • Node state holds changed/unchanged info
  • Network seeks to minimize the Energy Function

7
Network Parameters
  • Layer weights: contextual information
  • Stability requires symmetric weighting
  • Biases: self-data information
  • Input to node i given by (see the sketch below)
  • Activation function: hyperbolic tangent
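
The equation images from this slide are missing from the
transcript. As a hedged reconstruction, the standard
continuous (analog) Hopfield form for the node input and
activation, assuming symmetric weights w_{ij} and bias I_i,
is

    \frac{du_i}{dt} = \sum_j w_{ij} v_j + I_i, \qquad
    v_i = \tanh(\lambda u_i)

where v_i is the state (output) of node i and \lambda sets
the steepness of the hyperbolic tangent; the paper's exact
scaling and time constant may differ.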

8
Energy Function
  • General form (a reconstruction is sketched below)
  • Integral term has little contribution to
    stability, so it is dropped
  • layer (interconnection) weights
  • current state of each node
  • node biases
  • What does this equation represent?
  • Contributions from:
  • Data consistency
  • Contextual consistency
  • Self-data
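
The general-form energy equation on this slide was an image
and is missing from the transcript. The standard Hopfield
energy, with the integral term already dropped, is

    E = -\frac{1}{2} \sum_i \sum_j w_{ij} v_i v_j
        - \sum_i I_i v_i

where w_{ij} are the interconnection weights, v_i the
current node states, and I_i the node biases. In the paper
this energy collects the data-consistency,
contextual-consistency, and self-data contributions listed
above.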

9
Data Consistency
  • Probabilistic Model
  • Expectation Maximization (EM) method
  • Estimate the mean and variance for two sets
    (unchanged and changed pixels)
  • To initialize the EM method:
  • Threshold the difference image (diagram, above
    right)
  • Define two groups, based on pixel intensities
  • The EM method converges on the actual group
    definitions (diagram, below right); a sketch of
    the procedure follows
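
As a sketch of how this EM step could be implemented (the
paper's exact parameterization is not in the transcript;
the names em_two_gaussians, diff, img1, and img2 below are
illustrative), a two-component Gaussian mixture can be fit
to the difference image, initialized by a simple threshold:

    import numpy as np

    def em_two_gaussians(diff, n_iter=50):
        """Fit a two-component Gaussian mixture (unchanged vs.
        changed pixels) to a difference image, initialized by
        thresholding at the mean intensity."""
        x = diff.ravel().astype(float)

        # Initialization: threshold the difference image at its
        # mean (assumes both groups are non-empty).
        changed = x > x.mean()
        mu = np.array([x[~changed].mean(), x[changed].mean()])
        var = np.array([x[~changed].var(), x[changed].var()]) + 1e-6
        pi = np.array([(~changed).mean(), changed.mean()])

        for _ in range(n_iter):
            # E-step: posterior probability of each class per pixel.
            lik = pi / np.sqrt(2 * np.pi * var) * \
                  np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
            resp = lik / (lik.sum(axis=1, keepdims=True) + 1e-300)

            # M-step: re-estimate means, variances, mixing weights.
            nk = resp.sum(axis=0)
            mu = (resp * x[:, None]).sum(axis=0) / nk
            var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
            pi = nk / x.size

        return mu, var, pi

Usage, assuming img1 and img2 are grayscale arrays of the
same shape:

    diff = np.abs(img2.astype(float) - img1.astype(float))
    mu, var, pi = em_two_gaussians(diff)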

10
Contextual Consistency
  • Relationship between a pixel and its nearest
    neighbors
  • In terms of node states
  • Equation (see the sketch below)
  • Maximum when all 9 states are the same; minimum
    when the center state opposes the surrounding
    states
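
The contextual-consistency equation itself was an image on
the slide. One hedged reconstruction consistent with the
description (a 3x3 neighborhood of nine states, maximal
when all of them agree) is

    C_i^{ctx} = \sum_{j \in N(i)} v_i v_j

where N(i) is the set of the eight nearest neighbors of
pixel i. The sum is largest when the center state v_i has
the same sign as every neighbor (all nine states the same)
and smallest when it opposes them; the paper's exact
weighting may differ.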

11
Numerical Representation of Data/Contextual
Consistency
  • How do we apply these to the energy function?
  • Goal: both consistencies high → energy low (and
    vice versa)
  • Data consistency term
  • Contextual consistency term
  • Combined energy equation (first term in the final
    equation); a schematic form is sketched below
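
The combined expression itself is not in the transcript.
Schematically, the goal stated above is met if the first
energy term is proportional to the negative of the summed
consistencies,

    E_1 \propto -\sum_i \left( C_i^{data} + C_i^{ctx} \right)

so that high data and contextual consistency drive the
energy down (and vice versa). This is only a schematic form
using the symbols from the earlier sketches; the paper's
own weighting of the two terms may differ.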

12
Self-data and Final Energy Equation
  • Self-data: a node with a high probability of
    being changed should reflect that fact in its
    state value
  • Combine the three data sources to obtain the
    final energy equation
  • Compare with the general form (given previously)
  • Result: equations for the connection weights and
    input biases!

13
Putting It All Together: Functional Summary
  • The Hopfield network seeks to minimize the energy
    function
  • The energy function is expressed through the
    interconnection weights and biases
  • At the solution, the network outputs (node
    states) correspond to the differences between the
    images

14
Initialization Process
  • Calculate thresholds (as shown on a previous
    slide)
  • Load the initial state values
  • Compute the connection weights and biases with
    the EM method
  • Iterate (a sketch of the loop follows):
  • For each node, compute u(t) using the Runge-Kutta
    method
  • Update the node state accordingly
  • Continue looping until the states have stabilized
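
The transcript does not give the actual weights, biases, or
step size, so the loop below is only a sketch. It assumes
the standard analog Hopfield dynamics du/dt = W v + I with
v = tanh(u), a dense weight matrix W, a bias vector I, and
a classic fourth-order Runge-Kutta step; the names
hopfield_iterate, W, I, u0, and dt are illustrative:

    import numpy as np

    def hopfield_iterate(W, I, u0, dt=0.05, tol=1e-4, max_steps=10000):
        """Integrate du/dt = W @ tanh(u) + I with RK4 until the
        node states stop changing; return the final states."""
        u = u0.copy()

        def dudt(u):
            return W @ np.tanh(u) + I

        for _ in range(max_steps):
            # One fourth-order Runge-Kutta step for every node.
            k1 = dudt(u)
            k2 = dudt(u + 0.5 * dt * k1)
            k3 = dudt(u + 0.5 * dt * k2)
            k4 = dudt(u + dt * k3)
            u_next = u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

            # Stop once the node states (outputs) have stabilized.
            if np.max(np.abs(np.tanh(u_next) - np.tanh(u))) < tol:
                u = u_next
                break
            u = u_next

        return np.tanh(u)

In the paper's setup each node corresponds to one pixel of
the difference image, so u0 would hold the initial state
values loaded above, and W and I would come from the
EM-derived connection weights and biases.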

15
Paper Results - Outdoors
16
Paper Results - Indoors
17
Comparative Tests
  • Hopfield (HNN) compared with six other change
    detection algorithms
  • MTD: Modified Temporal Difference
  • LIU: Shading Model
  • MAP: Statistical Model
  • SKI: Vector/Shading Model
  • CAR: Cluster Statistical Model
  • BRU: Contextual, no Self-Data

18
(No Transcript)
19
(No Transcript)
20
Base Paper Conclusions
  • Iterative methods (HNN, BRU) offer improved
    performance
  • Initialization can be important, but is not
    decisive
  • HNN and BRU both still stabilize; HNN is quicker
  • Time complexity of iterative methods is extremely
    high
  • May not always be practical
  • Real-time applications require parallelism
  • HNN is the most robust against noise

21
Simulation
  • Major obstacle: the probabilistic model
  • The EM method is extremely involved
  • Implementation requires background knowledge of
    probability distributions
  • Good news: initialization isn't critical!