1
Information Theoretic Image Thresholding
  • Laura Frost
  • Supervisors: Dr Peter Tischer and Dr Sid Ray

2
Aims of Project
  • Investigate mixture modelling as an approximation to
    the image's histogram
  • Investigate relative / objective criteria for the
    assessment of thresholding results

3
Thresholding
  • Good for images with fairly distinct, homogeneous
    regions
  • Region of uncertainty: object boundaries
  • These are two important properties for mixture-modelling
    thresholding (see the sketch below)

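A minimal sketch (not from the talk) of what bi-level thresholding does: every pixel at or above a chosen grey-level t goes to one class and the rest to the other, which works best when the grey-levels of the two regions barely overlap.

```python
# Hypothetical helper (not the project's code): apply a bi-level threshold.
import numpy as np

def apply_threshold(image, t):
    """Return a binary mask: True where a pixel is at or above grey-level t."""
    return np.asarray(image) >= t
```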
4
Mixture Modelling
  • Approximate a complex distribution with component
    distributions
  • Can describe the components easily
  • Classify the data, in this case pixels (see the
    sketch below)

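A minimal sketch of this idea, assuming scikit-learn and an 8-bit greyscale image held in a numpy array (not the project's implementation): fit a mixture of Gaussian components to the grey-level data and use the fitted model to classify each pixel.

```python
# Fit a Gaussian mixture to the pixel intensities and label every pixel with
# the component it most likely belongs to. Names and defaults are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

def classify_pixels(image, n_components=2):
    img = np.asarray(image)
    x = img.reshape(-1, 1).astype(float)                  # one sample per pixel
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(x)
    labels = gmm.predict(x).reshape(img.shape)            # component index per pixel
    return gmm, labels
```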
5
An example
6
Need to ask
  • Of thresholding: how good is a threshold?
  • Of mixture modelling: when is a mixture model a good
    model?

7
And so keep in mind
  • A good mixture model implies a good threshold
  • A good threshold does not necessarily imply a good
    mixture model

8
Methodology 1
  • Test / extend / possibly improve iterative
    mixture modelling
  • Examine the Kullback-Leibler measure as a possible
    relative / objective criterion

9
Methodology 2
  • Test mixture modelling of image histograms using
    Snob
  • Examine Minimum Message Length as a possible
    objective criterion

10
Iterative Mixture Modelling
  • Fit a mixture model at each grey-level
  • Select the grey-level that produces the best model
    (see the sketch below)
  • Good for bi-level thresholding

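One plausible reading of the iterative method, as a hedged sketch: assume two Gaussian components fitted by moments on either side of each candidate grey-level, and keep the grey-level whose mixture matches the histogram best under the Kullback-Leibler measure. The project's actual fitting procedure (and its Poisson and Rayleigh variants) may differ.

```python
import numpy as np

def iterative_gaussian_threshold(hist):
    """hist: grey-level counts, e.g. np.bincount(image.ravel(), minlength=256)."""
    levels = np.arange(hist.size, dtype=float)
    p = hist / hist.sum()                         # empirical histogram distribution
    best_t, best_kl = None, np.inf
    for t in range(1, hist.size - 1):             # candidate thresholds
        w0, w1 = p[:t].sum(), p[t:].sum()         # component weights
        if w0 == 0 or w1 == 0:
            continue
        m0 = (levels[:t] * p[:t]).sum() / w0      # component means
        m1 = (levels[t:] * p[t:]).sum() / w1
        v0 = ((levels[:t] - m0) ** 2 * p[:t]).sum() / w0 + 1e-6   # variances
        v1 = ((levels[t:] - m1) ** 2 * p[t:]).sum() / w1 + 1e-6
        q = (w0 * np.exp(-(levels - m0) ** 2 / (2 * v0)) / np.sqrt(2 * np.pi * v0)
             + w1 * np.exp(-(levels - m1) ** 2 / (2 * v1)) / np.sqrt(2 * np.pi * v1))
        q /= q.sum()                              # mixture model distribution
        mask = p > 0
        kl = np.sum(p[mask] * np.log2(p[mask] / np.maximum(q[mask], 1e-12)))
        if kl < best_kl:                          # keep grey-level with best model
            best_kl, best_t = kl, t
    return best_t, best_kl
```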
11
Implementation
  • Based on work completed by David Bertolo (Honours
    2001, Monash)
  • Distributions (see the sketch below)
  • Poisson (1 parameter)
  • Gaussian, Rayleigh (2 parameters)

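A hedged sketch of the three component families named above, evaluated over the grey-levels with scipy. The parameter values are placeholders, and treating the two-parameter Rayleigh as a location-shifted Rayleigh is an assumption; the project's exact parameterisation is not given in the slides.

```python
# Evaluate one example of each component family on the 0..255 grey-levels.
import numpy as np
from scipy.stats import norm, poisson, rayleigh

levels = np.arange(256)
p_poisson  = poisson.pmf(levels, mu=120)               # 1 parameter: mean
p_gaussian = norm.pdf(levels, loc=120, scale=15)       # 2 parameters: mean, std dev
p_rayleigh = rayleigh.pdf(levels, loc=60, scale=30)    # 2 parameters: shift, scale
```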
12
Implementation
  • Improve threshold selection: use the intersection of
    components (see the sketch below)
  • Account for the overlap between components

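A sketch of one way to place the threshold at the intersection of two fitted components, assuming Gaussian components (hypothetical helper, not the project's code): the grey-level where the weighted densities are equal is a root of a quadratic obtained by taking logs of the equality.

```python
# Grey-level(s) where two weighted Gaussians w0*N(x; m0, v0) and
# w1*N(x; m1, v1) have equal density (w = weight, m = mean, v = variance).
import numpy as np

def gaussian_intersection(w0, m0, v0, w1, m1, v1):
    a = 0.5 / v1 - 0.5 / v0
    b = m0 / v0 - m1 / v1
    c = (0.5 * m1 ** 2 / v1 - 0.5 * m0 ** 2 / v0
         + np.log((w0 * np.sqrt(v1)) / (w1 * np.sqrt(v0))))
    if np.isclose(a, 0.0):            # equal variances: a single crossing point
        return np.array([-c / b])
    return np.roots([a, b, c])        # up to two crossings (complex if none exist)
```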
13
Testing
  • Synthetic and natural images
  • Synthetic images created with specific
    distributions
  • Test accuracy of model fitting
  • Give lower bound for Kullback-Leibler measure
    assessment

14
Synthetic Images
15
(No Transcript)
16
Results: Iterative Mixture Modelling
  • Component parameters correct for synthetic images
  • Component parameters for natural images affected
    (esp. by outliers at boundaries)
  • Subjective assessment of segmented image

17
Results: Iterative Mixture Modelling
  • Examined five information measures (see the sketch
    below)
  • Entropy of image, H(p)
  • Entropy of mixture model, H(q)
  • Kullback-Leibler (KL) measure, I(p||q)
  • KL relative to entropy of image, I(p||q) / H(p)
  • KL relative to entropy of model, I(p||q) / H(q)

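A small sketch (assumed helper, not the original code) computing the five measures from the normalised image histogram p and the fitted mixture model's distribution q over the same grey-levels.

```python
import numpy as np

def information_measures(p, q, eps=1e-12):
    p = p / p.sum()
    q = q / q.sum()
    h_p = -np.sum(p * np.log2(p + eps))                 # entropy of image, H(p)
    h_q = -np.sum(q * np.log2(q + eps))                 # entropy of model, H(q)
    kl = np.sum(p * np.log2((p + eps) / (q + eps)))     # KL measure, I(p||q)
    return {"H(p)": h_p, "H(q)": h_q, "I(p||q)": kl,
            "I(p||q)/H(p)": kl / h_p, "I(p||q)/H(q)": kl / h_q}
```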
18
Results: 2, 3, 4 components
  • Good fit for synthetic images
  • Gaussian (?)
  • Poisson (?)
  • Rayleigh opposite (?)
  • Rayleigh right (?)

19
Results: 2, 3, 4 components
  • Dealt with outliers at boundaries
  • Gaussian (?)
  • Poisson (?)
  • Rayleigh opposite (?)
  • Rayleigh right (?)

20
Results: 2, 3, 4 components
  • Overall, segmented images good quality
  • Gaussian (?)
  • Poisson (?)
  • Rayleigh opposite (?)
  • Rayleigh right (?)
  • Except for images with outliers

21
Results: 2, 3, 4 components
  • Time unreasonable for 4 components (complexity
    increases exponentially)
  • Poisson takes 4 times longer than other
    distributions

22
Matches example
23
3 Gaussian mixture model
24
3 Poisson mixture model
25
3 Rayleigh mixture model
26
Results: 2, 3, 4 components
  • Gaussians
  • H(p) < H(q) for all images
  • Poissons
  • H(p) > H(q) for all successfully thresholded
    images
  • Rayleighs
  • H(p) ≈ H(q)

27
Results: 2, 3, 4 components
  • I(p||q) decreased as the no. of components increased,
    as is to be expected
  • I(p||q) / H(p)
  • I(p||q) / H(q)

28
A relative criterion
  • Is there value in comparing models of different
    complexities?
  • From these results, probably not
  • But comparing models of similar complexities
    looks ok

29
Mixture modelling with Snob
  • Problem: overfitting the data on natural and
    synthetic images
  • E.g., a 512 x 512 image has 262,144 pixels to
    classify
  • Cheaper for Snob to make more classes

30
Sampling
  • Randomly sampling the data at different rates (see
    the sketch below)
  • Snob finding very good classes!
  • Sampling image alumina.gif at 100 pixels (from a
    total of 65,536)

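A minimal sketch (assumed helper, not the project's code) of the sampling step: draw a small random subset of pixel intensities, e.g. 100 of the 65,536 pixels in a 256 x 256 image, to give to the classifier.

```python
import numpy as np

def sample_pixels(image, n_samples=100, seed=0):
    """Return n_samples grey-levels drawn uniformly at random, without replacement."""
    rng = np.random.default_rng(seed)
    flat = np.asarray(image).ravel()
    idx = rng.choice(flat.size, size=n_samples, replace=False)
    return flat[idx]
```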
31
Alumina example
  • Over many runs, found two classes
  • 0.20 × N(91.80, 23.20)
  • 0.80 × N(206.80, 11.40)
  • Compare to the Iterative method
  • 0.19 × N(94.30, 26.56)
  • 0.81 × N(206.21, 11.80)

32
Alumina example
  • Message length: 4.94 bpp
  • Not all images are so successful at just 100 pixels
  • All seem to be OK at about 500 pixels

33
Snob and Thresholding
  • Sampling at such small rates, Snob handles the
    missing data very well!
  • Since we need to sample at such small rates, we did
    not compare Poissons as hoped
  • More work needs to be done, but it looks promising

34
An Objective Criterion
  • Takes the complexity of the mixture model into
    account when calculating the message length (see the
    sketch below)
  • Message Length is a very good candidate for use as
    an objective criterion for thresholding

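A rough illustration of the two-part message idea only, not Snob's actual MML calculation: the total cost is the bits needed to state the model plus the bits needed to encode the data given that model, so extra components pay off only if they shorten the second part by more than they add to the first. The 0.5 * log2(n) bits per free parameter used here is a crude BIC-style stand-in for the real parameter cost.

```python
import numpy as np

def two_part_length_bits(data, weights, means, stds):
    """Crude two-part message length for a 1-D Gaussian mixture (illustration only)."""
    data = np.asarray(data, dtype=float)
    n = data.size
    n_params = 3 * len(weights) - 1                  # weights (sum to 1), means, stds
    model_bits = 0.5 * n_params * np.log2(n)         # part 1: state the model
    dens = sum(w * np.exp(-(data - m) ** 2 / (2 * s ** 2)) / (s * np.sqrt(2 * np.pi))
               for w, m, s in zip(weights, means, stds))
    data_bits = -np.sum(np.log2(np.maximum(dens, 1e-300)))   # part 2: data given model
    return model_bits + data_bits
```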
35
Aims of Project
  • Investigate mixture modelling as an approximation to
    the image's histogram
  • Investigate relative / objective criteria for the
    assessment of thresholding results

36
Conclusion
  • Iterative Method
  • Consistent results
  • Optimal no. of components found by trial and error
  • Complexity
  • Snob
  • Problem with overfitting when the no. of data
    points is large
  • Sampling at very small rates working well

37
Evaluation Criteria
  • Kullback-Leibler measure
  • Relative to Entropy
  • Relative to model complexity
  • Minimum Message Length
  • Promising as objective criterion

38
Future Work
  • Better way to initially assign pixels to classes
  • Modify Snob to do this
  • More testing with Snob
  • Addition of more distributions to Iterative method