1
Multi-modal Retinal Imaging
  • Improvements for performing accurate and
    efficient image registration

Phil Legg Cardiff School of Computer Science
February 2009
2
Contents
  • Introduction to Mutual Information
  • Probability Estimation
  • Gauge Co-ordinates
  • Further Improvements
  • Windowed Mutual Information
  • Local Mutual Information
  • Elastic Registration

3
Introduction
4
Mutual Information
  • Similarity measure between images based on
    entropy: I(A,B) = H(A) + H(B) - H(A,B)
  • Aim is to maximise the individual entropy values
    whilst minimising the joint entropy.
  • By searching the transformation space, alignment
    should occur where max(I(A,B)).
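As an illustrative sketch (Python/NumPy, not the code used in the presented work), the measure above can be estimated from a joint intensity histogram:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; empty bins contribute zero."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(a, b, bins=32):
    """I(A,B) = H(A) + H(B) - H(A,B), estimated from a joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()      # joint probability estimate
    p_a = p_ab.sum(axis=1)          # marginal distribution of A
    p_b = p_ab.sum(axis=0)          # marginal distribution of B
    return entropy(p_a) + entropy(p_b) - entropy(p_ab)

# An image is maximally informative about itself (I(A,A) = H(A),
# exactly 5 bits here with 32 uniformly filled bins), and carries
# little information about an unrelated image:
img = (np.arange(64 * 64, dtype=float) % 256).reshape(64, 64)
noise = np.random.default_rng(7).integers(0, 256, (64, 64)).astype(float)
mi_self = mutual_information(img, img)
mi_other = mutual_information(img, noise)
```

The plug-in estimate is never negative, but note that it carries a positive bias for small samples, which the later bin-size slides address.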

5
Not always the case...
Example cases: 001, 003 (simanneal); 011, 121 (fminsearch)
6
Transformation Search
  • Exhaustive search of transformation parameters is
    very expensive.
  • Optimisation methods may be caught by local
    maxima.
  • We need a search that successfully converges to
    the maximum.
  • Assuming that the similarity measure peaks at the
    correct solution.
  • We use Simplex and Simulated Annealing.
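A minimal sketch of the simplex search, using SciPy's Nelder-Mead (the algorithm behind MATLAB's fminsearch). The similarity surface here is a hypothetical stand-in with one peak at translation (3, -2); a real run would evaluate Mutual Information of the transformed image pair instead:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical similarity surface with a single maximum at (tx, ty) = (3, -2).
def similarity(t):
    tx, ty = t
    return -((tx - 3.0) ** 2 + (ty + 2.0) ** 2)

# Nelder-Mead minimises, so we minimise the negated similarity
# to search for max(I(A,B)) over the translation parameters.
result = minimize(lambda t: -similarity(t), x0=np.zeros(2),
                  method="Nelder-Mead")
tx_best, ty_best = result.x
```

For the simulated-annealing side, SciPy's `dual_annealing` plays the analogous role; on a surface with many local maxima it is less likely to stop at the wrong peak than the simplex alone.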

7
Probability Estimation
  • Entropy is computed from the probability
    distribution of a data set.
  • How we compute the probability distribution can
    alter the entropy value and registration
    accuracy.
  • Simplest approach to probability estimation is
    using a histogram.

8
Histogram Bin Size
256 histogram bins
32 histogram bins
Incorrect Registration (Found using 256 bins)
Correct Registration (Found using 32 bins)
9
Histogram Bin Size
  • Statistics literature gives many suggestions to
    selecting optimal bin size
  • Sturges: k = 1 + log2(n)
  • Scott: h = 3.5 x SD x n^(-1/3)
  • Freedman-Diaconis: h = 2 x IQR x n^(-1/3)
  • We incorporate bin selection with Mutual
    Information to improve the density estimate for
    our data
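The three rules can be sketched as follows (illustrative Python; note that Sturges gives a bin count directly, while Scott and Freedman-Diaconis give a bin width that must be converted over the data range):

```python
import numpy as np

def sturges_bins(x):
    """Sturges: number of bins k = 1 + log2(n)."""
    return int(np.ceil(1 + np.log2(x.size)))

def scott_width(x):
    """Scott: bin width h = 3.5 * SD * n^(-1/3)."""
    return 3.5 * np.std(x) * x.size ** (-1 / 3)

def freedman_diaconis_width(x):
    """Freedman-Diaconis: bin width h = 2 * IQR * n^(-1/3)."""
    q75, q25 = np.percentile(x, [75, 25])
    return 2.0 * (q75 - q25) * x.size ** (-1 / 3)

def width_to_bins(x, h):
    """Convert a bin width into a bin count over the data range."""
    return int(np.ceil((x.max() - x.min()) / h))

pixels = np.arange(256 * 256) % 256      # a flat 8-bit intensity sample
k_sturges = sturges_bins(pixels)
k_scott = width_to_bins(pixels, scott_width(pixels))
k_fd = width_to_bins(pixels, freedman_diaconis_width(pixels))
```

NumPy's `np.histogram_bin_edges` also implements these as the `'sturges'`, `'scott'` and `'fd'` estimators.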

10
Histogram Bin Size
Using 256 histogram bins:

                          H(A)     H(B)     H(A,B)    I(A,B)
  Incorrect registration  7.6973   7.0895   13.9393   0.8475
  Correct registration    7.6346   6.8210   13.6974   0.7582

Using the Freedman-Diaconis bin size:

                          H(A)     H(B)     H(A,B)    I(A,B)    Bins
  Incorrect registration  5.8884   5.9496   11.3623   0.4758    73x117
  Correct registration    5.8240   6.2937   11.5901   0.5276    73x189

With 256 bins, Mutual Information is higher at the incorrect
registration; with the Freedman-Diaconis bin size, the correct
registration scores higher.
11
Histogram Bin Size
  • Reduction of bin size can help improve statistics
    for entropy calculation
  • Mutual Information still fails to give good
    success rate
  • Weak correspondence between intensities

12
Gauge Co-ordinates
  • Mutual Information uses little of the spatial
    information that may improve registration
  • Structure of images should be similar across
    modalities
  • Incorporate structure and neighbourhood
    information into Mutual Information

13
Gauge Co-ordinates
(Figure: covariance matrices CA and CB, each of size d x d, computed
from m x n image data f.)
14
Gauge Co-ordinates
15
Further Improvements
  • Probability estimate methods
  • Improve runtime, but accuracy remains quite poor.
  • Incorporating gauge co-ordinates
  • High accuracy rate, but very slow to compute
    (approximately 10-12 minutes).
  • Can we improve both accuracy and runtime?
  • Windowed Mutual Information?

16
Windowed Mutual Information
  • Break down the image into smaller windows.
  • Take a combined score based on each individual
    window.
  • Can weight individual windows based on number of
    pixels or position in image.
  • Aims to find where all windows give strong
    correspondence (rather than image as a whole).
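One possible combination scheme, weighting each window by its pixel count as described above, can be sketched as (illustrative Python, not the presented implementation):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Plug-in MI estimate from a joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()
    def h(q):
        q = q[q > 0]
        return -np.sum(q * np.log2(q))
    return h(p.sum(axis=1)) + h(p.sum(axis=0)) - h(p)

def windowed_mi(a, b, grid=(3, 3), bins=32):
    """Pixel-count-weighted mean of per-window MI over a grid of windows."""
    row_idx = np.array_split(np.arange(a.shape[0]), grid[0])
    col_idx = np.array_split(np.arange(a.shape[1]), grid[1])
    score, weight = 0.0, 0.0
    for r in row_idx:
        for c in col_idx:
            wa, wb = a[np.ix_(r, c)], b[np.ix_(r, c)]
            score += wa.size * mutual_information(wa, wb, bins)
            weight += wa.size
    return score / weight

rng = np.random.default_rng(3)
a = rng.integers(0, 256, (96, 96)).astype(float)
b = rng.integers(0, 256, (96, 96)).astype(float)
score_aligned = windowed_mi(a, a)   # every window matches perfectly
score_random = windowed_mi(a, b)    # no window correspondence
```

Position-based weights (e.g. emphasising the central Optic Nerve Head window) would replace `wa.size` in the weighted sum.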

17
Windowed Mutual Information
0.4149 0.4255 0.3055
0.1528 1.0048 0.5185
0.3418 0.4858 0.5524
Standard Mutual Information using 32 histogram
bins
18
Windowed Mutual Information
  • Small windows may give weak statistics for
    Mutual Information
  • Histogram bin size selection accounts for the
    smaller number of samples.
  • Large windows may defeat purpose of splitting the
    image
  • 3x3 grid seems to suit our image data well
  • Optic Nerve Head in centre position
  • Blood vessels in outer windows

19
Windowed Mutual Information
20
Elastic Registration
  • Some local misalignments occur for our image data
  • Deformation correction required between HRT2
    image and fundus photograph
  • Find an initial registration at a coarse level,
    then use elastic deformation to try to improve
    the registration
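Assuming a dense per-pixel displacement field has already been estimated (e.g. interpolated from marker points), the resampling step of an elastic deformation might look like this sketch using scipy.ndimage:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def elastic_warp(image, dy, dx):
    """Resample `image` under a dense displacement field (dy, dx)."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    # Bilinear interpolation; edge pixels are clamped to the border.
    return map_coordinates(image, [yy + dy, xx + dx],
                           order=1, mode="nearest")

img = np.arange(100.0).reshape(10, 10)
zero = np.zeros_like(img)
unchanged = elastic_warp(img, zero, zero)       # zero field: identity warp
shifted = elastic_warp(img, zero, zero + 1.0)   # sample one pixel to the right
```

A smooth field (constant here for the demonstration) is what the marker-based interpolation of the following slides would supply.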

21
Local Mutual Information
  • Find a global registration, then register small
    windows on a local basis (with limited
    translation range).
  • Use new position as marker for performing
    deformation.
  • May require thresholding to decide whether marker
    point should be used or not.
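A sketch of the limited-range local search (illustrative Python): the returned (dy, dx) becomes the marker displacement for the deformation, and a threshold on the score could decide whether the marker is kept.

```python
import numpy as np

def mutual_information(a, b, bins=16):
    """Plug-in MI estimate from a joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()
    def h(q):
        q = q[q > 0]
        return -np.sum(q * np.log2(q))
    return h(p.sum(axis=1)) + h(p.sum(axis=0)) - h(p)

def best_local_shift(window, moving, top, left, max_shift=5, bins=16):
    """Search a limited translation range for the locally best MI match."""
    h, w = window.shape
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > moving.shape[0] or x + w > moving.shape[1]:
                continue          # shifted window would leave the image
            score = mutual_information(window, moving[y:y + h, x:x + w], bins)
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_score, best_shift

rng = np.random.default_rng(5)
fixed = rng.integers(0, 256, (64, 64)).astype(float)
moving = np.roll(fixed, shift=(2, 3), axis=(0, 1))  # known local displacement
score, (dy, dx) = best_local_shift(fixed[20:36, 20:36], moving,
                                   top=20, left=20)
```

The search recovers the known (2, 3) displacement here because MI peaks where the window content matches exactly.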

22
Local Mutual Information
23
Local Mutual Information
24
Local Mutual Information
  • Early results suggest about 2-3 minutes runtime
  • Windowed MI at coarse resolution
  • Local MI at full resolution
  • Elastic deformation
  • Much faster than our previous approach and will
    hopefully offer improved registration with
    elastic deformation

25
Thank you!