Three Brown Mice: See How They Run
1
Three Brown Mice: See How They Run
  • Kristin Branson, Vincent Rabaud, and Serge
    Belongie
  • Dept of Computer Science, UCSD
  • http://vision.ucsd.edu

2
Problem
  • We wish to track three agouti mice from video of
    a side view of their cage.

3
Motivation
Mouse Vivarium Room
  • A vivarium houses thousands of cages of mice.
  • Manual, close monitoring of each mouse is
    impossible.

5
Motivation
(Diagram: a tracking algorithm and a behavior analysis algorithm map cage video to activity labels such as eating, scratching, reproduction, and rolling.)
  • Automated behavior analysis will allow for:
  • Improved animal care.
  • More detailed and exact data collection.
  • An algorithm that tracks individual mice is a
    necessity for automated behavior analysis.

6
A Unique Tracking Problem
  • Tracking multiple mice is difficult because:
  • The mice are indistinguishable.
  • They are prone to occluding one another.
  • They have few (if any) trackable features.
  • Their motion is relatively erratic.

7
Simplifying Assumptions
  • We benefit from simplifying assumptions:
  • The number of objects does not change.
  • The illumination is relatively constant.
  • The camera is stationary.

8
Tracking Subproblems
  • We break the tracking problem into parts:
  • Track separated mice.
  • Detect occlusions.
  • Track occluded/occluding mice.

13
Tracking through Occlusion
  • Segmenting is more difficult when a frame is
    viewed out of context.
  • Using a depth ordering heuristic, we associate
    the mouse at the start of an occlusion with the
    mouse at the end of the occlusion.
  • We track mice sequentially through the occlusion,
    incorporating a hint of the future locations of
    the mice.

14
Outline
  • Background/Foreground classification.
  • Tracking separated mice.
  • Detecting occlusions.
  • Tracking through occlusion.
  • Experimental results.
  • Future work.

16
Background/Foreground
Image History -> Modified Temporal Median -> Estimated Background
|Current Frame - Estimated Background| -> Thresholded Absolute Difference -> Foreground/Background Classification
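The pipeline above can be sketched in a few lines. This is a minimal version assuming grayscale frames and a plain temporal median (the deck's "modified" variant is not specified); the threshold value is illustrative:

```python
import numpy as np

def estimate_background(history):
    """Per-pixel temporal median over a stack of frames, shape (T, H, W)."""
    return np.median(history, axis=0)

def classify_foreground(frame, background, thresh=30.0):
    """Foreground = pixels whose absolute difference from the
    estimated background exceeds the threshold."""
    return np.abs(frame.astype(float) - background) > thresh
```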
18
Tracking Separated Mice
  • We model the distribution of the pixel locations
    of each mouse as a bivariate Gaussian.
  • If the mice are separated, they can be modeled by
    a Mixture of Gaussians.
  • We fit the parameters using the EM algorithm.

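A minimal EM fit of the three-mouse mixture on foreground pixel coordinates can be sketched as follows; the x-quantile initialization, iteration count, and covariance regularizers are illustrative choices, not the authors' method:

```python
import numpy as np

def fit_mouse_gmm(points, k=3, n_iter=50):
    """Fit a k-component bivariate Gaussian mixture to foreground pixel
    coordinates (N, 2) with EM."""
    n, d = points.shape
    # Initialize means at x-order quantiles (illustrative; assumes separation in x).
    order = np.argsort(points[:, 0])
    mu = points[order[[(2 * j + 1) * n // (2 * k) for j in range(k)]]].astype(float)
    cov = np.tile(np.cov(points.T) / k + 1e-3 * np.eye(d), (k, 1, 1))
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each pixel.
        resp = np.empty((n, k))
        for j in range(k):
            diff = points - mu[j]
            inv = np.linalg.inv(cov[j])
            mahal = np.einsum('ni,ij,nj->n', diff, inv, diff)
            norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov[j]))
            resp[:, j] = pi[j] * np.exp(-0.5 * mahal) / norm
        resp = np.maximum(resp, 1e-300)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and covariances.
        nk = resp.sum(axis=0)
        pi = nk / n
        for j in range(k):
            mu[j] = resp[:, j] @ points / nk[j]
            diff = points - mu[j]
            cov[j] = (resp[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)
    return pi, mu, cov
```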
20
Detecting Occlusions
  • Occlusion events are detected using the GMM
    parameters.
  • We threshold how close together the mouse
    distributions are, using the Fisher distance in
    the x-direction as the distance measure.
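The slide's distance formula was an image and is lost; the usual one-axis Fisher criterion, used here as an assumed reconstruction with an illustrative threshold:

```python
import numpy as np

def fisher_distance_x(mu_a, cov_a, mu_b, cov_b):
    """Fisher criterion between two bivariate Gaussians along x:
    squared mean separation over the sum of the x-variances
    (assumed form; the slide's exact formula did not survive)."""
    return (mu_a[0] - mu_b[0]) ** 2 / (cov_a[0, 0] + cov_b[0, 0])

def occlusion_starts(mu_a, cov_a, mu_b, cov_b, thresh=4.0):
    """Flag an occlusion when the two distributions get too close."""
    return fisher_distance_x(mu_a, cov_a, mu_b, cov_b) < thresh
```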

22
Tracking through Occlusion
  • The pixel memberships during occlusion events
    must be reassigned.

23
Best Affine Transformation
24
Affine Flow Assumptions
  • Affine flow estimation assumes:
  • Brightness constancy: the image brightness of an
    object does not change from frame to frame.
  • The per-frame motion of each mouse can be
    described by an affine transformation.

Frame t -> Frame t+1
25
Standard Affine Flow
  • In general, these assumptions do not hold.
  • We therefore minimize in the least-squares
    sense.
  • The best a, given only the affine flow cue,
    minimizes the summed brightness-constancy error
    over the blob's pixels.
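The equations on this slide were images and are lost. The criterion referred to is presumably the standard direct-method objective; a reconstruction, assuming the usual 6-parameter affine flow field $u(x;a)$:

```latex
% Linearized brightness-constancy residual over the blob's pixels:
E(a) = \sum_{x \in \text{blob}} \left[ \nabla I(x)^{\top} u(x; a) + I_t(x) \right]^2,
\qquad
u(x; a) = X(x)\, a,
\quad
X(x) = \begin{bmatrix} x & y & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & x & y & 1 \end{bmatrix}.
% Least-squares solution:
a^{*} = \Big( \sum_x X^{\top} \nabla I \, \nabla I^{\top} X \Big)^{-1}
        \Big( -\sum_x X^{\top} \nabla I \, I_t \Big).
```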

27
Affine Flow Plus Prior
  • The affine flow cue alone does not give an
    accurate motion estimate.
  • Suppose we have a guess of the affine
    transformation, â, to bias our estimate.
  • The best a minimizes the flow error while
    remaining near â.

28
Affine Flow Plus Prior
  • Our criterion Ha adds to the flow error a
    regularization term that penalizes distance
    from â.
  • Taking the partial derivative of Ha w.r.t. a,
    setting it to 0, and solving for a gives a
    closed-form estimate.
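The criterion and its closed-form solution were images; a reconstruction consistent with the surrounding text, assuming a quadratic penalty with weight $\lambda$ and the affine flow notation $X(x)$ from the standard direct-method objective:

```latex
H(a) = \underbrace{\sum_{x} \left[ \nabla I(x)^{\top} X(x)\, a + I_t(x) \right]^2}_{\text{affine flow error}}
     + \underbrace{\lambda \left\lVert a - \hat{a} \right\rVert^2}_{\text{regularization term}}
% Setting \partial H / \partial a = 0 and solving:
a^{*} = \Big( \sum_x X^{\top} \nabla I \, \nabla I^{\top} X + \lambda I \Big)^{-1}
        \Big( -\sum_x X^{\top} \nabla I \, I_t + \lambda \hat{a} \Big).
```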
29
What is â?
  • We use the depth order cue to estimate â.
  • We assume the front blob at the start and the
    front blob at the end of the occlusion are the
    same mouse.

Start (frame 1)
End (frame n)
30
What is â?
  • The succession of frame to frame motions
    transforms the initial front mouse into the final
    front mouse.
  • We set the per-frame prior estimate â to reflect
    this.

31
Affine Interpolation
  • We estimate the frame-to-frame motion, â, by
    linearly interpolating the total motion.

32
Affine Interpolation
  • Given the initial and final mouse distributions,
    we compute the total transformation, which
    decomposes into an orthogonal matrix (a
    rotation), a skew, and a translation.
33
Affine Interpolation
  • From the total transformation, we estimate the
    per-frame transformation
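One way to realize this estimate, as a sketch: take the total transform to be the whitening map between the start and end Gaussians (an assumption; it is determined only up to an orthogonal factor, which is why the decomposition above matters), then linearly interpolate it into per-frame steps:

```python
import numpy as np

def _sqrtm_psd(C):
    """Symmetric PSD matrix square root via eigendecomposition."""
    w, V = np.linalg.eigh(C)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

def total_affine(mu_start, cov_start, mu_end, cov_end):
    """One affine map (A, t) taking the initial Gaussian onto the final one.
    The whitening-based choice of A is an illustrative assumption; any A with
    A cov_start A^T = cov_end differs from it by an orthogonal factor."""
    A = _sqrtm_psd(cov_end) @ np.linalg.inv(_sqrtm_psd(cov_start))
    t = mu_end - A @ mu_start
    return A, t

def per_frame_affine(A, t, n_frames):
    """Linearly interpolate the total motion over n_frames - 1 equal steps."""
    steps = n_frames - 1
    A_hat = np.eye(2) + (A - np.eye(2)) / steps
    t_hat = t / steps
    return A_hat, t_hat
```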

34
Depth Order Heuristic
  • Estimating which mouse is in front relies on a
    simple heuristic: the front mouse has the lowest
    position in the image, i.e. the largest
    y-coordinate.
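As a one-line sketch of the heuristic (in image coordinates y grows downward, so "lowest in the image" means largest y):

```python
import numpy as np

def front_mouse_index(means):
    """Depth-order heuristic: the front mouse is the blob lowest in the
    image, i.e. the one with the largest y-coordinate."""
    return int(np.argmax(np.asarray(means)[:, 1]))
```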

36
Pixel Membership Estimation
  • We assign membership based on a weighted sum of
    proximity and motion similarity.
  • Proximity criterion: how close a pixel lies to
    the mouse's predicted distribution.
  • Motion similarity criterion: how well the local
    optical flow at the pixel matches the mouse's
    estimated affine motion.
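The two criteria's formulas were images and are lost; a sketch of the assumed combination, with an illustrative weight w and each mouse summarized by its mean mu and per-frame affine motion (A_hat, t_hat):

```python
import numpy as np

def membership_score(pixel, flow, mu, A_hat, t_hat, w=0.5):
    """Weighted combination (assumed form) of (a) proximity of the pixel to a
    mouse's predicted center and (b) agreement between the local optical flow
    and the motion implied by the mouse's affine transform."""
    pred_center = A_hat @ mu + t_hat
    proximity = -np.linalg.norm(pixel - pred_center)
    implied_flow = (A_hat @ pixel + t_hat) - pixel
    motion = -np.linalg.norm(flow - implied_flow)
    return w * proximity + (1.0 - w) * motion

def assign_pixel(pixel, flow, mice, w=0.5):
    """mice: list of (mu, A_hat, t_hat) per mouse; returns the best index."""
    scores = [membership_score(pixel, flow, mu, A, t, w) for mu, A, t in mice]
    return int(np.argmax(scores))
```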

38
Experimental Results
  • We report initial success of our algorithm in
    tracking three agouti mice in a cage.

39
Experimental Results
  • Viewed in another way

(Space-time plot of the tracks: x vs. t.)
41
Conclusions
(Block diagram: Video -> Simple Tracker -> Mouse Positions; Simple Tracker -> Detect Occlusions -> Occlusion Starts/Ends -> Occlusion Reasoning -> Mouse Positions.)
  • We presented three modules to track identical,
    non-rigid, featureless objects through severe
    occlusion.
  • The novel module is the occlusion reasoning
    module.

42
Conclusions
(Frames 1, 2, 3, 4, ..., n)
  • While the occlusion tracker operates
    sequentially, it incorporates a hint of the
    future locations of the mice.
  • This is a step in the direction of an algorithm
    that reasons forward and backward in time.

43
Future Work
  • More robust depth estimation.
  • More robust separated mouse tracking (e.g.
    BraMBLe).
  • Different affine interpolation schemes.

44
References
[1] D. Comaniciu, V. Ramesh, and P. Meer. Kernel-based object tracking. Pattern Analysis and Machine Intelligence, 25(5), 2003.
[2] J. Gårding. Shape from surface markings. PhD thesis, Royal Institute of Technology, Stockholm, 1991.
[3] T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning. Springer Series in Statistics. Springer Verlag, Basel, 2001.
[4] M. Irani and P. Anandan. All about direct methods. In Vision Algorithms: Theory and Practice. Springer-Verlag, 1999.
[5] M. Isard and J. MacCormick. BraMBLe: A Bayesian multiple-blob tracker. In ICCV, 2001.
[6] B. Lucas and T. Kanade. An iterative image registration technique with an application to stereo vision. In DARPA Image Understanding Workshop, 1981.
[7] J. MacCormick and A. Blake. A probabilistic exclusion principle for tracking multiple objects. IJCV, 39(1):57-71, 2000.
45
References
[8] Measuring Behavior: Intl. Conference on Methods and Techniques in Behavioral Research, 1996-2002.
[9] S. Niyogi. Detecting kinetic occlusion. In ICCV, pages 1044-1049, 1995.
[10] J. Shi and C. Tomasi. Good features to track. In CVPR, Seattle, June 1994.
[11] H. Tao, H. Sawhney, and R. Kumar. A sampling algorithm for tracking multiple objects. In Workshop on Vision Algorithms, pages 53-68, 1999.
[12] C. Twining, C. Taylor, and P. Courtney. Robust tracking and posture description for laboratory rodents using active shape models. In Behavior Research Methods, Instruments and Computers, Measuring Behavior Special