Title: Three Brown Mice: See How They Run
1 Three Brown Mice: See How They Run
- Kristin Branson, Vincent Rabaud, and Serge Belongie
- Dept. of Computer Science, UCSD
- http://vision.ucsd.edu
2 Problem
- We wish to track three agouti mice from video of
a side view of their cage.
3 Motivation
Mouse Vivarium Room
- A vivarium houses thousands of cages of mice.
- Manual, close monitoring of each mouse is
impossible.
4 Motivation
[Diagram: a Behavior Analysis Algorithm producing activity labels such as eating, scratching, reproduction, and rolling.]
- Automated behavior analysis will allow for
- Improved animal care.
- More detailed and exact data collection.
5 Motivation
[Diagram: a Tracking Algorithm feeding a Behavior Analysis Algorithm that produces activity labels such as eating, scratching, reproduction, and rolling.]
- Automated behavior analysis will allow for
- Improved animal care.
- More detailed and exact data collection.
- An algorithm that tracks individual mice is a
necessity for automated behavior analysis.
6 A Unique Tracking Problem
- Tracking multiple mice is difficult because
- The mice are indistinguishable.
- They are prone to occluding one another.
- They have few (if any) trackable features.
- Their motion is relatively erratic.
7 Simplifying Assumptions
- We benefit from simplifying assumptions
- The number of objects does not change.
- The illumination is relatively constant.
- The camera is stationary.
8 Tracking Subproblems
- We break the tracking problem into parts
- Track separated mice.
- Detect occlusions.
- Track occluded/occluding mice.
12 Tracking through Occlusion
- Segmenting is more difficult when a frame is
viewed out of context.
13 Tracking through Occlusion
- Segmenting is more difficult when a frame is viewed out of context.
- Using a depth ordering heuristic, we associate the mouse at the start of an occlusion with the mouse at the end of the occlusion.
- We track mice sequentially through the occlusion, incorporating a hint of the future locations of the mice.
14 Outline
- Background/Foreground classification.
- Tracking separated mice.
- Detecting occlusions.
- Tracking through occlusion.
- Experimental results.
- Future work.
16 Background/Foreground
[Diagram: the current frame and an image history pass through a modified temporal median to produce an estimated background; the thresholded absolute difference between the current frame and the estimated background yields the foreground/background classification.]
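A minimal sketch of this pipeline, assuming a plain per-pixel temporal median over the frame history stands in for the modified temporal median of the slides; the threshold value is illustrative.

```python
import numpy as np

def estimate_background(history):
    """Background estimate: per-pixel temporal median of recent frames
    (a plain median standing in for the slides' modified temporal median)."""
    return np.median(np.stack(history, axis=0), axis=0)

def classify_foreground(frame, background, thresh=25.0):
    """Foreground mask: pixels whose absolute difference from the estimated
    background exceeds a threshold (the value 25 is illustrative)."""
    return np.abs(frame.astype(np.float64) - background) > thresh

# Usage sketch: history is a list of recent grayscale frames (2-D arrays).
# fg_mask = classify_foreground(current_frame, estimate_background(history))
```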
18 Tracking Separated Mice
- We model the distribution of the pixel locations of each mouse as a bivariate Gaussian, parameterized by its mean and covariance.
- If the mice are separated, they can be modeled by a Mixture of Gaussians.
- We fit the parameters using the EM algorithm (see the sketch below).
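A minimal sketch of the EM fit, using scikit-learn's GaussianMixture as a stand-in; the three components and the warm start from the previous frame's means are assumptions about the setup, not the slides' exact implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_mouse_gaussians(fg_mask, prev_means=None):
    """Fit one bivariate Gaussian per mouse to the foreground pixel
    coordinates using EM (three components for three mice)."""
    ys, xs = np.nonzero(fg_mask)                 # foreground pixel locations
    pts = np.column_stack([xs, ys]).astype(float)
    gmm = GaussianMixture(
        n_components=3,
        covariance_type="full",
        means_init=prev_means,                   # warm start from the previous frame, if available
    )
    gmm.fit(pts)
    return gmm.means_, gmm.covariances_, gmm.weights_
```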
20 Detecting Occlusions
- Occlusion events are detected using the GMM parameters.
- We threshold how close together the mouse distributions are.
- The Fisher distance in the x-direction is the distance measure (a sketch follows below).
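The slides do not reproduce the formula; one common form of the Fisher distance between two one-dimensional Gaussians is the squared mean difference over the sum of the variances, which the sketch below applies to the x-components. The threshold value is illustrative.

```python
import numpy as np

def fisher_distance_x(mean_i, cov_i, mean_j, cov_j):
    """Fisher distance between two mouse distributions along the x-axis:
    squared difference of the x-means over the sum of the x-variances
    (one common form; the slides do not give the exact expression)."""
    dx = mean_i[0] - mean_j[0]
    return dx ** 2 / (cov_i[0, 0] + cov_j[0, 0])

def occluding_pairs(means, covs, thresh=4.0):
    """Flag pairs of mice whose distributions are closer than a threshold
    (the threshold value is illustrative)."""
    pairs = []
    for i in range(len(means)):
        for j in range(i + 1, len(means)):
            if fisher_distance_x(means[i], covs[i], means[j], covs[j]) < thresh:
                pairs.append((i, j))
    return pairs
```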
22 Tracking through Occlusion
- The pixel memberships during occlusion events
must be reassigned.
23 Best Affine Transformation
24 Affine Flow Assumptions
- Affine flow estimation assumes
- Brightness constancy: the image brightness of an object does not change from frame to frame.
- The per-frame motion of each mouse can be described by an affine transformation.
[Figure: corresponding patches in frame t and frame t+1.]
25 Standard Affine Flow
- In general, these assumptions do not hold exactly.
- We therefore minimize in the least-squares sense.
- The best a, given only the affine flow cue, minimizes the sum of squared brightness-constancy residuals over the mouse's pixels (a reconstruction of this estimate is sketched below).
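The objective and its minimizer appeared as equations that are not preserved here; the following is a hedged reconstruction of a standard direct (Lucas-Kanade-style) affine flow estimate, with a = (a1, ..., a6) the affine parameters and Ix, Iy, It the spatial and temporal image derivatives at the mouse's foreground pixels.

```python
import numpy as np

def affine_flow_lstsq(Ix, Iy, It, xs, ys):
    """Least-squares affine flow for one mouse.

    Each foreground pixel contributes one linearized brightness-constancy
    equation
        Ix*(a1*x + a2*y + a3) + Iy*(a4*x + a5*y + a6) + It = 0,
    where (a1*x + a2*y + a3, a4*x + a5*y + a6) is the affine displacement.
    Stacking these equations and solving in the least-squares sense gives
    the best a from the flow cue alone."""
    M = np.column_stack([Ix * xs, Ix * ys, Ix, Iy * xs, Iy * ys, Iy])
    b = -It
    a, *_ = np.linalg.lstsq(M, b, rcond=None)
    return a  # a = (a1, ..., a6)
```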
26 Affine Flow Plus Prior
- The affine flow cue alone does not give an
accurate motion estimate.
27 Affine Flow Plus Prior
- The affine flow cue alone does not give an accurate motion estimate.
- Suppose we have a guess of the affine transformation, â, with which to bias our estimate.
- The best a minimizes the flow criterion while staying near â.
28 Affine Flow Plus Prior
- Our criterion augments the flow term with a regularization term that penalizes the distance between a and â.
- Taking the partial derivative of H(a) w.r.t. a, setting it to 0, and solving for a gives a closed-form estimate (a sketch follows below).
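A hedged reconstruction of the regularized estimate, assuming a quadratic penalty lam * ||a - â||^2; M and b are the stacked flow equations from the previous sketch, and lam is a weight whose value the slides do not give.

```python
import numpy as np

def affine_flow_with_prior(M, b, a_hat, lam=1.0):
    """Minimize H(a) = ||M a - b||^2 + lam * ||a - a_hat||^2.

    Setting dH/da = 0 yields the regularized normal equations
        (M^T M + lam * I) a = M^T b + lam * a_hat,
    so the estimate is pulled toward the prior guess a_hat when the flow
    cue is weak.  (The quadratic prior form and lam are illustrative.)"""
    MtM = M.T @ M
    rhs = M.T @ b + lam * a_hat
    return np.linalg.solve(MtM + lam * np.eye(6), rhs)
```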
29 What is â?
- We use the depth order cue to estimate â.
- We assume the front blob at the start of the occlusion and the front blob at the end of the occlusion are the same mouse.
[Figure: the front blob at the start (frame 1) and at the end (frame n) of the occlusion.]
30 What is â?
- The succession of frame-to-frame motions transforms the initial front mouse into the final front mouse.
- We set the per-frame prior estimate â to reflect this.
[Figure: the occlusion spans frames 1 through n.]
31 Affine Interpolation
- We estimate the frame-to-frame motion, â, by linearly interpolating the total motion over the occlusion.
[Figure: the interpolation spans frames 1 through n.]
32 Affine Interpolation
- Given the initial and final mouse distributions, we compute the total transformation from frame 1 to frame n, expressed in terms of an orthogonal (rotation) matrix, a skew, and a translation.
33 Affine Interpolation
- From the total transformation, we estimate the per-frame transformation (one possible interpolation scheme is sketched below).
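The decomposition and interpolation formulas are not preserved here; the sketch below uses one plausible choice: map the initial Gaussian onto the final one with A = cov_n^(1/2) cov_1^(-1/2), polar-decompose A into a rotation times a symmetric skew, and split the rotation angle, skew, and translation evenly across the n-1 frames. This is an assumption-laden stand-in, not necessarily the slides' scheme.

```python
import numpy as np
from scipy.linalg import sqrtm, polar, fractional_matrix_power

def total_transform(mu1, cov1, mun, covn):
    """One way to map the initial mouse Gaussian N(mu1, cov1) onto the final
    one N(mun, covn): A = covn^(1/2) cov1^(-1/2) (defined only up to a
    rotation), with translation d = mun - A mu1."""
    A = np.real(sqrtm(covn)) @ np.linalg.inv(np.real(sqrtm(cov1)))
    d = mun - A @ mu1
    return A, d

def per_frame_transform(A, d, n_frames):
    """Interpolate the total motion across the occlusion: decompose
    A = R S (rotation times symmetric skew/scale), take the per-frame
    rotation angle as theta / (n_frames - 1), the per-frame skew as
    S^(1 / (n_frames - 1)), and split the translation evenly.  Composing
    the per-frame steps only approximates the total transformation; this
    is one interpolation scheme, not necessarily the slides'."""
    R, S = polar(A)                                  # A = R @ S
    theta = np.arctan2(R[1, 0], R[0, 0]) / (n_frames - 1)
    R_step = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    S_step = np.real(fractional_matrix_power(S, 1.0 / (n_frames - 1)))
    return R_step @ S_step, d / (n_frames - 1)
```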
34 Depth Order Heuristic
- Estimating which mouse is in front relies on a simple heuristic: the front mouse has the lowest (largest) y-coordinate in image coordinates.
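A minimal sketch of the heuristic, assuming each mouse is summarized by the mean (x, y) of its Gaussian.

```python
def front_mouse(means):
    """Depth order heuristic: in image coordinates y grows downward, so the
    front mouse is the one whose mean has the largest y-coordinate."""
    return max(range(len(means)), key=lambda i: means[i][1])
```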
35 Pixel Membership Estimation
36 Pixel Membership Estimation
- We assign membership based on the weighted sum of the proximity and motion similarity (one possible instantiation is sketched below).
- Proximity criterion.
- Motion similarity criterion, based on a local optical flow estimate.
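The exact criteria appeared as equations that are not preserved; the sketch below scores each pixel by (i) its squared Mahalanobis distance to a mouse's predicted Gaussian and (ii) the disagreement between its local optical flow estimate and that mouse's affine motion, combined with an assumed weight w. Both forms and the weight are assumptions.

```python
import numpy as np

def assign_memberships(pts, flows, means, covs, affines, translations, w=0.5):
    """Assign each foreground pixel to a mouse by a weighted sum of a
    proximity term and a motion-similarity term (both forms and the weight
    w are illustrative, not the slides' exact criteria).

    pts:    (N, 2) pixel coordinates
    flows:  (N, 2) local optical flow estimates at those pixels
    means, covs:           per-mouse predicted Gaussian parameters
    affines, translations: per-mouse estimated affine motion (A, d)"""
    n_mice = len(means)
    cost = np.zeros((len(pts), n_mice))
    for k in range(n_mice):
        diff = pts - means[k]
        inv_cov = np.linalg.inv(covs[k])
        proximity = np.einsum("ni,ij,nj->n", diff, inv_cov, diff)   # squared Mahalanobis distance
        predicted = pts @ affines[k].T + translations[k] - pts      # affine displacement at each pixel
        motion = np.sum((flows - predicted) ** 2, axis=1)           # flow disagreement
        cost[:, k] = w * proximity + (1 - w) * motion
    return np.argmin(cost, axis=1)
```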
38 Experimental Results
- We report initial success of our algorithm in
tracking three agouti mice in a cage.
39 Experimental Results
[Plot: tracked x-coordinates of the mice over time t.]
41 Conclusions
[Diagram: a simple tracker converts video into mouse positions; an occlusion detector finds occlusion starts and ends; the occlusion reasoning module recovers mouse positions through each occlusion.]
- We presented three modules to track identical, non-rigid, featureless objects through severe occlusion.
- The novel module is the occlusion reasoning module.
42 Conclusions
[Figure: frames 1 through n of an occlusion.]
- While the occlusion tracker operates sequentially, it incorporates a hint of the future locations of the mice.
- This is a step in the direction of an algorithm that reasons forward and backward in time.
43 Future Work
- More robust depth estimation.
- More robust separated mouse tracking (e.g., BraMBLe).
- Different affine interpolation schemes.
44 References
[1] D. Comaniciu, V. Ramesh, and P. Meer. Kernel-based object tracking. Pattern Analysis and Machine Intelligence, 25(5), 2003.
[2] J. Gårding. Shape from surface markings. PhD thesis, Royal Institute of Technology, Stockholm, 1991.
[3] T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning. Springer Series in Statistics. Springer Verlag, Basel, 2001.
[4] M. Irani and P. Anandan. All about direct methods. In Vision Algorithms: Theory and Practice. Springer-Verlag, 1999.
[5] M. Isard and J. MacCormick. BraMBLe: A Bayesian multiple-blob tracker. In ICCV, 2001.
[6] B. Lucas and T. Kanade. An iterative image registration technique with an application to stereo vision. In DARPA Image Understanding Workshop, 1984.
[7] J. MacCormick and A. Blake. A probabilistic exclusion principle for tracking multiple objects. IJCV, 39(1):57-71, 2000.
45 References
[8] Measuring Behavior: Intl. Conference on Methods and Techniques in Behavioral Research, 1996-2002.
[9] S. Niyogi. Detecting kinetic occlusion. In ICCV, pages 1044-1049, 1995.
[10] J. Shi and C. Tomasi. Good features to track. In CVPR, Seattle, June 1994.
[11] H. Tao, H. Sawhney, and R. Kumar. A sampling algorithm for tracking multiple objects. In Workshop on Vision Algorithms, pages 53-68, 1999.
[12] C. Twining, C. Taylor, and P. Courtney. Robust tracking and posture description for laboratory rodents using active shape models. In Behavior Research Methods, Instruments and Computers, Measuring Behavior Special