UMD Evaluation of Object Tracking in Video


Transcript and Presenter's Notes



1
UMD Evaluation of Object Tracking in Video
  • University of Maryland

2
VACE Project
  • Multiple teams from different research groups
    present algorithms for different video problems.
  • Evaluation is handled by several teams:
  • Penn State devises metrics and runs evaluations.
  • UMD writes the software and authors the ground
    truth.
  • ViPER (the Video Performance Evaluation
    Resource) is UMD's ground-truth authoring and
    evaluation tool.

3
ViPER Performance Evaluation
  • Java program developed at UMD to support these
    evaluations.
  • Work started in the mid-1990s.
  • Modified to support Penn State metrics in 2000.

4
Penn State Frame Evaluations
  • Look at the results for each frame, one at a
    time.
  • For each frame, compute some set of evaluation
    metrics (a sketch follows this list). These
    include
  • Object count precision and recall.
  • Pixel precision and recall over all objects in
    the frame.
  • Per-object pixel precision and recall measures.
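
A minimal sketch of the per-frame pixel precision and recall
computation, assuming each frame is given as boolean masks of
ground-truth and output pixels; the representation and class
names here are illustrative, not ViPER's actual code:

// Illustrative sketch, not ViPER code: per-frame pixel precision and recall
// computed from boolean masks of ground-truth and output pixels.
public class FramePixelMetrics {

    /** Precision and recall for a single frame. */
    public record Result(double precision, double recall) {}

    public static Result evaluateFrame(boolean[][] truth, boolean[][] output) {
        int truePositives = 0, truthPixels = 0, outputPixels = 0;
        for (int y = 0; y < truth.length; y++) {
            for (int x = 0; x < truth[y].length; x++) {
                if (truth[y][x]) truthPixels++;
                if (output[y][x]) outputPixels++;
                if (truth[y][x] && output[y][x]) truePositives++;
            }
        }
        double precision = outputPixels == 0 ? 0.0 : (double) truePositives / outputPixels;
        double recall    = truthPixels  == 0 ? 0.0 : (double) truePositives / truthPixels;
        return new Result(precision, recall);
    }

    public static void main(String[] args) {
        boolean[][] truth  = {{true, true, false}, {false, true, false}};
        boolean[][] output = {{true, false, false}, {false, true, true}};
        Result r = evaluateFrame(truth, output);
        // 2 of 3 output pixels are correct, 2 of 3 truth pixels are found:
        // prints precision=0.67 recall=0.67
        System.out.printf("precision=%.2f recall=%.2f%n", r.precision(), r.recall());
    }
}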

5
Penn State Tracking Retrieval
  • Assumes the matching has already been
    established.
  • First Frame
  • For the life of the object, calculate some set
    of metrics (a sketch follows this list).
  • This yields a set of distances for each frame.
  • These can be displayed as a line graph over
    time.
  • An average can also be computed for each
    distance.
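
A minimal sketch of these per-frame distances for one matched
object, using centroid distance between a ground-truth box
track and a result box track; the Box class and the choice of
centroid distance are illustrative assumptions, not the
project's normative definitions:

import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: per-frame distance series for one matched truth/result
// track, plus its average over the life of the object.
public class TrackingRetrievalSketch {

    /** Axis-aligned box on a single frame (assumed representation). */
    static class Box {
        final double x, y, w, h;
        Box(double x, double y, double w, double h) { this.x = x; this.y = y; this.w = w; this.h = h; }
        double cx() { return x + w / 2; }
        double cy() { return y + h / 2; }
    }

    /** One distance per frame; plotting these against frame index gives the line graph. */
    static List<Double> centroidDistances(List<Box> truth, List<Box> result) {
        List<Double> distances = new ArrayList<>();
        int frames = Math.min(truth.size(), result.size());
        for (int f = 0; f < frames; f++) {
            double dx = truth.get(f).cx() - result.get(f).cx();
            double dy = truth.get(f).cy() - result.get(f).cy();
            distances.add(Math.hypot(dx, dy));
        }
        return distances;
    }

    /** Average of the per-frame distances, reported for the whole track. */
    static double average(List<Double> distances) {
        return distances.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
    }
}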

6
But
  • Frame metrics throw away tracking information.
  • Tracking metrics require a known matching.
  • This constraint alters the problem.
  • Even with the known matching, they do not handle
    tracking adequately, including phenomena such as
    confusion and occlusion.
  • The results, as described, are just sums over all
    frames.
  • There is no unified metric across time and space.

7
UMD Maximal Optimal Matching
  • Score for each possible object match.
  • Find the optimal matching.
  • One-to-one match: get the list of pairs that
    minimizes the total distance over all possible
    matchings.
  • Multiple match: for each disjoint subset of
    truth objects, get the disjoint subset of output
    objects that minimizes the total overall
    distance.
  • Also get precision and recall (a sketch follows
    this list). For S, the size of the matching:
  • Precision = S / size(candidates)
  • Recall = S / size(targets)
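
A minimal sketch of a one-to-one matching and the resulting
precision and recall; the brute-force search, the distance
cutoff, and the tie-breaking rule below are illustrative
assumptions for small examples, not the actual UMD algorithm:

// Illustrative sketch only: choose the one-to-one matching between truth
// objects ("targets") and output objects ("candidates") that matches as many
// pairs as possible within a distance cutoff, with the smallest total distance,
// then report precision and recall of that matching.
public class OptimalMatchingSketch {

    static double bestTotal;
    static int bestMatched;

    /** dist[i][j] is the distance between target i and candidate j. */
    static void search(double[][] dist, int target, boolean[] used,
                       double total, int matched, double maxDistance) {
        if (target == dist.length) {
            if (matched > bestMatched || (matched == bestMatched && total < bestTotal)) {
                bestMatched = matched;
                bestTotal = total;
            }
            return;
        }
        // Option 1: leave this target unmatched.
        search(dist, target + 1, used, total, matched, maxDistance);
        // Option 2: match it to any unused candidate within the distance cutoff.
        for (int j = 0; j < dist[target].length; j++) {
            if (!used[j] && dist[target][j] <= maxDistance) {
                used[j] = true;
                search(dist, target + 1, used, total + dist[target][j], matched + 1, maxDistance);
                used[j] = false;
            }
        }
    }

    public static void main(String[] args) {
        // 3 truth objects, 2 output objects, distance cutoff 0.5.
        double[][] dist = { {0.2, 0.9}, {0.8, 0.1}, {0.7, 0.6} };
        bestTotal = Double.MAX_VALUE;
        bestMatched = -1;
        search(dist, 0, new boolean[dist[0].length], 0.0, 0, 0.5);
        int targets = dist.length, candidates = dist[0].length;
        // prints matched=2 precision=1.00 recall=0.67
        System.out.printf("matched=%d precision=%.2f recall=%.2f%n",
                bestMatched, (double) bestMatched / candidates, (double) bestMatched / targets);
    }
}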

8
UMD Maximal Optimal Matching
  • Takes into account both space and time.
  • Can be generalized to make no assumptions about
    space and time.
  • Optimal 1-1 matching has many nice properties.
  • Can handle many-to-many matching.

9
Experimental Results
10
Example Tracking Text Object
11
Example Tracking Text Frame
12
Example Tracking Text Tracking
13
Example Person Tracking Object
14
Example Person Tracking Frame
15
Fin
  • Dr. David Doermann
  • Dr. Rangachar Kasturi
  • David Mihalcik
  • Ilya Makedon
  • many others

16
Tracking Graphs
17
Object Level Matching
  • The most obvious solution is many-to-many
    matching.
  • It allows matching on any data type, at a price.

18
Pixel-Frame-Box Metrics
  • Look at each frame and ask a specific question
    about its contents.
  • Number of pixels correctly matched.
  • Number of boxes that have some overlap.
  • Or overlap greater than some threshold.
  • How many boxes overlap a given box?
    (Fragmentation; see the sketch after this list.)
  • Or look at all frames and ask a question:
  • Number of frames correctly detected.
  • Whether the proper number of objects was
    counted.
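
A minimal sketch of the per-frame box questions above
(overlap, overlap above a threshold, and fragmentation),
assuming axis-aligned boxes; the representation is
illustrative, not the evaluation's actual data model:

import java.util.List;

// Illustrative sketch: box-level questions asked per frame.
public class BoxFrameMetrics {

    /** Assumed axis-aligned box representation. */
    record Box(double x, double y, double w, double h) {}

    /** Area of the intersection of two boxes (0 if they are disjoint). */
    static double overlapArea(Box a, Box b) {
        double w = Math.min(a.x() + a.w(), b.x() + b.w()) - Math.max(a.x(), b.x());
        double h = Math.min(a.y() + a.h(), b.y() + b.h()) - Math.max(a.y(), b.y());
        return (w > 0 && h > 0) ? w * h : 0.0;
    }

    /** Overlap as a fraction of the truth box area, compared against a threshold. */
    static boolean overlapsEnough(Box truth, Box result, double threshold) {
        return overlapArea(truth, result) / (truth.w() * truth.h()) >= threshold;
    }

    /** Fragmentation: how many result boxes overlap the given truth box. */
    static long fragmentation(Box truth, List<Box> results) {
        return results.stream().filter(r -> overlapArea(truth, r) > 0).count();
    }
}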

19
Individual Box Tracking Metrics
  • Mostly useful for the retrieval problem, this
    solution looks at pairs consisting of a ground
    truth box and a result box.
  • The metrics are (a sketch follows this list):
  • Position
  • Size
  • Orientation
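
A minimal sketch of per-box position, size, and orientation
differences for a matched truth/result pair; the exact
formulas here are illustrative assumptions, not the
evaluation's normative metrics:

// Illustrative sketch: simple position, size, and orientation distances
// between one ground-truth box and one result box.
public class BoxDistanceSketch {

    /** Oriented box: centre, dimensions, and rotation in degrees (assumed form). */
    record OrientedBox(double cx, double cy, double w, double h, double angleDeg) {}

    /** Euclidean distance between the box centres. */
    static double positionDistance(OrientedBox t, OrientedBox r) {
        return Math.hypot(t.cx() - r.cx(), t.cy() - r.cy());
    }

    /** Absolute area difference, normalised by the truth box area. */
    static double sizeDistance(OrientedBox t, OrientedBox r) {
        return Math.abs(t.w() * t.h() - r.w() * r.h()) / (t.w() * t.h());
    }

    /** Smallest angular difference, folded into the range [0, 90] degrees... folded modulo 180. */
    static double orientationDistance(OrientedBox t, OrientedBox r) {
        double d = Math.abs(t.angleDeg() - r.angleDeg()) % 180.0;
        return Math.min(d, 180.0 - d);
    }
}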

20
Questions: Ignoring Ground Truth
  • Assume the evaluation routine is given a set of
    objects to ignore (or rules for determining what
    type of object to ignore). How does this affect
    the output?
  • For pixel measures, just don't count pixels in
    ignored regions. This works for the tracking and
    frame evaluations.
  • For object matches, do the complete match; when
    finished, ignore result data that matches
    ignored truth (see the sketch after this list).
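
A minimal sketch of the pixel-level ignore rule described
above, assuming boolean masks for the ground truth, the
output, and the ignored region; the representation is
illustrative:

// Illustrative sketch: pixels inside an ignored region contribute to no count,
// so they affect neither precision nor recall.
public class IgnoredRegionPixels {

    /** All masks are the same size; returns {precision, recall}. */
    static double[] precisionRecall(boolean[][] truth, boolean[][] output, boolean[][] ignore) {
        int truePositives = 0, truthPixels = 0, outputPixels = 0;
        for (int y = 0; y < truth.length; y++) {
            for (int x = 0; x < truth[y].length; x++) {
                if (ignore[y][x]) continue;            // skip ignored regions entirely
                if (truth[y][x]) truthPixels++;
                if (output[y][x]) outputPixels++;
                if (truth[y][x] && output[y][x]) truePositives++;
            }
        }
        return new double[] {
            outputPixels == 0 ? 0.0 : (double) truePositives / outputPixels,  // precision
            truthPixels  == 0 ? 0.0 : (double) truePositives / truthPixels    // recall
        };
    }
}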

21
Questions: Presenting the Results
  • Have some basic built-in graphs:
  • Line graphs for individual metrics
  • Bar charts showing several metrics
  • For custom graphs, you have to do it yourself:
  • ROC curves
  • Scatter plots