Hand-drawn graphics recognition with Bayes Conditional Networks

1
Hand-drawn graphics recognition with Bayes
Conditional Networks
  • Martin Szummer
  • CHIRP Project
  • Microsoft Research Cambridge

2
Background
  • There is more to ink understanding than
    handwritten text recognition!

3
Introduction
  • Bayes Conditional Networks: a framework for
    joint pattern classification
  • illustrated on container/connector recognition
    in organization charts
  • Motivation from product groups:
  • Tablet PC platform: Sashi Raghupathy
  • Infografix (IGX): Dan Albertson

4
Challenges
5
Previous Generative Approach
  • Shape recognition using generative models
    Krishnapuram, Bishop, Szummer

Bishop 03
6
Discriminative Approaches
  • For classification, model only the labels given
    the ink data; avoid modelling the ink itself
  • e.g. support vector machines, but these assume
    that all labels are independent!
  • Conditional Random Fields
  • model dependence of labels
  • applied to image classification (Kumar and
    Hebert 03), but for 2D problems parameter
    estimation is problematic

7
Bayes Conditional Networks (BCN)
  • BCNs are undirected graphical models (CRFs)
    trained in a Bayesian way
  • BCNs model the probability of the labels
    (connector/container) conditioned on features
    computed from all ink on the page
  • Flexible features: may be overlapping and
    correlated
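The conditional model described above can be written out explicitly. Assuming a standard pairwise CRF form over fragment labels t with parameters θ (this notation is an assumption, not taken from the slides):

```latex
P(\mathbf{t} \mid \mathbf{x}, \theta)
  = \frac{1}{Z(\mathbf{x}, \theta)}
    \prod_{i} \psi_i(t_i, \mathbf{x}; \theta)
    \prod_{(i,j) \in E} \psi_{ij}(t_i, t_j, \mathbf{x}; \theta)
```

Here E is the neighborhood graph over ink fragments, the ψ are potential functions scoring label compatibility with the ink, and Z normalizes over all joint labelings.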

8
Approach
Input: drawing with text removed (TAB demo;
Svensen, Bishop, Gangnet)
9
1. Divide ink strokes into fragments
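The slides do not specify the fragmentation rule; one plausible sketch splits each polyline stroke into fragments of bounded arc length (the function name and threshold below are illustrative assumptions):

```python
import math

def fragment_stroke(points, max_len=30.0):
    """Split a polyline stroke into fragments of bounded arc length.

    Illustrative sketch only; the talk does not state the actual
    fragmentation criterion (e.g. it might split at corners instead).
    """
    fragments, current, length = [], [points[0]], 0.0
    for p, q in zip(points, points[1:]):
        seg = math.dist(p, q)
        # Start a new fragment once the length budget would be exceeded.
        if length + seg > max_len and len(current) > 1:
            fragments.append(current)
            current, length = [p], 0.0
        current.append(q)
        length += seg
    fragments.append(current)
    return fragments
```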
10
2. Construct a neighborhood graph on fragments
Potential functions measure compatibility of the
labels t_i, t_j with the ink
θ: trained parameters
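The neighborhood-graph step can be sketched as follows, assuming (this is my assumption, not stated in the slides) that two fragments are neighbors when their endpoints are close:

```python
import math

def neighborhood_graph(fragments, radius=10.0):
    """Connect fragments whose closest endpoints lie within `radius`.

    Illustrative sketch; the actual neighborhood criterion used in the
    talk is not specified.
    """
    def endpoints(frag):
        return [frag[0], frag[-1]]

    edges = set()
    for i in range(len(fragments)):
        for j in range(i + 1, len(fragments)):
            # Add an undirected edge if any endpoint pair is close enough.
            if any(math.dist(p, q) <= radius
                   for p in endpoints(fragments[i])
                   for q in endpoints(fragments[j])):
                edges.add((i, j))
    return edges
```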
11
3. Calculate features
full-box feature (strongest feature)
12
t-junction feature
13
Independent Classification
14
4. Jointly classify fragments as being part of a
container or connector
Time: 4 seconds (Matlab prototype)
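The joint-classification objective can be made concrete with a toy MAP search. The talk uses the junction tree algorithm; the exhaustive search below (names and potential values are illustrative assumptions) optimizes the same kind of objective on a tiny graph:

```python
from itertools import product

def map_labeling(n, unary, pairwise, edges,
                 labels=("container", "connector")):
    """Find the highest-scoring joint labeling of n fragments.

    Brute-force sketch, feasible only for tiny graphs; the talk uses
    the junction tree algorithm for exact MAP inference instead.
    """
    def score(assign):
        # Sum unary scores plus pairwise scores over neighborhood edges.
        s = sum(unary[i][assign[i]] for i in range(n))
        s += sum(pairwise[assign[i]][assign[j]] for i, j in edges)
        return s

    return max(product(labels, repeat=n), key=score)
```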
15
BCN training/testing
  • Bayesian training of parameters θ
  • uses the Expectation Propagation (EP) framework
    (Minka 01)
  • approximates factors term-by-term
  • combined with automatic relevance determination
    (ARD) for feature selection
  • Inference (testing)
  • find the most likely joint labeling (MAP) using
    the junction tree algorithm

16
Ink Features
  • Unary features depend on the fragment and
    typically on its neighboring fragments
  • length, angle
  • histograms of distances and angles
  • templates: t-junction, full-box
  • Pairwise features
  • combinations of unary features (+, -, ×)
  • features defined on two neighboring fragments:
    relative angle, same-stroke
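Two of the simplest features above can be sketched directly. The exact feature definitions are my own illustrative assumptions (e.g. measuring a fragment endpoint-to-endpoint):

```python
import math

def unary_features(frag):
    """Length and orientation of a fragment, endpoint-to-endpoint.

    Hypothetical feature definitions for illustration only.
    """
    (x0, y0), (x1, y1) = frag[0], frag[-1]
    return {"length": math.hypot(x1 - x0, y1 - y0),
            "angle": math.atan2(y1 - y0, x1 - x0)}

def pairwise_features(frag_a, frag_b, stroke_a, stroke_b):
    """Relative angle and same-stroke indicator for two fragments."""
    fa, fb = unary_features(frag_a), unary_features(frag_b)
    return {"relative_angle": abs(fa["angle"] - fb["angle"]),
            "same_stroke": float(stroke_a == stroke_b)}
```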

17
Feature Selection
(Chart) Unary feature weights after automatic
relevance determination: full-box, t-junction,
angle, length
18
TAB members
23
Benchmark
  • Data: 17 subjects drew given organization
    charts on a Tablet PC device

24
Future Directions
  • Recognition of complex objects:
  • segmentation
  • hierarchical classification
  • Design an architecture for a complex ink parsing
    system:
  • representing uncertainty and multiple hypotheses
    between different modules of the system
  • Other applications of BCNs:
  • image classification and segmentation
  • text analysis, information extraction
  • fax and scanned document analysis

25
Acknowledgements
  • Yuan Qi
  • Michel Gangnet
  • Christopher Bishop
  • Geoffrey Hinton