1
Max-Margin Matching for Semantic Role Labeling
  • David Vickrey
  • James Connor
  • Daphne Koller
  • Stanford University

2
Overview
  • We consider two complementary models for the same
    task
  • In our case, one discriminative, one generative
  • Combine the models by feeding output predictions
    into a discriminative classifier
  • Try two different classifiers for combining: a
    multi-class SVM and max-margin matching

3
Semantic Role Labeling
  • Label the arguments of a verb in context

I gave the dog a bone.
(I = Giver, the dog = Recipient, a bone = Gift)
  • PropBank: 1M labeled words (Wall Street Journal)

4
Syntactic Model
  • Most useful features:
  • Argument word/part of speech
  • Path from argument to verb in parse tree

[Parse tree of "I gave the dog a bone", with NP nodes for "I", "the
dog", and "a bone"; the path feature traces the tree from each
argument's node to the verb "gave".]
  • Use a standard classifier, e.g. an SVM
  • One vs. All classifier for each possible argument
    type
  • Trained across all verbs at once (a minimal sketch
    follows below)
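A minimal sketch of this one-vs-all setup, using scikit-learn as an
assumed stand-in for the paper's classifier; the toy sentence, the
feature encodings, and the path notation are illustrative only:

from sklearn.feature_extraction import DictVectorizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC

# Each candidate word is described by its word form, part of speech,
# and the path from its node to the verb in the parse tree.
train_features = [
    {"word": "I",    "pos": "PRP", "path": "NP^S`VP`V"},
    {"word": "dog",  "pos": "NN",  "path": "NP^VP`V"},
    {"word": "bone", "pos": "NN",  "path": "NP^VP`V"},
]
train_labels = ["Giver", "Recipient", "Gift"]

vec = DictVectorizer()
X = vec.fit_transform(train_features)

# One binary SVM per argument type, trained across all verbs at once.
clf = OneVsRestClassifier(LinearSVC()).fit(X, train_labels)

# The per-class margins are reused later when the models are combined.
margins = clf.decision_function(vec.transform(train_features))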

5
Semantic Model
  • Data set: words occurring as the Eater of the verb
    eat
  • dog, cat, he, …
  • Usually, the Eater will be either a person or an
    animal
  • We want to generalize to unseen animals or people
  • Essentially the Google Sets problem
  • Our idea: use a word hierarchy to find categories
    of words
  • Used WordNet as the hierarchy
  • Selected categories using a Bayesian score
  • Details will be presented tomorrow at Bayesian
    Methods for Natural Language Processing
  • On its own, improves the log-likelihood of held-out
    PropBank test sets
  • Train one model for each argument of each verb
    (see the sketch below)
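A hedged sketch of the WordNet step, using NLTK's WordNet interface:
collect the hypernym ancestors of the observed argument words as
candidate categories. The paper's Bayesian category score is not
reproduced here; the raw frequency count below is only a stand-in.

from collections import Counter
from nltk.corpus import wordnet as wn

eater_words = ["dog", "cat", "he"]  # words observed as Eater of "eat"

# Gather every hypernym (ancestor category) of each observed word.
category_counts = Counter()
for word in eater_words:
    for synset in wn.synsets(word, pos=wn.NOUN):
        for path in synset.hypernym_paths():
            category_counts.update(path)

# Categories covering many observed words (e.g. animal.n.01,
# person.n.01) generalize to unseen animals and people; a Bayesian
# score would trade this coverage off against category size.
for category, count in category_counts.most_common(5):
    print(category.name(), count)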

6
Combining Models
  • For each word w in a sentence with verb v
  • For each possible argument (Giver, Gift,
    Recipient, etc.), compute two scores:
  • The margin of the One vs. All classifier using
    syntactic features
  • log Pv(Arg | w) under the semantic model trained on
    Arg and verb v
  • Use these scores as inputs to a multi-class SVM
  • One weight for each argument for each model (not
    specific to v); a sketch follows the example below

Example: "I wanted my dog to eat the pickle."  Scores for the word
dog, where a, b, c, d are the learned per-argument, per-model weights:

             Syntax   Semantic   Combined
  Eater      1.0      -0.29      1.0a - 0.29b
  Food       1.2      -1.39      1.2c - 1.39d
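A minimal sketch of this combination step, again assuming
scikit-learn; the feature layout (one syntactic margin and one
semantic log-probability per argument type) follows the bullets above,
while the pickle row and the variable names are invented:

import numpy as np
from sklearn.svm import LinearSVC

ARG_TYPES = ["Eater", "Food"]  # feature positions follow this order

def combined_features(syntax_margins, semantic_logps):
    # One (margin, log P) pair per argument type; the combining SVM
    # learns one weight per argument per model, shared across verbs.
    return np.concatenate([syntax_margins, semantic_logps])

# Toy training rows, one per word; "dog" uses the scores from the
# example above, "pickle" uses invented values.
X = np.array([
    combined_features([1.0, 1.2], [-0.29, -1.39]),  # dog
    combined_features([-0.5, 0.9], [-2.1, -0.4]),   # pickle
])
y = ["Eater", "Food"]

combiner = LinearSVC().fit(X, y)  # multi-class SVM over both models
print(combiner.predict(X))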
7
Results
  • Tested on the first 500 frames (about half of the
    data)

8
Max-Margin Matching
  • Each argument should be assigned to only one word
  • Build a complete bipartite graph between words and
    arguments
  • Weight of the edge from a word w to an argument a:
  • Syntax only: the margin of the classifier for a
    applied to w
  • Syntax and semantic: a (weighted) sum of the
    confidences of each
  • Same set of weights as in the multi-class SVM
  • Apply max-margin matching learning to these
    weights (inference sketch after the figure below)

[Bipartite graph: words I, dog, pickle on one side; arguments Eater
and Food on the other. Edges carry syntax-only weights (e.g. 1.0) or
combined weights (e.g. 1.0a - 0.29b, 1.2c - 1.39d).]
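A minimal sketch of the matching inference, assuming SciPy's
Hungarian-algorithm solver; it finds the maximum-weight assignment of
arguments to words but does not reproduce the max-margin training of
the weights. The score values below are invented:

import numpy as np
from scipy.optimize import linear_sum_assignment

words = ["I", "dog", "pickle"]
args = ["Eater", "Food"]

# scores[i, j]: combined confidence that word i fills argument j,
# e.g. a * syntax_margin + b * semantic_log_prob with learned weights.
scores = np.array([
    [0.2, -1.0],   # I
    [1.5, -0.3],   # dog
    [-0.8, 1.1],   # pickle
])

# linear_sum_assignment minimizes cost, so negate to maximize the
# total weight; each argument is assigned to exactly one word.
rows, cols = linear_sum_assignment(-scores)
for i, j in zip(rows, cols):
    print(args[j], "->", words[i])   # Eater -> dog, Food -> pickle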
9
Results
  • Matching can be applied regardless of how the
    weights were trained

10
Results Summary
  • Classifying using the confidences of One vs. All
    classifiers can help
  • Improving the One vs. All classifier may remove
    this benefit
  • Combining the models in a classifier worked
  • We were able to improve using WordNet
  • However, this only worked with both matching and
    training!
  • Matching helped, but training the max-margin
    matching did not
  • Why?
  • Not that much data (weights shared across all
    verbs)
  • Not that many parameters

11
Future Directions
  • Previous work used a Markov random field over
    classification decisions (Toutanova, Haghighi, and
    Manning, ACL 2005)
  • Can't do exact inference
  • Can include potentials besides one word per
    argument
  • We could try to extend max-margin matching
  • We could try to extend max-margin matching
  • Unlabeled data
  • Bootstrap between different classifiers
  • When to include an example?
  • High confidence under a single classifier?
  • Or, high confidence under the combined classifier?

Toutanova, Haghighi, and Manning. Joint Learning Improves Semantic Role Labeling. ACL 2005.