1
Putting Meaning Into Your Trees
  • Martha Palmer
  • University of Pennsylvania
  • Columbia University
  • New York City
  • January 29, 2004

2
Outline
  • Introduction
  • Background WordNet, Levin classes, VerbNet
  • Proposition Bank capturing shallow semantics
  • Mapping PropBank to VerbNet
  • Mapping PropBank to WordNet

3
Ask Jeeves: A Q/A, IR example
  • What do you call a successful movie?
  • Tips on Being a Successful Movie Vampire ... I
    shall call the police.
  • Successful Casting Call Shoot for Clash of
    Empires'' ... thank everyone for their
    participation in the making of yesterday's movie.
  • Demme's casting is also highly entertaining,
    although I wouldn't go so far as to call it
    successful. This movie's resemblance to its
    predecessor is pretty vague...
  • VHS Movies Successful Cold Call Selling Over
    100 New Ideas, Scripts, and Examples from the
    Nation's Foremost Sales Trainer.

Blockbuster
4
Ask Jeeves: filtering w/ POS tags
  • What do you call a successful movie?
  • Tips on Being a Successful Movie Vampire ... I
    shall call the police.
  • Successful Casting Call Shoot for Clash of
    Empires'' ... thank everyone for their
    participation in the making of yesterday's movie.
  • Demme's casting is also highly entertaining,
    although I wouldn't go so far as to call it
    successful. This movie's resemblance to its
    predecessor is pretty vague...
  • VHS Movies Successful Cold Call Selling Over
    100 New Ideas, Scripts, and Examples from the
    Nation's Foremost Sales Trainer.

5
Filtering out "call the police"
Syntax:
call(you, movie, what) ≠ call(you, police)
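The contrast above is easier to see in code: a predicate-argument representation lets a system reject snippets whose arguments do not line up with the question, where keyword matching cannot. A minimal Python sketch (entirely illustrative; the tuples and the matches() helper are mine, not part of any Ask Jeeves system):

# Compare predicate-argument structures so that call(you, police) does not
# answer the query call(you, movie, what).
def matches(query, candidate):
    q_pred, *q_args = query
    c_pred, *c_args = candidate
    if q_pred != c_pred:
        return False
    # every non-wh argument of the query must also appear in the candidate
    return all(a in c_args for a in q_args if not a.startswith("wh-"))

query = ("call", "you", "movie", "wh-what")           # What do you call a successful movie?
snippets = [("call", "you", "police"),                # "I shall call the police."
            ("call", "you", "movie", "blockbuster")]  # hypothetical correct answer

print([s for s in snippets if matches(query, s)])     # only the "blockbuster" tuple survives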
6
English lexical resource is required
  • That provides sets of possible syntactic frames
    for verbs.
  • And provides clear, replicable sense
    distinctions.
  • AskJeeves: Who do you call for a good electronic
    lexical database for English?

7
WordNet: call, 28 senses
  • 1. name, call -- (assign a specified, proper name to;
    "They named their son David") -> LABEL
  • 2. call, telephone, call up, phone, ring -- (get or try to get into
    communication (with someone) by telephone;
    "I tried to call you all night") -> TELECOMMUNICATE
  • 3. call -- (ascribe a quality to or give a name of a common noun that
    reflects a quality; "He called me a bastard") -> LABEL
  • 4. call, send for -- (order, request, or command to come;
    "She was called into the director's office"; "Call the police!") -> ORDER
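The same sense inventory can be pulled up programmatically for comparison. A minimal sketch using NLTK's WordNet interface (my addition, not part of the original slides; assumes the WordNet data has been downloaded via nltk.download('wordnet')):

from nltk.corpus import wordnet as wn

# list every verb sense of "call" with its synonyms and gloss
for i, synset in enumerate(wn.synsets('call', pos=wn.VERB), start=1):
    lemmas = ', '.join(lemma.name() for lemma in synset.lemmas())
    print(f"{i:2d}. {lemmas} -- {synset.definition()}")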

8
WordNet (Princeton; Miller 1985, Fellbaum 1998)
  • On-line lexical reference (dictionary)
  • Nouns, verbs, adjectives, and adverbs grouped
    into synonym sets
  • Other relations include hypernyms (ISA),
    antonyms, meronyms
  • Limitations as a computational lexicon
  • Contains little syntactic information
  • No explicit predicate argument structures
  • No systematic extension of basic senses
  • Sense distinctions are very fine-grained, ITA 73%
  • No hierarchical entries

9
Levin classes (Levin, 1993)
  • 3100 verbs, 47 top level classes, 193 second and
    third level
  • Each class has a syntactic signature based on
    alternations.
  • John broke the jar. / The jar broke. /
    Jars break easily.
  • John cut the bread. / *The bread cut. /
    Bread cuts easily.
  • John hit the wall. / *The wall hit. /
    *Walls hit easily.

10
Levin classes (Levin, 1993)
  • Verb class hierarchy: 3100 verbs, 47 top level classes, 193 second and third level
  • Each class has a syntactic signature based on alternations.
  • John broke the jar. / The jar broke. / Jars break easily.
    change-of-state
  • John cut the bread. / *The bread cut. / Bread cuts easily.
    change-of-state, recognizable action, sharp instrument
  • John hit the wall. / *The wall hit. / *Walls hit easily.
    contact, exertion of force

11
(No Transcript)
12
Confusions in Levin classes?
  • Not semantically homogeneous
  • braid, clip, file, powder, pluck, etc. ...
  • Multiple class listings
  • homonymy or polysemy?
  • Conflicting alternations?
  • Carry verbs disallow the Conative (*she carried at the ball), but include
    push, pull, shove, kick, draw, yank, tug,
  • which are also in the Push/Pull class, which does take the Conative
    (she kicked at the ball)

13
Intersective Levin Classes
[Diagram not fully transcribed: verbs grouped by shared alternations, e.g. "apart" (CH-STATE), "across the room" (CH-LOC), "at" (CH-LOC)]
Dang, Kipper & Palmer, ACL-98
14
Intersective Levin Classes
  • More syntactically and semantically coherent
  • sets of syntactic patterns
  • explicit semantic components
  • relations between senses
  • VERBNET
  • www.cis.upenn.edu/verbnet

Dang, Kipper & Palmer, IJCAI-00, COLING-00
15
VerbNet (Karin Kipper)
  • Class entries
  • Capture generalizations about verb behavior
  • Organized hierarchically
  • Members have common semantic elements, semantic
    roles and syntactic frames
  • Verb entries
  • Refer to a set of classes (different senses)
  • each class member linked to WN synset(s) (not
    all WN senses are covered)

16
Semantic role labels
  • Julia broke the LCD projector.
  • break (agent(Julia), patient(LCD-projector))
  • cause(agent(Julia), broken(LCD-projector))

agent(A) -> intentional(A), sentient(A), causer(A), affector(A)
patient(P) -> affected(P), change(P), ...
17
Hand-built resources vs. real data
  • VerbNet is based on linguistic theory
  • how useful is it?
  • How well does it correspond to syntactic
    variations found in naturally occurring text?

18
Proposition Bank: From Sentences to Propositions
meet(Somebody1, Somebody2)
. . .
When Powell met Zhu Rongji on Thursday they discussed the return of the spy plane.
meet(Powell, Zhu)
discuss(Powell, Zhu, return(X, plane))
19
Capturing semantic roles
  • Owen (SUBJ) broke [ARG1 the laser pointer].
  • [ARG1 The windows] (SUBJ) were broken by the hurricane.
  • [ARG1 The vase] (SUBJ) broke into pieces when it toppled over.

See also FrameNet, http://www.icsi.berkeley.edu/framenet/
20
English lexical resource is required
  • That provides sets of possible syntactic frames
    for verbs with semantic role labels.
  • And provides clear, replicable sense
    distinctions.

21
A TreeBanked Sentence
[Parse tree not fully transcribed: S with NP-SBJ "Analysts" and a VP containing "would", an embedded S whose NP-SBJ is the trace T-1, an NP, and a PP-LOC]
22
The same sentence, PropBanked
[Same tree with PropBank labels: Arg0 "Analysts", REL "have been expecting", Arg1 the expected proposition]
23
Frames File Example: expect
Roles:
  Arg0: expecter
  Arg1: thing expected
Example: Transitive, active
  Portfolio managers expect further declines in interest rates.
  [Arg0 Portfolio managers] [REL expect] [Arg1 further declines in interest rates]
24
Frames File example give
  • Roles
  • Arg0 giver
  • Arg1 thing given
  • Arg2 entity given to
  • Example double object
  • The executives gave the chefs a standing
    ovation.
  • Arg0 The executives
  • REL gave
  • Arg2 the chefs
  • Arg1 a standing
    ovation
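Frames files like the two above are distributed as XML, so the role listings can also be read programmatically. A hedged sketch using NLTK's PropBank corpus reader (my addition; assumes the propbank data is installed, and the attribute values printed are only illustrative):

from nltk.corpus import propbank

roleset = propbank.roleset('give.01')            # the "give" frameset shown above
print(roleset.attrib['id'], roleset.attrib['name'])
for role in roleset.findall('roles/role'):       # Arg0: giver, Arg1: thing given, ...
    print(f"Arg{role.attrib['n']}: {role.attrib['descr']}")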

25
Word Senses in PropBank
  • Orders to ignore word sense not feasible for 700
    verbs
  • Mary left the room
  • Mary left her daughter-in-law her pearls in her
    will
  • Frameset leave.01 "move away from"
  • Arg0 entity leaving
  • Arg1 place left
  • Frameset leave.02 "give"
  • Arg0 giver
  • Arg1 thing given
  • Arg2 beneficiary
  • How do these relate to traditional word senses in
    VerbNet and WordNet?

26
Annotation procedure
  • PTB II - Extraction of all sentences with given verb
  • Create Frame File for that verb (Paul Kingsbury)
  • (3100 lemmas, 4400 framesets, 118K predicates)
  • Over 300 created automatically via VerbNet
  • First pass: Automatic tagging (Joseph Rosenzweig)
  • http://www.cis.upenn.edu/josephr/TIDES/index.html lexicon
  • Second pass: Double blind hand correction (Paul Kingsbury)
  • Tagging tool highlights discrepancies (Scott Cotton)
  • Third pass: Solomonization (adjudication)
  • Betsy Klipple, Olga Babko-Malaya

27
Trends in Argument Numbering
  • Arg0 agent
  • Arg1 direct object / theme / patient
  • Arg2 indirect object / benefactive / instrument
    / attribute / end state
  • Arg3 start point / benefactive / instrument /
    attribute
  • Arg4 end point
  • Per word vs. frame level: which is more general?

28
Additional tags (arguments or adjuncts?)
  • Variety of ArgMs (Arg > 4)
  • TMP - when?
  • LOC - where at?
  • DIR - where to?
  • MNR - how?
  • PRP - why?
  • REC - himself, themselves, each other
  • PRD - this argument refers to or modifies another
  • ADV - others

29
Inflection
  • Verbs also marked for tense/aspect
  • Passive/Active
  • Perfect/Progressive
  • Third singular (is has does was)
  • Present/Past/Future
  • Infinitives/Participles/Gerunds/Finites
  • Modals and negations marked as ArgMs

30
Frames: Multiple Framesets
  • Out of the 787 most frequent verbs:
  • 1 Frameset - 521
  • 2 Framesets - 169
  • 3 Framesets - 97 (includes light verbs)
  • 94% ITA
  • Framesets are not necessarily consistent between
    different senses of the same verb
  • Framesets are consistent between different verbs
    that share similar argument structures,
    (like FrameNet)

31
Ergative/Unaccusative Verbs
  • Roles (no ARG0 for unaccusative verbs)
  • Arg1 Logical subject, patient, thing rising
  • Arg2 EXT, amount risen
  • Arg3 start point
  • Arg4 end point
  • Sales rose 4% to $3.28 billion from $3.16 billion.

The Nasdaq composite index added 1.01 to
456.6 on paltry volume.
32
Actual data for leave
  • http://www.cs.rochester.edu/~gildea/PropBank/Sort/
  • leave.01 "move away from": Arg0 rel Arg1 Arg3
  • leave.02 "give": Arg0 rel Arg1 Arg2
  • sub-ARG0 obj-ARG1 44
  • sub-ARG0 20
  • sub-ARG0 NP-ARG1-with obj-ARG2 17
  • sub-ARG0 sub-ARG2 ADJP-ARG3-PRD 10
  • sub-ARG0 sub-ARG1 ADJP-ARG3-PRD 6
  • sub-ARG0 sub-ARG1 VP-ARG3-PRD 5
  • NP-ARG1-with obj-ARG2 4
  • obj-ARG1 3
  • sub-ARG0 sub-ARG2 VP-ARG3-PRD 3
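A table like the one above can be tallied directly from annotated instances. A rough sketch (my own, assuming NLTK's PropBank sample corpus, which covers only part of the Penn Treebank, is installed):

from collections import Counter
from nltk.corpus import propbank

patterns = Counter()
for inst in propbank.instances():
    if inst.roleset.startswith('leave.'):                 # leave.01, leave.02, ...
        labels = tuple(sorted(label for _, label in inst.arguments))
        patterns[(inst.roleset, labels)] += 1

for (roleset, labels), count in patterns.most_common():
    print(roleset, labels, count)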

33
PropBank/FrameNet
Buy:  Arg0 buyer   Arg1 goods   Arg2 seller   Arg3 rate   Arg4 payment
Sell: Arg0 seller  Arg1 goods   Arg2 buyer    Arg3 rate   Arg4 payment
Broader, more neutral, more syntactic; maps readily to VN, TR, FN (Rambow et al., PMLB03)
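As a data structure, the table is just a per-verb lookup from argument number to role description; the shared numbering is what makes the buy/sell entries line up. A tiny illustrative sketch (the dictionary below simply restates the table):

FRAMES = {
    'buy':  {'Arg0': 'buyer',  'Arg1': 'goods', 'Arg2': 'seller', 'Arg3': 'rate', 'Arg4': 'payment'},
    'sell': {'Arg0': 'seller', 'Arg1': 'goods', 'Arg2': 'buyer',  'Arg3': 'rate', 'Arg4': 'payment'},
}

def describe(verb, arg):
    return FRAMES.get(verb, {}).get(arg, 'unknown')

print(describe('buy', 'Arg2'))   # seller
print(describe('sell', 'Arg2'))  # buyer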
34
Annotator accuracy: ITA 84%
35
English lexical resource is required
  • That provides sets of possible syntactic frames
    for verbs with semantic role labels?
  • And provides clear, replicable sense
    distinctions.

36
English lexical resource is required
  • That provides sets of possible syntactic frames
    for verbs with semantic role labels
  • that can be automatically assigned accurately
    to new text?
  • And provides clear, replicable sense
    distinctions.

37
Automatic Labelling of Semantic Relations
  • Stochastic Model
  • Features
  • Predicate
  • Phrase Type
  • Parse Tree Path
  • Position (Before/after predicate)
  • Voice (active/passive)
  • Head Word

Gildea & Jurafsky, CL-02; Gildea & Palmer, ACL-02
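Most of the features listed above are simple properties of the constituent and the predicate; the distinctive one is the parse tree path. A hedged sketch of computing such a path over an NLTK tree (my own simplification of the Gildea & Jurafsky style feature, with '^' for steps up and '!' for steps down):

from nltk.tree import Tree

def tree_path(tree, from_leaf, to_leaf):
    """Category path from one leaf's parent up to the lowest common ancestor
    and back down to the other leaf's parent."""
    up = tree.leaf_treeposition(from_leaf)
    down = tree.leaf_treeposition(to_leaf)
    common = 0                                   # longest common prefix of positions
    while common < min(len(up), len(down)) and up[common] == down[common]:
        common += 1
    ups = [tree[up[:i]].label() for i in range(len(up) - 1, common - 1, -1)]
    downs = [tree[down[:i]].label() for i in range(common + 1, len(down))]
    return '^'.join(ups) + '!' + '!'.join(downs)

t = Tree.fromstring("(S (NP (NNP Owen)) (VP (VBD broke) (NP (DT the) (NN pointer))))")
print(tree_path(t, 0, 1))   # NNP^NP^S!VP!VBD: from 'Owen' up to S, down to the predicate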
38
Semantic Role Labelling Accuracy-Known Boundaries
  • Accuracy of semantic role prediction for known boundaries: the system is
    given the constituents to classify.
  • FrameNet examples (training/test) are handpicked to be unambiguous.
  • Lower performance with unknown boundaries.
  • Higher performance with traces.
  • Almost evens out.

39
Additional Automatic Role Labelers
  • Performance improved from 77% to 88% (Colorado)
  • (Gold Standard parses, < 10 instances)
  • Same features plus
  • Named Entity tags
  • Head word POS
  • For unseen verbs backoff to automatic verb
    clusters
  • SVMs
  • Role or not role
  • For each likely role, for each Arg, Arg or not
  • No overlapping role labels allowed

Pradhan et al., ICDM-03; Surdeanu et al., ACL-03; Chen & Rambow, EMNLP-03; Gildea & Hockenmaier, EMNLP-03
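Of the constraints listed above, the "no overlapping role labels" step is easy to make concrete. A minimal sketch (my own, not taken from the cited systems): once candidate constituents have been scored, keep the best-scoring spans greedily and drop any span that overlaps one already kept.

def select_non_overlapping(candidates):
    """candidates: list of (score, (start, end), label) tuples over token spans."""
    kept = []
    for score, (start, end), label in sorted(candidates, reverse=True):
        if all(end < s or start > e for _, (s, e), _ in kept):
            kept.append((score, (start, end), label))
    return [(span, label) for _, span, label in kept]

scored = [(0.9, (0, 1), 'ARG0'), (0.8, (3, 6), 'ARG1'), (0.4, (3, 4), 'ARG2')]
print(select_non_overlapping(scored))   # the (3, 4) ARG2 span overlaps (3, 6) and is dropped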
40
Additional Automatic Role Labelers
  • Performance improved from 77% to 88% (Colorado)
  • New results, original features, labels: 88%, 93% (Penn)
  • (Gold Standard parses, < 10 instances)
  • Same features plus
  • Named Entity tags
  • Head word POS
  • For unseen verbs backoff to automatic verb
    clusters
  • SVMs
  • Role or not role
  • For each likely role, for each Arg, Arg or not
  • No overlapping role labels allowed

Pradhan et al., ICDM-03; Surdeanu et al., ACL-03; Chen & Rambow, EMNLP-03; Gildea & Hockenmaier, EMNLP-03
41
Word Senses in PropBank
  • Orders to ignore word sense not feasible for 700
    verbs
  • Mary left the room
  • Mary left her daughter-in-law her pearls in her
    will
  • Frameset leave.01 "move away from"
  • Arg0 entity leaving
  • Arg1 place left
  • Frameset leave.02 "give"
  • Arg0 giver
  • Arg1 thing given
  • Arg2 beneficiary
  • How do these relate to traditional word senses in
    VerbNet and WordNet?

42
Mapping from PropBank to VerbNet
43
Mapping from PB to VerbNet
44
Mapping from PropBank to VerbNet
  • Overlap with PropBank framesets
  • 50,000 PropBank instances
  • < 50% VN entries, > 85% VN classes
  • Results
  • MATCH - 78.63% (80.90% relaxed)
  • (VerbNet isn't just linguistic theory!)
  • Benefits
  • Thematic role labels and semantic predicates
  • Can extend PropBank coverage with VerbNet classes
  • WordNet sense tags
  • Kingsbury & Kipper, NAACL-03 Text Meaning Workshop
  • http://www.cs.rochester.edu/~gildea/VerbNet/

45
WordNet as a WSD sense inventory
  • Senses unnecessarily fine-grained?
  • Word Sense Disambiguation bakeoffs
  • Senseval-1: Hector, ITA 95.5%
  • Senseval-2: WordNet 1.7, ITA verbs 71%
  • Groupings of Senseval-2 verbs, ITA 82%
  • Used syntactic and semantic criteria

46
Groupings Methodology (w/ Dang and Fellbaum)
  • Double blind groupings, adjudication
  • Syntactic Criteria (VerbNet was useful)
  • Distinct subcategorization frames
  • call him a bastard
  • call him a taxi
  • Recognizable alternations regular sense
    extensions
  • play an instrument
  • play a song
  • play a melody on an instrument

SIGLEX01, SIGLEX02, JNLE04
47
Groupings Methodology (cont.)
  • Semantic Criteria
  • Differences in semantic classes of arguments
  • Abstract/concrete, human/animal,
    animate/inanimate, different instrument types,
  • Differences in entailments
  • Change of prior entity or creation of a new
    entity?
  • Differences in types of events
  • Abstract/concrete/mental/emotional/.
  • Specialized subject domains

48
Results averaged over 28 verbs
Dang and Palmer, SIGLEX-02; Dang et al., COLING-02
MX = Maximum Entropy WSD, p(sense|context). Features: topic, syntactic constituents, semantic classes.
[Results chart not transcribed]
49
Grouping improved ITA and Maxent WSD
  • Call: 31% of errors due to confusion between senses within same group 1:
  • name, call -- (assign a specified, proper name to; "They named their son David")
  • call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard")
  • call -- (consider or regard as being; "I would not call her beautiful")
  • 75% with training and testing on grouped senses vs.
  • 43% with training and testing on fine-grained senses

50
WordNet - call, 28 senses, groups
[Diagram not fully transcribed: the 28 WordNet senses of "call" clustered into groups labeled Loud cry, Bird or animal cry, Request, Label, Call a loan/bond, Challenge, Visit, Phone/radio, and Bid; the sense clusters shown include WN5/WN16/WN12/WN15, WN26/WN3/WN19/WN4/WN7, WN8/WN9/WN1/WN22/WN20, WN25/WN18/WN27/WN2/WN13/WN6/WN23, WN28/WN17/WN11, and WN10/WN14/WN21/WN24]
52
Overlap between Groups and Framesets: 95%
[Diagram not fully transcribed, for the verb "develop": one frameset covers WN1, WN2, WN3, WN4, WN6, WN7, WN8; the other covers WN5, WN9, WN10, WN11, WN12, WN13, WN14, WN19, WN20]
Palmer, Dang & Fellbaum, NLE 2004
53
Sense Hierarchy
  • PropBank Framesets - coarse grained distinctions
  • 20 Senseval-2 verbs w/ > 1 Frameset
  • Maxent WSD system, 73.5% baseline, 90% accuracy
  • Sense Groups (Senseval-2) - intermediate level (includes Levin classes), 95% overlap, 69%
  • WordNet - fine grained distinctions, 60.2%

54
English lexical resource is available
  • That provides sets of possible syntactic frames
    for verbs with semantic role labels that can be
    automatically assigned accurately to new text.
  • And provides clear, replicable sense
    distinctions.

55
A Chinese Treebank Sentence
  • 国会/Congress 最近/recently 通过/pass 了/ASP 银行法/banking law
  • The Congress passed the banking law recently.
  • (IP (NP-SBJ (NN 国会/Congress))
  •     (VP (ADVP (ADV 最近/recently))
  •         (VP (VV 通过/pass)
  •             (AS 了/ASP)
  •             (NP-OBJ (NN 银行法/banking law)))))

56
The Same Sentence, PropBanked
  • (IP (NP-SBJ arg0 (NN 国会))
  •     (VP argM (ADVP (ADV 最近))
  •         (VP f2 (VV 通过)
  •             (AS 了)
  •             arg1 (NP-OBJ (NN 银行法)))))
  • 通过(f2) (pass)
  • arg0: 国会 (Congress)   argM: 最近 (recently)   arg1: 银行法 (law)
57
Chinese PropBank Status - (w/ Bert Xue and Scott
Cotton)
  • Create Frame File for that verb
  • Similar alternations: causative/inchoative, unexpressed object
  • 5000 lemmas, 3000 DONE (hired Jiang)
  • First pass: Automatic tagging - 2500 DONE
  • Subcat frame matcher (Xue & Kulick, MT-03)
  • Second pass: Double blind hand correction
  • In progress (includes frameset tagging), 1000 DONE
  • Ported RATS to CATS, in use since May
  • Third pass: Solomonization (adjudication)

58
A Korean Treebank Sentence
?? ??? 3 ???? ???? ??? ?? ??? ????.
He added that Renault has a deadline until the
end of March for a merger proposal.
  • (S (NP-SBJ ?/NPN?/PAU)
  • (VP (S-COMP (NP-SBJ ??/NPR?/PCA)
  • (VP (VP (NP-ADV
    3/NNU

  • ?/NNX?/NNX??/PAU)
  • (VP
    (NP-OBJ ??/NNC??/NNC

  • ??/NNC?/PCA)

  • ?/VV?/ECS))

  • ?/VX?/EFN?/PAD)
  • ???/VV?/EPF?/EFN)
  • ./SFN)

59
The same sentence, PropBanked
(S Arg0 (NP-SBJ ?/NPN?/PAU) (VP Arg2
(S-COMP ( Arg0 NP-SBJ ??/NPR?/PCA)
(VP (VP ( ArgM NP-ADV
3/NNU
?/NNX?/NNX??/PAU)
(VP ( Arg1
NP-OBJ ??/NNC??/NNC

??/NNC?/PCA)
?/VV?/ECS))
?/VX?/EFN?/PAD)
???/VV?/EPF?/EFN) ./SFN)
[Predicate-argument diagram not fully transcribed: add(Arg0: he, Arg2: "Renault has a deadline until the end of March for a merger proposal"); within the Arg2 clause, has(Arg0: Renault, ArgM: until the end of March, Arg1: a deadline for a merger proposal)]
60
PropBank II
  • Nominalizations - NYU
  • Lexical Frames - DONE
  • Event Variables (including temporals and locatives)
  • More fine-grained sense tagging
  • Tagging nominalizations w/ WordNet sense
  • Selected verbs and nouns
  • Nominal Coreference - not names
  • Clausal Discourse connectives - selected subset

61
PropBank II
Event variables
nominal reference
  • Also, [Arg0 substantially lower Dutch corporate tax rates] helped [Arg1 [Arg0 the company] keep [Arg1 its tax outlay] [Arg3-PRD flat] [ArgM-ADV relative to earnings growth]].

[Diagram not fully transcribed: help(Arg0: tax rates, REL: help, Arg1: "the company keep its tax outlay flat", ArgM-ADV: relative to earnings) and, nested inside, keep(Arg0: the company, REL: keep, Arg1: its tax outlay, Arg3-PRD: flat)]
62
Summary
  • Shallow semantic annotation that captures
    critical dependencies and semantic role labels
  • Supports training of supervised automatic taggers
  • Methodology ports readily to other languages
  • English PropBank release spring 2004
  • Chinese PropBank release fall 2004
  • Korean PropBank release summer 2005

63
Word sense in Machine Translation
  • Different syntactic frames
  • John left the room
  • Juan saiu do quarto. (Portuguese)
  • John left the book on the table.
  • Juan deixou o livro na mesa.
  • Same syntactic frame?
  • John left a fortune.
  • Juan deixou uma fortuna.

64
Summary of Multilingual TreeBanks, PropBanks
65
Levin class escape-51.1-1
  • WordNet Senses: WN 1, 5, 8
  • Thematic Roles: Location[+concrete], Theme[+concrete]
  • Frames with Semantics
  • Basic Intransitive: "The convict escaped"
    motion(during(E), Theme), direction(during(E), Prep, Theme, Location)
  • Intransitive (+ path PP): "The convict escaped from the prison"
  • Locative Preposition Drop: "The convict escaped the prison"
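These class entries are distributed in XML, so the same information can be pulled up programmatically. A minimal sketch using NLTK's VerbNet interface (my addition, not part of the slides; assumes the verbnet data is installed and that "escape" is still listed under a 51.1 class in the bundled VerbNet version):

from nltk.corpus import verbnet

for classid in verbnet.classids('escape'):                 # e.g. escape-51.1-1
    vnclass = verbnet.vnclass(classid)
    roles = [r.attrib['type'] for r in vnclass.findall('THEMROLES/THEMROLE')]
    print(classid, roles)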

66
Levin class future_having-13.3
  • WordNet Senses: WN 2, 10, 13
  • Thematic Roles: Agent[+animate OR +organization], Recipient[+animate OR +organization], Theme
  • Frames with Semantics
  • Dative: "I promised somebody my time" Agent V Recipient Theme
    has_possession(start(E), Agent, Theme), future_possession(end(E), Recipient, Theme), cause(Agent, E)
  • Transitive (+ Recipient PP): "We offered our paycheck to her" Agent V Theme Prep(to) Recipient
  • Transitive (Theme Object): "I promised my house (to somebody)" Agent V Theme

67
Automatic classification
  • Merlo & Stevenson automatically classified 59 verbs with 69.8% accuracy
  • 1. unergative, 2. unaccusative, 3. object-drop
  • 100M words automatically parsed
  • C5.0, using features: transitivity, causativity, animacy, voice, POS
  • EM clustering: 61%, 2669 instances, 1M words
  • Using Gold Standard semantic role labels
  • 1. float hop/hope jump march leap
  • 2. change clear collapse cool crack open flood
  • 3. borrow clean inherit reap organize study
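The classification setup described above is a standard supervised learning problem: each verb becomes a feature vector of corpus statistics and the classifier predicts its class. A hedged sketch with a decision tree (the feature values below are invented for illustration, not Merlo & Stevenson's data, and scikit-learn's tree stands in for C5.0):

from sklearn.tree import DecisionTreeClassifier

# per-verb features: [transitive use, causative use, animate subject, passive use] (ratios)
X = [[0.1, 0.0, 0.9, 0.0],    # 'jump'  - unergative
     [0.4, 0.6, 0.2, 0.1],    # 'open'  - unaccusative
     [0.8, 0.0, 0.9, 0.3]]    # 'study' - object-drop
y = ['unergative', 'unaccusative', 'object-drop']

clf = DecisionTreeClassifier(random_state=0).fit(X, y)
print(clf.predict([[0.2, 0.0, 0.8, 0.0]]))   # classify an unseen verb's feature vector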

68
SENSEVAL Word Sense Disambiguation Evaluation
DARPA-style bakeoff: training data, testing data, scoring algorithm.
NLE99, CHUM01, NLE02, NLE03
69
Maximum Entropy WSD (Hoa Dang), best performer on Verbs
  • Maximum entropy framework, p(sense|context)
  • Contextual Linguistic Features
  • Topical feature for W:
  • keywords (determined automatically)
  • Local syntactic features for W:
  • presence of subject, complements, passive?
  • words in subject, complement positions, particles, preps, etc.
  • Local semantic features for W:
  • Semantic class info from WordNet (synsets, etc.)
  • Named Entity tag (PERSON, LOCATION, ...) for proper Ns
  • words within +/- 2 word window
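The p(sense|context) formulation above is a conditional maximum-entropy model over exactly these feature types. A hedged sketch (multinomial logistic regression in scikit-learn stands in for the MaxEnt learner; the toy feature strings and sense labels are invented for illustration, not the system's real feature set):

from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

train = [
    ({'topic:police': 1, 'obj_head:police': 1, 'has_obj': 1}, 'call.request'),
    ({'topic:film': 1, 'obj_head:movie': 1, 'pred_adj:successful': 1}, 'call.label'),
    ({'topic:phone': 1, 'particle:up': 1, 'has_obj': 1}, 'call.phone'),
]
contexts, senses = zip(*train)

vec = DictVectorizer()
clf = LogisticRegression(max_iter=1000)                 # maximum entropy = multinomial logit
clf.fit(vec.fit_transform(contexts), senses)

test = {'topic:film': 1, 'obj_head:movie': 1}
print(clf.predict(vec.transform([test]))[0])            # most probable sense given the context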

70
Best Verb Performance - Maxent-WSD Hoa Dang
MX = Maximum Entropy WSD, p(sense|context). Features: topic, syntactic constituents, semantic classes.
[Results chart not transcribed]
Dang and Palmer, SIGLEX-02; Dang et al., COLING-02
71
Role Labels & Framesets as features for WSD
  • Preliminary results (Jinying Chen)
  • Gold Standard PropBank annotation
  • Decision Tree C5.0
  • Groups
  • 5 verbs
  • Features: Frameset tags, Arg labels
  • Comparable results to Maxent with PropBank features
  • Syntactic frames and sense distinctions are inseparable

72
Lexical resources provide concrete criteria for
sense distinctions
  • PropBank: coarse grained sense distinctions determined by different subcategorization frames (Framesets)
  • Intersective Levin classes: regular sense extensions through differing syntactic constructions
  • VerbNet: distinct semantic predicates for each sense (verb class)

Are these the right distinctions?
73
Results averaged over 28 verbs