Computing Textual Inferences

1
Computing Textual Inferences
  • Cleo Condoravdi
  • Palo Alto Research Center
  • Georgetown University
  • Halloween, 2008

2
Overview
  • Motivation
  • Local Textual Inference
  • Textual Inference Initiatives and refinements
  • PARC's BRIDGE system
  • XLE
  • Abstract Knowledge Representation (AKR)
  • Conceptual and temporal structure
  • Contextual structure and instantiability
  • Semantic relations
  • Entailments and presuppositions
  • Relative polarity
  • Entailment and Contradiction (ECD)
  • (Interaction with external temporal reasoner)

3
Access to content: existential claims. What happened? Who did what to whom?
  • Microsoft managed to buy Powerset.
  • ⇒ Microsoft acquired Powerset.
  • Shackleton failed to get to the South Pole.
  • ⇒ Shackleton did not reach the South Pole.
  • The destruction of the file was not illegal.
  • ⇒ The file was destroyed.
  • The destruction of the file was averted.
  • ⇒ The file was not destroyed.

4
Access to content: monotonicity. What happened? Who did what to whom?
  • Every boy managed to buy a small toy.
  • ⇒ Every small boy acquired a toy.
  • Every explorer failed to get to the South Pole.
  • ⇒ No experienced explorer reached the South Pole.
  • No file was destroyed.
  • ⇒ No sensitive file was destroyed.
  • The destruction of a sensitive file was averted.
  • ⇒ A file was not destroyed.

5
Access to content: temporal domain. What happened when?
  • Ed visited us every day last week.
  • ⇒ Ed visited us on Monday last week.
  • Ed has been living in Athens for 3 years.
  • Mary visited Athens in the last 2 years.
  • ⇒ Mary visited Athens while Ed lived in Athens.
  • The deal lasted through August, until just before the government took over Freddie. (NYT, Oct. 5, 2008)
  • ⇒ The government took over Freddie after August.

6
Grammatical analysis for access to content
  • Identify Microsoft as the buyer argument of
    the verb buy
  • Identify Shackleton as the traveler argument of
    the verb get to
  • Identify lexical relation between destroy and
    destruction
  • Identify syntactic relation between verbal
    predication destroy the file and nominal
    predication destruction of the file
  • Identify the infinitival clause as an argument of
    manage and fail
  • Identify the noun phrases that "every", "a", "no" combine with
  • Identify the phrases "every day", "on Monday", "last week" as modifiers of "visit"
  • Identify "has been living" as a present progressive

7
Knowledge about words for access to content
  • The verb acquire is a hypernym of the verb
    buy
  • The verbs get to and reach are synonyms
  • Inferential properties of manage, fail,
    avert, not
  • Monotonicity properties of every, a, no,
    not
  • Restrictive behavior of adjectival modifiers
    small, experienced, sensitive
  • The type of temporal modifiers associated with
    prepositional phrases headed by in, for,
    on, or even nothing (e.g. last week, every
    day)

8
Toward NL Understanding
  • Local Textual Inference
  • A measure of understanding a text is the ability
    to make inferences based on the information
    conveyed by it. We can test understanding by
    asking questions about the text.
  • Veridicality reasoning
  • Did an event mentioned in the text actually
    occur?
  • Temporal reasoning
  • When did an event happen? How are events ordered
    in time?
  • Spatial reasoning
  • Where are entities located and along which paths
    do they move?
  • Causality reasoning
  • Enablement, causation, prevention relations
    between events

9
Local Textual Inference
  • PASCAL RTE Challenge (Ido Dagan, Oren Glickman)
    2005, 2006
  • PREMISE
  • CONCLUSION
  • TRUE/FALSE
  • Rome is in Lazio province and Naples is in
    Campania.
  • Rome is located in Lazio province.
  • TRUE (= entailed by the premise)
  • Romano Prodi will meet the US President George
    Bush in his capacity as the president of the
    European commission.
  • George Bush is the president of the European
    commission.
  • FALSE (= not entailed by the premise)

10
PARC Entailment and Contradiction Detection (ECD)
  • Text: Kim hopped.
  • Hypothesis: Someone moved.
  • Answer: TRUE
  • Text: Sandy touched Kim.
  • Hypothesis: Sandy kissed Kim.
  • Answer: UNKNOWN
  • Text: Sandy kissed Kim.
  • Hypothesis: No one touched Kim.
  • Answer: NO
  • Text: Sandy didn't wait to kiss Kim.
  • Hypothesis: Sandy kissed Kim.
  • Answer: AMBIGUOUS

11
Linguistic meaning vs. speaker meaning
  • Not a pre-theoretic but rather a theory-dependent
    distinction
  • Multiple readings
  • ambiguity of meaning?
  • single meaning plus pragmatic factors?
  • The diplomat talked to most victims
  • The diplomat did not talk to all victims
  • UNKNOWN / YES

I don't know which.
  • You can have the cake or the fruit.
  • You can have the fruit.
  • YES / UNKNOWN
12
World Knowledge
  • Romano Prodi will meet the US President George
    Bush in his capacity as the president of the
    European commission.
  • George Bush is the president of the European
    commission.
  • FALSE (= not entailed by the premise on the correct anaphoric resolution)
  • G. Karas will meet F. Rakas in his capacity as
    the president of the European commission.
  • F. Rakas is the president of the European
    commission.
  • TRUE (= entailed by the premise on one anaphoric resolution)

13
Recognizing textual entailments
  • Monotonicity Calculus, Polarity, Semantic
    Relations
  • Much of language-oriented reasoning is tied to
    specific words, word classes and grammatical
    features
  • Class 1: "fail", "refuse", "not", ...
  • Class 2: "manage", "succeed", ...
  • Tenses, progressive -ing form, ...
  • Representation and inferential properties of modifiers of different kinds
  • "throughout July" vs. "in July"
  • "for the last three years" vs. "in the last three years"
  • "sleep three hours" -- duration
  • "sleep three times" -- cardinality

14
XLE Pipeline
  • Mostly symbolic system
  • Ambiguity-enabled through packed representation
    of analyses
  • Filtering of dispreferred/improbable analyses is
    possible
  • OT marks
  • mostly on c-/f-structure pairs, but also on
    c-structures
  • on semantic representations for selectional
    preferences
  • Statistical models
  • PCFG-based pruning of the chart of possible
    c-structures
  • Log-linear model that selects n-best
    c-/f-structure pairs

[Diagram: morphological analyses → (c-structure OT marks, PCFG-based chart pruning) → c-structures → (general OT marks, log-linear model) → c-/f-structure pairs]
15
Ambiguity is rampant in language
  • Alternatives multiply within and across layers

[Diagram: alternatives multiply within and across layers: C-structure → F-structure → Linguistic Sem → Abstract KR → KRR]
16
What not to do
  • Use heuristics to prune as soon as possible

[Diagram: heuristic pruning (X) applied at each layer: C-structure → F-structure → Linguistic sem → Abstract KR → KRR]
Fast computation, wrong result
17
Manage ambiguity instead
How many sheep? How many fish?
The sheep liked the fish.
  • Packed representation
  • Encodes all dependencies without loss of
    information
  • Common items represented, computed once
  • Key to practical efficiency with broad-coverage
    grammars

18
System Overview
19
XLE Pipeline
Process → Output
Text-Breaking → Delimited Sentences
NE recognition → Type-marked Entities (names, dates, etc.)
Morphological Analysis → Word stems and features
LFG parsing → Functional Representation
Semantic Processing → Scope, Predicate-argument structure
AKR Rules → Abstract Knowledge Representation
Alignment → Aligned T-H Concepts and Contexts
Entailment and Contradiction Detection → YES / NO / UNKNOWN
20
XLE System Architecture: Text → AKR
  • Parse text to f-structures
  • Constituent structure
  • Represent syntactic/semantic features (e.g.
    tense, number)
  • Localize arguments (e.g. long-distance
    dependencies, control)
  • Rewrite f-structures to AKR clauses
  • Collapse syntactic alternations (e.g.
    active-passive)
  • Flatten embedded linguistic structure to clausal
    form
  • Map to concepts and roles in some ontology
  • Represent intensionality, scope, temporal
    relations
  • Capture commitments of existence/occurrence

21
AKR representation
A collection of statements
22
F-structures vs. AKR
  • Nested structure of f-structures vs. flat AKR
  • F-structures make syntactically, rather than
    conceptually, motivated distinctions
  • Syntactic distinctions canonicalized away in AKR
  • Verbal predications and the corresponding
    nominalizations or deverbal adjectives with no
    essential meaning differences
  • Arguments and adjuncts map to roles
  • Distinctions of semantic importance are not
    encoded in f-structures
  • Word senses
  • Sentential modifiers can be scope taking
    (negation, modals, allegedly, predictably)
  • Tense vs. temporal reference
  • Nonfinite clauses have no tense but they do have
    temporal reference
  • Tense in embedded clauses can be past but
    temporal reference is to the future

23
F-Structure to AKR Mapping
  • Input F-structures
  • Output clausal, abstract KR
  • Mechanism packed term rewriting
  • Rewriting system controls
  • lookup of external ontologies via Unified
    Lexicon
  • compositionally-driven transformation to AKR
  • Transformations
  • Map words to Wordnet synsets
  • Canonicalize semantically equivalent but formally
    distinct representations
  • Make conceptual intensional structure explicit
  • Represent semantic contribution of particular
    constructions
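The transformation steps above can be sketched as a toy rewriting pass. This is a minimal illustration only: the dict-based f-structure, the helper name, and the hard-coded sense and role labels are invented here, whereas the real system uses XLE's packed term rewriting and the Unified Lexicon.

```python
# Toy sketch (not BRIDGE code): rewrite a miniature f-structure for "Ed talked"
# into flat AKR-style clauses like those shown on the Conceptual Structure slide.

def fstructure_to_akr(fs):
    """Rewrite a toy f-structure into a list of AKR-style clauses."""
    verb = fs["PRED"]                   # e.g. "talk"
    subj = fs["SUBJ"]["PRED"]           # e.g. "Ed"
    verb_term, subj_term = f"{verb}4", f"{subj}1"
    return [
        f"subconcept({verb_term},{verb}-1)",      # word mapped to a WordNet-style sense
        f"role(Actor,{verb_term},{subj_term})",   # grammatical subject mapped to a role
        f"subconcept({subj_term},male-2)",
        f"alias({subj_term},{subj})",
        "context(t)",
        f"instantiable({verb_term},t)",           # commitment that the event occurred
    ]

for clause in fstructure_to_akr({"PRED": "talk", "SUBJ": {"PRED": "Ed"}}):
    print(clause)
```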

25
Basic structure of AKR
  • Conceptual Structure
  • Predicate-argument structures
  • Sense disambiguation
  • Associating roles to arguments and modifiers
  • Contextual Structure
  • Clausal complements
  • Negation
  • Sentential modifiers
  • Temporal Structure
  • Representation of temporal expressions
  • Tense, aspect, temporal modifiers

26
Ambiguity management with choice spaces
27
Conceptual Structure
  • Captures basic predicate-argument structures
  • Maps words to WordNet synsets
  • Assigns VerbNet roles

subconcept(talk4,talk-1,talk-2,speak-3,spill-5,spill_the_beans-1,lecture-1)
role(Actor,talk4,Ed1)
subconcept(Ed1,male-2)
alias(Ed1,Ed)
role(cardinality_restriction,Ed1,sg)
  • Shared by "Ed talked", "Ed did not talk" and "Bill will say that Ed talked".

28
Temporal Structure
  • Matrix vs. embedded tense
  • temporalRel(startsAfterEndingOf,Now,talk6)

Shared by "Ed talked." and "Ed did not talk."
29
Canonicalization in conceptual structure
  • subconcept(tour13,tour-1)
  • role(Theme,tour13,John1)
  • role(Location,tour13,Europe21)
  • subconcept(Europe21,location-1)
  • alias(Europe21,Europe)
  • role(cardinality_restriction,Europe21,sg)
  • subconcept(John1,male-2)
  • alias(John1,John)
  • role(cardinality_restriction,John1,sg)
  • subconcept(travel6,travel-1,travel-2,travel-3,travel-4,travel-5,travel-6)
  • role(Theme,travel6,John1)
  • role(Location,travel6,Europe22)
  • subconcept(Europe22,location-1)
  • alias(Europe22,Europe)
  • role(cardinality_restriction,Europe22,sg)
  • subconcept(John1,male-2)
  • alias(John1,John)
  • role(cardinality_restriction,John1,sg)

John took a tour of Europe.
John traveled around Europe.
30
Contextual Structure
  • Use of contexts enables flat representations
  • Contexts as arguments of embedding predicates
  • Contexts as scope markers
  • context(t)
  • context(ctx(talk29))
  • context(ctx(want19))
  • top_context(t)
  • context_relation(t,ctx(want19),crel(Topic,say6))
  • context_relation(ctx(want19),ctx(talk29),crel(Theme,want19))

Bill said that Ed wanted to talk.
31
Concepts and Contexts
  • Concepts live outside of contexts.
  • Still we want to tie the information about
    concepts to the contexts they relate to.
  • Existential commitments
  • Did something happen?
  • e.g. Did Ed talk? Did Ed talk according to Bill?
  • Does something exist?
  • e.g. There is a cat in the yard. There is no cat
    in the yard.

32
Instantiability
  • An instantiability assertion of a
    concept-denoting term in a context implies the
    existence of an instance of that concept in that
    context.
  • An uninstantiability assertion of a
    concept-denoting term in a context implies there
    is no instance of that concept in that context.
  • If the denoted concept is of type event, then
    existence/nonexistence corresponds to truth or
    falsity.

33
Negation
Ed did not talk
  • Contextual structure
  • context(t)
  • context(ctx(talk12))  -- new context triggered by negation
  • context_relation(t, ctx(talk12), not8)
  • antiveridical(t, ctx(talk12))  -- interpretation of negation
  • Local and lifted instantiability assertions
  • instantiable(talk12, ctx(talk12))
  • uninstantiable(talk12, t)  -- entailment of negation

34
Relations between contexts
  • Generalized entailment: veridical
  • If c2 is veridical with respect to c1, the information in c2 is part of the information in c1
  • Lifting rule: instantiable(Sk, c2) ⇒ instantiable(Sk, c1)
  • Inconsistency: antiveridical
  • If c2 is antiveridical with respect to c1, the information in c2 is incompatible with the information in c1
  • Lifting rule: instantiable(Sk, c2) ⇒ uninstantiable(Sk, c1)
  • Consistency: averidical
  • If c2 is averidical with respect to c1, the information in c2 is compatible with the information in c1
  • No lifting rule between contexts
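The lifting rules above can be sketched as a small fixed-point computation. The tuple encoding of facts and relations is invented for illustration; only the veridical/antiveridical/averidical behavior follows the slide.

```python
# Sketch of the lifting rules (hypothetical data structures, not BRIDGE code):
# veridical:     instantiable(Sk, c2)  =>  instantiable(Sk, c1)
# antiveridical: instantiable(Sk, c2)  =>  uninstantiable(Sk, c1)
# averidical:    nothing is lifted

def lift(facts, relations):
    """Propagate (un)instantiability from embedded contexts to their parents."""
    derived = set(facts)
    changed = True
    while changed:                      # iterate to a fixed point for nested contexts
        changed = False
        for (c1, c2, rel) in relations:
            for (status, sk, ctx) in list(derived):
                if ctx != c2 or status != "instantiable":
                    continue
                if rel == "veridical":
                    new = ("instantiable", sk, c1)
                elif rel == "antiveridical":
                    new = ("uninstantiable", sk, c1)
                else:                   # averidical: compatible, nothing lifted
                    continue
                if new not in derived:
                    derived.add(new)
                    changed = True
    return derived

# "Ed did not talk": talk is instantiable in its own context,
# which is antiveridical with respect to the top context t.
facts = {("instantiable", "talk12", "ctx(talk12)")}
relations = [("t", "ctx(talk12)", "antiveridical")]
print(lift(facts, relations))   # includes ("uninstantiable", "talk12", "t")
```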

35
Determinants of context relations
  • Relation depends on complex interaction of
  • Concepts
  • Lexical entailment class
  • Syntactic environment
  • Example
  • 1. He didn't remember to close the window.
  • 2. He doesn't remember that he closed the window.
  • 3. He doesn't remember whether he closed the window.
  • He closed the window.
  • Contradicted by 1
  • Implied by 2
  • Consistent with 3

36
Embedded clauses
  • The problem is to infer whether an embedded event
    is instantiable or uninstantiable on the top
    level.
  • It is surprising that there are no WMDs in Iraq.
  • It has been shown that there are no WMDs in Iraq.
  • ⇒ There are no WMDs in Iraq.

37
Embedded examples in real text
  • From Google
  • Song, Seoul's point man, did not forget to
    persuade the North Koreans to make a strategic
    choice of returning to the bargaining table...

Song persuaded the North Koreans
The North Koreans made a strategic choice
38
Semantic relations
  • Presupposition (Factive verbs, Implicative verbs)
  • It is surprising that there are no WMDs in Iraq.
  • It is not surprising that there are no WMDs in
    Iraq.
  • Is it surprising that there are no WMDs in Iraq?
  • If it is surprising that there are no WMDs in
    Iraq, it is because we had good reasons to think
    otherwise.
  • Entailment (Implicative verbs)
  • It has been shown that there are no WMDs in Iraq.
  • It has not been shown that there are no WMDs in
    Iraq.
  • Has it been shown that there are no WMDs in Iraq?
  • If it has been shown that there are no WMDs in
    Iraq, the war has turned out to be a mistake.

39
Factives
Class → Inference Pattern
+/+ forget that → forget that X ⇒ X; not forget that X ⇒ X
-/- pretend that → pretend that X ⇒ not X; not pretend that X ⇒ not X
(Class notation: entailment under positive / negative embedding)
40
Implicatives
Class → Inference Pattern
++/-- manage to → manage to X ⇒ X; not manage to X ⇒ not X
+-/-+ fail to → fail to X ⇒ not X; not fail to X ⇒ X
(two-way implicatives)
++ force to → force X to Y ⇒ Y
+- prevent from → prevent X from Y-ing ⇒ not Y
-- be able to → not be able to X ⇒ not X
-+ hesitate to → not hesitate to X ⇒ X
(one-way implicatives)
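The signatures in the table can be read off mechanically. A minimal sketch, with the class notation encoded as a lookup from the polarity of the embedding to the entailed polarity of the complement (one-way verbs simply lack one entry):

```python
# Sketch of the implicative signatures (two-way classes are written
# positive/negative): "++" = positive embedding entails positive complement, etc.
SIGNATURES = {
    "manage":   {"+": "+", "-": "-"},   # ++/-- manage to
    "fail":     {"+": "-", "-": "+"},   # +-/-+ fail to
    "force":    {"+": "+"},             # ++ force to (one-way)
    "prevent":  {"+": "-"},             # +- prevent from
    "be_able":  {"-": "-"},             # -- be able to
    "hesitate": {"-": "+"},             # -+ hesitate to
}

def complement_polarity(verb, embedding_polarity):
    """Entailed polarity of the complement, or None (no entailment)."""
    return SIGNATURES.get(verb, {}).get(embedding_polarity)

assert complement_polarity("manage", "+") == "+"   # manage to X => X
assert complement_polarity("fail", "-") == "+"     # not fail to X => X
assert complement_polarity("force", "-") is None   # one-way: no entailment
```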
41
Implicatives under Factives
  • It is surprising that Bush dared to lie.

Bush lied.
It is not surprising that Bush dared to lie.
42
Phrasal Implicatives
43
Conditional verb classes
Two-way implicative with character nouns
  • Joe had the chutzpah to steal the money. ⇒ Joe stole the money.

character noun (gall, gumption, audacity)
44
Relative Polarity
  • Veridicality relations between contexts
    determined on the basis of a recursive
    calculation of the relative polarity of a given
    embedded context
  • Globality: the polarity of any context depends on the sequence of potential polarity switches stretching back to the top context
  • Top-down: each complement-taking verb or other clausal modifier, based on its parent context's polarity, either switches, preserves or simply sets the polarity for its embedded context

45
Example polarity propagation
  • Ed did not forget to force Dave to leave.
  • Dave left.

46

[Tree diagram: "not" marks its complement "forget" with polarity -, "forget" marks its complement "force" with polarity +, and "force" marks its complement "leave" with polarity +; "Ed" is the subject of "forget" and "force", "Dave" the object of "force" and the subject of "leave"]
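The propagation on this slide can be sketched as a left-to-right fold over the chain of embedding operators, assuming the implicative signature of "forget to" (forget to X ⇒ not X; not forget to X ⇒ X) and "force to" (force X to Y ⇒ Y):

```python
# Top-down polarity propagation for "Ed did not forget to force Dave to leave."
# (a sketch; "not" switches polarity, the verbs carry implicative signatures)
SIGNATURES = {
    "not":    {"+": "-", "-": "+"},     # polarity switch
    "forget": {"+": "-", "-": "+"},     # forget to X => not X; not forget to X => X
    "force":  {"+": "+"},               # force X to Y => Y (one-way)
}

def propagate(chain, polarity="+"):
    """Thread polarity through a chain of embedding operators; None = no entailment."""
    for op in chain:
        polarity = SIGNATURES.get(op, {}).get(polarity)
        if polarity is None:
            return None
    return polarity

print(propagate(["not", "forget", "force"]))   # '+'  =>  "Dave left." is entailed
```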
47
Summary of basic structure of AKR
  • Conceptual Structure
  • Terms representing types of individuals and
    events, linked to WordNet synonym sets by
    subconcept declarations.
  • Concepts typically have roles associated with
    them.
  • Ambiguity is encoded in a space of alternative
    choices.
  • Contextual Structure
  • t is the top-level context; some contexts are headed by an event term
  • Clausal complements, negation and sentential
    modifiers also introduce contexts.
  • Contexts can be related in various ways such as
    veridicality.
  • Instantiability declarations link concepts to
    contexts.
  • Temporal Structure
  • Locating events in time.
  • Temporal relations between events.

48
ECD
  • ECD operates on the AKRs of the passage and of
    the hypothesis
  • ECD operates on packed AKRs, hence no
    disambiguation is required for entailment and
    contradiction detection
  • If one analysis of the passage entails one analysis of the hypothesis and another analysis of the passage contradicts some other analysis of the hypothesis, the answer returned is AMBIGUOUS
  • Else, if one analysis of the passage entails one analysis of the hypothesis, the answer returned is YES
  • Else, if one analysis of the passage contradicts one analysis of the hypothesis, the answer returned is NO
  • Else, the answer returned is UNKNOWN
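The decision rule above amounts to a small function over the set of relations found across pairs of packed analyses (a sketch; detecting the relations themselves is the hard part of ECD):

```python
# Sketch of the ECD verdict rule: "relations" collects every 'entails'/
# 'contradicts' result found across passage/hypothesis analysis pairs.

def ecd_verdict(relations):
    """Map the set of detected relations to the four-way ECD answer."""
    if "entails" in relations and "contradicts" in relations:
        return "AMBIGUOUS"
    if "entails" in relations:
        return "YES"
    if "contradicts" in relations:
        return "NO"
    return "UNKNOWN"

assert ecd_verdict({"entails"}) == "YES"
assert ecd_verdict({"entails", "contradicts"}) == "AMBIGUOUS"
assert ecd_verdict(set()) == "UNKNOWN"
```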

49
AKR (Abstract Knowledge Representation)
50
More specific entails less specific
51
How ECD works
Text: Kim hopped.    Hypothesis: Someone moved.
  • Step 1: Alignment
  • Step 2: Specificity computation
  • Step 3: Elimination of H facts that are entailed by T facts
52
Alignment and specificity computation
Text: Every boy saw a small cat.    Hypothesis: Every small boy saw a cat.
  • Step 1: Alignment
  • Step 2: Specificity computation
  • Monotonicity marking: Every (↓) (↑), Some (↑) (↑)
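The quantifier markings can be sketched as a monotonicity table: "every" is downward monotone in its restrictor and upward in its scope, while "some" is upward in both. A toy check of when a substitution preserves entailment (names and encoding invented here):

```python
# Sketch: monotonicity signatures for quantifier argument positions
# (position 0 = restrictor noun, position 1 = scope/verb phrase).
MONOTONICITY = {"every": ("down", "up"), "some": ("up", "up"), "no": ("down", "down")}

def substitution_ok(quantifier, position, direction):
    """Does substituting in this position preserve entailment?
    direction of the substitute relative to the original term:
    'more_specific' (e.g. boy -> small boy) or 'less_specific'."""
    mono = MONOTONICITY[quantifier][position]
    return (mono == "down" and direction == "more_specific") or \
           (mono == "up" and direction == "less_specific")

# "Every boy saw a small cat" => "Every small boy saw a cat":
assert substitution_ok("every", 0, "more_specific")   # boy -> small boy: ok
assert substitution_ok("every", 1, "less_specific")   # small cat -> cat: ok
assert not substitution_ok("some", 0, "more_specific")
```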
53
Elimination of entailed terms
Context
54
Contradiction: instantiable vs. uninstantiable
55
Stages of ECD
  • 1. WordNet and Alias alignment for (un)instantiable concepts in conclusion
  • 1a. Returns <, =, > depending on hyperlists of terms
  • 1b. Returns <, =, > depending on theory of names (assuming 1a matched)
  • 2. Make extra top contexts for special cases, e.g. making the head of a question (below) interrogative a top_context
  • 3. Context alignment
  • Any top context in conclusion aligns with any top context in premise
  • Any non-top_context in conclusion aligns with any non-top_context in premise if their context_heads align in stage 1
  • 4. paired_roles are saved (roles with the same role name in premise and conclusion on aligned concepts)

56
Stages of ECD
  • 6. Unpaired roles in premise and conclusion (both) make concepts not align.
  • 7. Cardinality restrictions on concepts are checked and modify alignment direction (including dropping inconsistent alignments)
  • 8. Paired roles are checked to see how their value specificity affects alignment
  • 9. Temporal modifiers are used to modify alignment
  • 10. Instantiable concepts in the conclusion are removed if there is a more specific concept instantiable in an aligned context in the premise.
  • 11. Conversely for uninstantiable
  • 12. Contradiction checked (instantiable in premise and uninstantiable in conclusion, and vice versa)

57
AKR modifications
[Diagram: the P-AKR is augmented, normalized and simplified into AKR0, which is then matched against the Q-AKR]
  • augment: Oswald killed Kennedy ⇒ Kennedy died.
  • normalize: The situation improved. ⇒ The situation became better.
  • simplify: Kim managed to hop. ⇒ Kim hopped.
58
From temporal modifiers to temporal relations
  • Inventory of temporal relations the Allen
    relations plus certain disjunctions thereof
  • Recognize the type of temporal modifier
  • e.g. bare modifiers, "in" PPs, "for" PPs
  • Ed visited us Monday/that week/every day.
  • Ed slept the last two hours.
  • Ed will arrive a day from/after tomorrow.
  • Represent the interval specified in the temporal
    modifier
  • Locate intervals designated by temporal
    expressions on time axis
  • Determine qualitative relations among time
    intervals

59
Interpretation of temporal expressions
  • Compositional make-up determines qualitative
    relations
  • Relative ordering can be all a sentence specifies
  • Reference of calendrical expressions depends on
    interpretation of tense
  • Two different computations
  • Determine qualitative relations among time
    intervals
  • Locate intervals designated by temporal
    expressions on time axis
  • Infer relations not explicitly mentioned in text
  • Some through simple transitive closure
  • Others require world/domain knowledge

60
Temporal modification under negation and
quantification
  • Temporal modifiers affect monotonicity-based
    inferences
  • Everyone arrived in the first week of July 2000.
  • Everyone arrived in July 2000.
  • YES
  • No one arrived in July 2000.
  • No one arrived in the first week of July 2000.
  • YES
  • Everyone stayed throughout the concert.
  • Everyone stayed throughout the first part of the
    concert.
  • YES
  • No one stayed throughout the concert.
  • No one stayed throughout the first part of the
    concert.
  • UNKNOWN

61
Quantified modifiers and monotonicity
  • Many inference patterns do not depend on
    calendrical anchoring but on basic monotonicity
    properties
  • Monotonicity-based inferences depend on implicit
    dependencies being represented
  • Last year, in September, he visited us every day.
  • Last year he visited us every day.
  • UNKNOWN
  • Last year he visited us every day.
  • Last year he visited us every day in September.
  • YES
  • Every boy bought a toy from Ed.
  • Every boy bought a toy.
  • YES
  • Every boy bought a toy.
  • Every boy bought a toy from Ed.
  • UNKNOWN

62
Allen Interval Relations
Relation → Interpretation
X < Y, Y > X → X takes place before Y
X m Y, Y mi X → X meets Y ("i" stands for inverse)
X o Y, Y oi X → X overlaps Y
X s Y, Y si X → X starts Y
X d Y, Y di X → X during Y
X f Y, Y fi X → X finishes Y
X = Y → X is equal to Y (X is cotemporal with Y)
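As a sketch, the basic Allen relations can be computed for intervals given as (start, end) pairs with start < end (the function name and encoding are invented for illustration):

```python
# A sketch of classifying the Allen relation of interval x to interval y,
# each interval a (start, end) pair with start < end.

def allen_relation(x, y):
    """Return the Allen relation symbol of x relative to y."""
    xs, xe = x
    ys, ye = y
    if xe < ys: return "<"                          # x before y
    if xe == ys: return "m"                         # x meets y
    if xs == ys and xe == ye: return "="            # cotemporal
    if xs == ys: return "s" if xe < ye else "si"    # x starts y (or inverse)
    if xe == ye: return "f" if xs > ys else "fi"    # x finishes y (or inverse)
    if ys < xs and xe < ye: return "d"              # x during y
    if xs < ys and ye < xe: return "di"
    if xs < ys and ys < xe and xe < ye: return "o"  # x overlaps y
    if ys < xs and xs < ye and ye < xe: return "oi"
    return ">" if xs > ye else "mi"                 # x after y / y meets x

assert allen_relation((1, 2), (3, 4)) == "<"
assert allen_relation((1, 3), (3, 4)) == "m"
assert allen_relation((2, 3), (1, 4)) == "d"
assert allen_relation((1, 3), (2, 4)) == "o"
```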
63
Qualitative relations of intervals and events
[Diagram: an event interval relative to NOW, illustrating the "within" and "throughout" relations and the interval's left and right boundaries]
Determining the relevant interval
Determining the relation between interval and event
Taking negation and quantification into consideration
64
From language to qualitative relations of
intervals and events
[Diagram: Ed's living in Athens and Mary's visit to Athens as intervals relative to NOW, with "within", "throughout" and left/right boundaries]
Ed has been living in Athens for 3 years. Mary visited Athens in the last 2 years. ⇒ Mary visited Athens while Ed lived in Athens.
65
From English to AKR
  • Ed has been living in Athens for 3 years.
  • trole(duration,extended_now13,interval_size(3,year17))
  • trole(when,extended_now13,interval(finalOverlap,Now))
  • trole(when,live3,interval(includes,extended_now13))
  • Mary visited Athens in the last 2 years.
  • trole(duration,extended_now10,interval_size(2,year11))
  • trole(when,extended_now10,interval(finalOverlap,Now))
  • trole(when,visit2,interval(included_in,extended_now10))
  • Mary visited Athens while Ed lived in Athens.
  • trole(ev_when,live22,interval(includes,visit6))
  • trole(ev_when,visit6,interval(included_in,live22))


67
Distributed modifiers
  • Multiple temporal modifiers are dependent on one another
  • Implicit dependencies are made explicit in the representation
  • Ed visited us in July, 1991.
  • trole(when,visit1,interval(included_in,datemonth(7)18))
  • trole(subinterval,datemonth(7)18,dateyear(1991)18)
  • In 1991 Ed visited us in July.
  • trole(when,visit12,interval(included_in,datemonth(7)26))
  • trole(subinterval,datemonth(7)26,dateyear(1991)4)
  • In 1991 Ed visited us in July every week.
  • trole(when,visit12,interval(included_in,week37))
  • trole(subinterval,week37,datemonth(7)26)
  • trole(subinterval,datemonth(7)26,dateyear(1991)4)

68
Associating time points with event descriptions
  • Trilobites 540m-251m years ago
  • Ammonites 400m-65m years ago
  • 1. There were trilobites before there were ammonites. TRUE
  • 2. There were ammonites before there were trilobites. FALSE
  • 3. There were trilobites after there were ammonites. TRUE
  • 4. There were ammonites after there were trilobites. TRUE

69
Associating time points with event descriptions
  • 1. Ed felt better before every injection was
    administered to him.
  • ordering wrt last injection
  • 2. Ed felt better after every injection was
    administered to him.
  • ordering wrt last injection
  • 3. Ed felt better before most injections were
    administered to him.
  • ordering wrt first injection to tip the balance
  • 4. Ed felt better after most injections were
    administered to him.
  • ordering wrt first injection to tip the balance

70
How before and after order
  • In a modifier of the form before S or after S, we
    need to derive from S a temporal value to pass on
    to the preposition.
  • The default operation takes the end of the
    earliest interval when S is true.
  • The temporal asymmetry of this operation produces
    the appearance of after and before being
    non-inverses.
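The default operation can be sketched in a toy model where S's truth times are a set of instants, so the end of the earliest (minimal) interval at which S is true comes out as the first instant at which S holds. Applied to the trilobite/ammonite example from the earlier slide (times in millions of years, negative = past; encoding invented here):

```python
# Sketch of "before S" / "after S": derive a reference point from S (the first
# instant at which S holds) and ask whether X holds before / after that point.
# The asymmetry of picking the *onset* of S makes after and before non-inverses.

def reference_point(s_times):
    return min(s_times)                 # onset of S

def before(x_times, s_times):
    return any(t < reference_point(s_times) for t in x_times)

def after(x_times, s_times):
    return any(t > reference_point(s_times) for t in x_times)

trilobites = range(-540, -251)          # 540m-251m years ago
ammonites = range(-400, -65)            # 400m-65m years ago

assert before(trilobites, ammonites)        # trilobites before ammonites: TRUE
assert not before(ammonites, trilobites)    # ammonites before trilobites: FALSE
assert after(trilobites, ammonites)         # trilobites after ammonites: TRUE
assert after(ammonites, trilobites)         # ammonites after trilobites: TRUE
```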

71
TimeBank and TimeML
  • A corpus of 183 news articles annotated by hand
    with temporal information following the TimeML
    specification
  • Events, times and temporal links between them are
    identified and texts are appropriately annotated
  • TimeML represents temporal information using four
    primary tag types
  • TIMEX3 for temporal expressions
  • EVENT for temporal events
  • SIGNAL for temporal signals
  • LINK for representing relationships
  • Semantics of TimeML: an open issue

72
TimeML and AKR
73
TimeML-AKR match
74
Credits for the Bridge System
  • NLTT (Natural Language Theory and Technology)
    group at PARC
  • Daniel Bobrow
  • Bob Cheslow
  • Cleo Condoravdi
  • Dick Crouch
  • Ronald Kaplan
  • Lauri Karttunen
  • Tracy King now at Powerset
  • John Maxwell
  • Valeria de Paiva now at Cuil
  • Annie Zaenen
  • Interns
  • Rowan Nairn
  • Matt Paden
  • Karl Pichotta
  • Lucas Champollion

75
Thank you