Parsers and Grammars - PowerPoint PPT Presentation

Transcript and Presenter's Notes


1
Parsers and Grammars
Colin Phillips
2
Outline
  • The Standard History of Psycholinguistics
  • Parsing and rewrite rules
  • Initial optimism
  • Disappointment and the DTC
  • Emergence of independent psycholinguistics
  • Reevaluating relations between competence and
    performance systems

3
Standard View
324 + 697 ?
217 x 32 ?
arithmetic
4
Standard View
specialized algorithm
specialized algorithm
324 + 697 ?
217 x 32 ?
arithmetic
5
Standard View
specialized algorithm
specialized algorithm
324 + 697 ?
217 x 32 ?
?
arithmetic
something deeper
6
Standard View
specialized algorithm
specialized algorithm
understanding
speaking
grammatical knowledge, competence
language
recursive characterization of well-formed
expressions
7
Standard View
specialized algorithm
specialized algorithm
understanding
speaking
precise, but ill-adapted to real-time operation
grammatical knowledge, competence
language
recursive characterization of well-formed
expressions
8
Standard View
specialized algorithm
specialized algorithm
understanding
speaking
well-adapted to real-time operation, but maybe
inaccurate
grammatical knowledge, competence
language
recursive characterization of well-formed
expressions
9
Grammatical Knowledge
  • How is grammatical knowledge accessed in
    syntactic computation for...
    (a) grammaticality judgment
    (b) understanding
    (c) speaking
  • Almost no proposals under standard view
  • This presents a serious obstacle to unification
    at the level of syntactic computation

10
(No Transcript)
11
Townsend & Bever (2001, ch. 2)
  • Linguists made a firm point of insisting that,
    at most, a grammar was a model of competence -
    that is, what the speaker knows. This was
    contrasted with effects of performance, actual
    systems of language behaviors such as speaking
    and understanding. Part of the motive for this
    distinction was the observation that sentences
    can be intuitively grammatical while being
    difficult to understand, and conversely.

12
Townsend & Bever (2001, ch. 2)
  • Despite this distinction the syntactic model
    had great appeal as a model of the processes we
    carry out when we talk and listen. It was
    tempting to postulate that the theory of what we
    know is a theory of what we do, thus answering
    two questions simultaneously.
    1. What do we know when we know a language?
    2. What do we do when we use what we know?

13
Townsend & Bever (2001, ch. 2)
  • It was assumed that this knowledge is linked to
    behavior in such a way that every syntactic
    operation corresponds to a psychological process.
    The hypothesis linking language behavior and
    knowledge was that they are identical.

14
Miller (1962)
  • 1. Mary hit Mark.  K(ernel)
    2. Mary did not hit Mark.  N
    3. Mark was hit by Mary.  P
    4. Did Mary hit Mark?  Q
    5. Mark was not hit by Mary.  NP
    6. Didn't Mary hit Mark?  NQ
    7. Was Mark hit by Mary?  PQ
    8. Wasn't Mark hit by Mary?  PNQ

15
Miller (1962)
Transformational Cube
16
Townsend & Bever (2001, ch. 2)
  • The initial results were breathtaking. The
    amount of time it takes to produce a sentence,
    given another variant of it, is a function of the
    distance between them on the sentence cube.
(Miller & McKean 1964). It is hard to convey
    how exciting these developments were. It appeared
    that there was to be a continuing direct
    connection between linguistic and psychological
    research. The golden age had arrived.

17
Townsend & Bever (2001, ch. 2)
  • Alas, it soon became clear that either the
    linking hypothesis was wrong, or the grammar was
    wrong, or both.

18
Townsend & Bever (2001, ch. 2)
  • The moral of this experience is clear. Cognitive
    science made progress by separating the question
    of what people understand and say from how they
    understand and say it. The straightforward
    attempt to use the grammatical model directly as
    a processing model failed. The question of what
    humans know about language is not only distinct
    from how children learn it, it is distinct from
    how adults use it.

19
A Simple Derivation
S (starting axiom)
[Tree so far: S]
20
A Simple Derivation
S (starting axiom)
1. S → NP VP
[Tree so far: S, NP, VP]
21
A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
[Tree so far: S, NP, VP, V, NP]
22
A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
3. NP → D N
[Tree so far: S, NP, VP, V, NP, D, N]
23
A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
3. NP → D N
4. N → Bill
[Tree so far: S, NP, VP, V, NP, D, N; word: Bill]
24
A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
3. NP → D N
4. N → Bill
5. V → hit
[Tree so far: S, NP, VP, V, NP, D, N; words: Bill, hit]
25
A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
3. NP → D N
4. N → Bill
5. V → hit
6. D → the
[Tree so far: S, NP, VP, V, NP, D, N; words: Bill, hit, the]
26
A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
3. NP → D N
4. N → Bill
5. V → hit
6. D → the
7. N → ball
[Tree so far: S, NP, VP, V, NP, D, N; words: Bill, hit, the, ball]
27
A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
3. NP → D N
4. N → Bill
5. V → hit
6. D → the
7. N → ball
[Complete tree: S, NP, VP, V, NP, D, N; words: Bill, hit, the, ball]
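The derivation above can be phrased as a tiny program. The sketch below is my own illustration, not something from the slides: it applies the rewrite rules top-down until only words remain. One assumption is added, an NP → N rule, since the slides' abbreviated rule list does not say how the bare-noun subject Bill is introduced.

```python
# A minimal sketch (my own illustration, not the slides' code): apply the
# rewrite rules top-down, starting from the axiom S, until only words remain.
# Assumption: an extra NP -> N rule is added so the bare-noun subject "Bill"
# can be generated; the slides' rule list is abbreviated on this point.

import random

RULES = {
    "S":  [["NP", "VP"]],
    "VP": [["V", "NP"]],
    "NP": [["D", "N"], ["N"]],        # NP -> N is the added assumption
    "N":  [["Bill"], ["ball"]],
    "V":  [["hit"]],
    "D":  [["the"]],
}

def derive(symbol):
    """Rewrite a symbol until only terminal words remain."""
    if symbol not in RULES:           # terminal word: nothing left to rewrite
        return [symbol]
    words = []
    for child in random.choice(RULES[symbol]):
        words.extend(derive(child))
    return words

print(" ".join(derive("S")))          # e.g. "Bill hit the ball"
# The toy grammar overgenerates ("the Bill hit ball" is also derivable), and
# nothing in the rules says which expansion to choose -- the indeterminacy
# that becomes a real problem once the derivation is run in reverse.
```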
28
Reverse the derivation...
29
A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
3. NP → D N
4. N → Bill
5. V → hit
6. D → the
7. N → ball
[Parse so far: Bill]
30
A Simple Derivation
(rules 1-7 as above)
[Parse so far: NP, Bill]
31
A Simple Derivation
(rules 1-7 as above)
[Parse so far: NP, Bill, hit]
32
A Simple Derivation
(rules 1-7 as above)
[Parse so far: NP, V, Bill, hit]
33
A Simple Derivation
(rules 1-7 as above)
[Parse so far: NP, V, Bill, hit, the]
34
A Simple Derivation
(rules 1-7 as above)
[Parse so far: NP, V, Bill, hit, D, the]
35
A Simple Derivation
(rules 1-7 as above)
[Parse so far: NP, V, Bill, hit, D, the, ball]
36
A Simple Derivation
(rules 1-7 as above)
[Parse so far: NP, V, Bill, hit, D, N, the, ball]
37
A Simple Derivation
(rules 1-7 as above)
[Parse so far: NP, V, NP, Bill, hit, D, N, the, ball]
38
A Simple Derivation
(rules 1-7 as above)
[Parse so far: NP, VP, V, NP, Bill, hit, D, N, the, ball]
39
A Simple Derivation
(rules 1-7 as above)
[Parse so far: S, NP, VP, V, NP, Bill, hit, D, N, the, ball]
40
A Simple Derivation
(rules 1-7 as above)
[Complete parse: S, NP, VP, V, NP, Bill, hit, D, N, the, ball]
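"Running the grammar backwards" can be sketched just as directly. The following is my own toy bottom-up (shift-reduce) recognizer over the same rules, again assuming the extra NP → N rule; its greedy reduce-first policy only works because this grammar has no real choice points, which is exactly the indeterminacy problem the next slides raise.

```python
# A minimal sketch (my own toy, not a proposal from the slides): "running the
# grammar backwards" as a naive shift-reduce recognizer for "Bill hit the ball".
# Assumption: the same extra NP -> N rule as above. The greedy reduce-first
# policy happens to work here only because this grammar has no real choice
# points -- with transformations and empty categories it would not.

RULES = [
    ("S",  ["NP", "VP"]),
    ("VP", ["V", "NP"]),
    ("NP", ["D", "N"]),
    ("NP", ["N"]),        # assumed rule, as in the top-down sketch
    ("N",  ["Bill"]),
    ("N",  ["ball"]),
    ("V",  ["hit"]),
    ("D",  ["the"]),
]

def recognize(words):
    stack, buffer = [], list(words)
    while buffer or stack != ["S"]:
        for lhs, rhs in RULES:                    # try to reduce the stack top
            if len(stack) >= len(rhs) and stack[-len(rhs):] == rhs:
                stack[-len(rhs):] = [lhs]
                break
        else:
            if not buffer:
                return False                      # stuck: cannot reduce or shift
            stack.append(buffer.pop(0))           # shift the next input word
    return True

print(recognize("Bill hit the ball".split()))     # True
```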
41
Transformations
  • wh-movement
  • X wh-NP Y
  • 1 2 3
  • →  2 1 0 3

42
Transformations
  • VP-ellipsis
  • X VP1 Y VP2 Z
  • 1 2 3 4 5
  • →  1 2 3 0 5
  • condition: VP1 = VP2
    (a minimal code sketch of this rule format follows below)
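The transformations above are stated as a structural description (numbered factors) and a structural change (an output order over those factors, with 0 marking an emptied position). The sketch below is my own illustration of that rule format; the factoring of the example is supplied by hand, and do-support ("What do Englishmen cook?") is a separate rule ignored here.

```python
# A minimal sketch (my own illustration): a transformation as a structural
# description (numbered factors) plus a structural change (an output order over
# those factors, with 0 marking a position emptied by movement).

def apply_transformation(factors, change):
    """factors: factor number -> list of words; change: output order, 0 = empty."""
    output = []
    for i in change:
        if i != 0:
            output.extend(factors[i])
    return output

# "Englishmen cook what", factored as  X = [Englishmen cook], wh-NP = [what], Y = []
sd = {1: ["Englishmen", "cook"], 2: ["what"], 3: []}
sc = [2, 1, 0, 3]                                  # wh-NP fronts, leaving a null
print(" ".join(apply_transformation(sd, sc)))      # -> "what Englishmen cook"
```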

43
Difficulties
  • How to build structure incrementally in
    right-branching structures
  • How to recognize output of transformations that
    create nulls

44
Summary
  • Running the grammar backwards is not so
    straightforward - problems of indeterminacy and
    incrementality
  • Disappointment in empirical tests of Derivational
    Theory of Complexity
  • Unable to account for processing of local
    ambiguities

45
Standard View
specialized algorithm
specialized algorithm
understanding
speaking
grammatical knowledge, competence
language
recursive characterization of well-formed
expressions
46
Grammatical Knowledge
  • How is grammatical knowledge accessed in
    syntactic computation for...
    (a) grammaticality judgment
    (b) understanding
    (c) speaking
  • Almost no proposals under standard view
  • This presents a serious obstacle to unification
    at the level of syntactic computation

47
(No Transcript)
48
Arguments for Architecture
  • 1. Available grammars don't make good parsing
    devices
  • 2. Grammaticality ≠ Parsability
  • 3. Failure of DTC
  • 4. Evidence for parser-specific structure
  • 5. Parsing/production have distinct properties
  • 6. Possibility of independent damage to
    parsing/production
  • 7. Competence/performance distinction is
    necessary, right?

49
Arguments for Architecture
  • 1. Available grammars don't make good parsing
    devices
  • 2. Grammaticality ≠ Parsability
  • 3. Failure of DTC
  • 4. Evidence for parser-specific structure
  • 5. Parsing/production have distinct properties
  • 6. Possibility of independent damage to
    parsing/production
  • 7. Competence/performance distinction is
    necessary, right?

50
Grammar as Parser - Problems
  • Incremental structure building with PS Rules
    (e.g. S → NP VP)
  • delay
  • prediction/guessing
  • Indeterminacy (how to recover nulls created by
    transformations)

51
Grammar as Parser - Solutions
  • Lexicalized grammars make incremental
    structure-building much easier (available in
    HPSG, minimalism, LFG, Categorial Grammar, etc.
    etc.)

VP → V PP
PP → P NP
[Tree: [VP [V sat] [PP [P on] [NP the rug]]]]
52
Grammar as Parser - Solutions
  • Lexicalized grammars make incremental
    structure-building much easier (available in
    HPSG, minimalism, LFG, Categorial Grammar, etc.
    etc.)

sit: comp __ P    on: comp __ N
[Tree: [VP [V sat] [PP [P on] [NP the rug]]]]
(an incremental-attachment sketch follows below)
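A minimal sketch of the point being made here, with a hypothetical mini-lexicon rather than any real HPSG/LFG/Categorial fragment: when each word carries its own combinatory requirements, it can be attached into the structure the moment it arrives, with no waiting for a complete phrase and no separate PS-rule lookup.

```python
# A minimal sketch (my own toy, with a hypothetical mini-lexicon): each word
# records what kind of phrase it begins ("projects") and what complement it is
# looking for ("needs"), so every word can be attached the moment it arrives.
# Subjects and the finer structure of the NP are deliberately ignored.

LEXICON = {
    "the": {"projects": "NP", "needs": "N"},
    "cat": {"projects": "N",  "needs": None},
    "rug": {"projects": "N",  "needs": None},
    "sat": {"projects": "VP", "needs": "PP"},   # "sat" selects a PP complement
    "on":  {"projects": "PP", "needs": "NP"},   # "on" selects an NP complement
}

def parse_incrementally(words):
    open_heads = []                 # heads whose complement slot is still unfilled
    for word in words:
        entry = LEXICON[word]
        if open_heads and LEXICON[open_heads[-1]]["needs"] == entry["projects"]:
            # attach immediately, even though the phrase this word begins
            # may itself still be incomplete
            host = open_heads.pop()
            print(f"attach '{word}' into the complement of '{host}'")
        if entry["needs"]:
            open_heads.append(word)

parse_incrementally("the cat sat on the rug".split())
# attach 'cat' into the complement of 'the'
# attach 'on' into the complement of 'sat'
# attach 'the' into the complement of 'on'
# attach 'rug' into the complement of 'the'
```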
53
Grammar as Parser - Solutions
  • Problem of seeking nulls in movement structures

54
Transformations
  • wh-movement
  • X wh-NP Y
  • 1 2 3
  • →  2 1 0 3

55
Transformations
  • VP-ellipsis
  • X VP1 Y VP2 Z
  • 1 2 3 4 5
  • →  1 2 3 0 5
  • condition: VP1 = VP2

56
Grammar as Parser - Solutions
  • Problem of seeking nulls in movement structures
  • becomes problem of seeking licensing features
    for displaced phrases, e.g. for wh-phrase, seek
    Case assigner and thematic role assigner.
  • Requirement to find licensing features is a basic
    component of all syntactic composition

57
Incremental Structure Building
  • An investigation of the grammatical consequences
    of incremental, left-to-right structure building

58
Incremental Structure Building
A
59
Incremental Structure Building
A
B
60
Incremental Structure Building
A
B
C
61
Incremental Structure Building
A
B
C
D
62
Incremental Structure Building
A
B
C
D
E
63
Incremental Structure Building
A
B
64
Incremental Structure Building
A
B
constituent
65
Incremental Structure Building
A
B
C
constituent is destroyed by addition of new
material
66
Incremental Structure Building
A
B
C
67
Incremental Structure Building
A
constituent
B
C
68
Incremental Structure Building
A
B
C
D
constituent is destroyed by addition of new
material
69
Incremental Structure Building
the cat
70
Incremental Structure Building
the cat
sat
71
Incremental Structure Building
the cat
sat
on
72
Incremental Structure Building
the cat
sat
on
the rug
73
Incremental Structure Building
the cat
sat
on
74
Incremental Structure Building
the cat
sat
on
the rug
75
Incremental Structure Building
the cat
sat
"sat on" is a temporary constituent, which is
destroyed as soon as the NP "the rug" is added.
on
the rug
76
Incremental Structure Building
  • Conflicting Constituency Tests
  • Verb + Preposition sequences can undergo
    coordination
  • (1) The cat sat on and slept under the rug.
  • but cannot undergo pseudogapping (Baltin &
    Postal, 1996)
  • (2) The cat sat on the rug and the dog did the
    chair.

77
Incremental Structure Building
the cat
sat
on
78
Incremental Structure Building
the cat
and
sat
on
slept
under
79
Incremental Structure Building
the cat
coordination applies early, before the VP
constituent is destroyed.
and
sat
on
slept
under
80
Incremental Structure Building
the cat
sat
on
81
Incremental Structure Building
the cat
sat
on
the rug
82
Incremental Structure Building
the cat
and
the dog
did
sat
on
the rug
83
Incremental Structure Building
the cat
and
the dog
did
sat
pseudogapping applies too late, after the VP
constituent is destroyed.
on
the rug
84
Incremental Structure Building
  • Constituency Problem: Different diagnostics of
    constituency frequently yield conflicting results
  • Incrementality Hypothesis:
    (a) Structures are assembled strictly incrementally
    (b) Syntactic processes see a snapshot of a
    derivation - they target constituents that are
    present when the process applies
    (c) Conflicts reflect the simple fact that
    different processes have different linear
    properties
  • Applied to interactions among binding, movement,
    ellipsis, prosodic phrasing, clitic placement,
    islands, etc. (Phillips 1996, in press; Richards
    1999, 2000; Guimaraes 1999; etc.)
    (a small code sketch of the snapshot idea follows below)
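A small sketch of the snapshot idea (my own illustration; the two trees are hand-built to match the slides): the same words belong to different constituents at different points in a strictly incremental derivation, so a process that applies early can target "sat on" while a later one cannot.

```python
# A minimal sketch of the snapshot idea (trees hand-built to match the slides).
# A constituent here is simply the word string spanned by some node.

def leaves(tree):
    if isinstance(tree, str):
        return (tree,)
    return tuple(word for child in tree for word in leaves(child))

def constituents(tree):
    found = {leaves(tree)}
    if isinstance(tree, tuple):
        for child in tree:
            found |= constituents(child)
    return found

# snapshot just after "on" is attached: 'sat on' is (temporarily) a constituent
early = (("the", "cat"), ("sat", "on"))
# snapshot after "the rug" is added: 'on' now heads a PP, and 'sat on' is gone
late = (("the", "cat"), ("sat", ("on", ("the", "rug"))))

print(("sat", "on") in constituents(early))  # True  -> coordination, applying early, can see it
print(("sat", "on") in constituents(late))   # False -> pseudogapping, applying later, cannot
```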

85
Interim Conclusion
  • Grammatical derivations look strikingly like the
    incremental derivations of a parsing system
  • But we want to be explicit about this, so...

86
Computational Modeling
(Schneider 1999; Schneider & Phillips, 1999)
87
(No Transcript)
88
Arguments for Architecture
  • 1. Available grammars don't make good parsing
    devices
  • 2. Grammaticality ≠ Parsability
  • 3. Failure of DTC
  • 4. Evidence for parser-specific structure
  • 5. Parsing/production have distinct properties
  • 6. Possibility of independent damage to
    parsing/production
  • 7. Competence/performance distinction is
    necessary, right?

89
Townsend & Bever (2001, ch. 2)
  • Linguists made a firm point of insisting that,
    at most, a grammar was a model of competence -
    that is, what the speaker knows. This was
    contrasted with effects of performance, actual
    systems of language behaviors such as speaking
    and understanding. Part of the motive for this
    distinction was the observation that sentences
    can be intuitively grammatical while being
    difficult to understand, and conversely.

90
Grammaticality ≠ Parsability
  • It is straightforward enough to show that
    sentence parsing and grammaticality judgments are
    different. There are sentences which are easy to
    parse but ungrammatical (e.g. that-trace
    effects), and there are sentences which are
    extremely difficult to parse, but which may be
    judged grammatical given appropriate time for
    reflection (e.g. multiply center embedded
    sentences). This classic argument shows that
    parsing and grammar are not identical, but it
    tells us very little about just how much they
    have in common.
  • (Phillips, 1995)

91
Grammaticality ≠ Parsability
  • Grammatical sentences that are hard to parse
  • The cat the dog the rat bit chased fled
  • John gave the man the dog bit a sandwich
  • Ungrammatical sentences that are understandable
  • Who do you think that left?
  • The children is happy
  • The millionaire donated the museum a painting

92
Grammaticality ≠ Parsability
  • Grammatical sentences that are hard to parse
  • The cat the dog the rat bit chased fled
  • John gave the man the dog bit a sandwich
  • Can arise independent of grammar
  • Resource (memory) limitations
  • Incorrect choices in ambiguity

93
(Preliminary)
  • Incomplete structural dependencies have a cost
(that's what yields center embedding)

94
A Contrast (Gibson 1998)
  • Relative Clause within a Sentential Complement
    (RC in SC)
  • The fact [CP that the employee [RC who the
    manager hired] stole office supplies] worried the
    executive.
  • Sentential Complement within a Relative Clause
    (SC in RC)
  • The executive [RC who the fact [CP that the
    employee stole office supplies] worried] hired
    the manager.
  • RC in SC is easier to process than SC in RC

95
A Contrast (Gibson 1998)
  • Relative Clause within a Sentential Complement
    (RC in SC)
  • [SC that the employee [RC who the manager
    hired] stole
  • Sentential Complement within a Relative Clause
    (SC in RC)
  • [RC who the fact [SC that the employee stole
    office supplies] worried
  • RC in SC is easier to process than SC in RC

96
A Contrast (Gibson 1998)
  • Relative Clause within a Sentential Complement
    (RC in SC)
  • [SC that the employee [RC who the manager
    hired] stole
  • Sentential Complement within a Relative Clause
    (SC in RC)
  • [RC who the fact [SC that the employee stole
    office supplies] worried
  • RC in SC is easier to process than SC in RC

97
A Contrast (Gibson 1998)
  • Relative Clause within a Sentential Complement
    (RC in SC)
  • [SC that the employee [RC who the manager
    hired] stole
  • Sentential Complement within a Relative Clause
    (SC in RC)
  • [RC who the fact [SC that the employee stole
    office supplies] worried
  • Contrast is motivated by off-line complexity
    ratings

98
Grammaticality ≠ Parsability
  • Ungrammatical sentences that are understandable
  • Who do you think that left?
  • The children is happy
  • The millionaire donated the museum a painting
  • System can represent illegal combinations (e.g.
    categories are appropriate, but feature values
    are inappropriate)
  • Fact that understandable errors are (i)
    diagnosable, (ii) nearly grammatical, should not
    be overlooked

99
Grammaticality ≠ Parsability
  • Are the parser's operations fully grammatically
    accurate?

100
Standard View
specialized algorithm
specialized algorithm
understanding
speaking
well-adapted to real-time operation, but maybe
inaccurate
grammatical knowledge, competence
language
recursive characterization of well-formed
expressions
101
Grammatical Accuracy in Parsing
  • The grammar looks rather like a parser
  • BUT, does the parser look like a grammar? i.e.,
    are the parser's operations fully grammatically
    accurate at every step, even in situations
    where such accuracy appears quite difficult to
    achieve?

(Phillips & Wong 2000)
102
Self-Paced Reading
-- --- ------- ------- ---- --- ----.
(e.g. Phillips & Wong 2000)
103
Self-Paced Reading
We --- ------- ------- ---- --- ----.
(e.g. Phillips & Wong 2000)
104
Self-Paced Reading
-- can ------- ------- ---- --- ----.
(e.g. Phillips & Wong 2000)
105
Self-Paced Reading
-- --- measure ------- ---- --- ----.
(e.g. Phillips & Wong 2000)
106
Self-Paced Reading
-- --- ------- reading ---- --- ----.
(e.g. Phillips & Wong 2000)
107
Self-Paced Reading
-- --- ------- ------- time --- ----.
(e.g. Phillips & Wong 2000)
108
Self-Paced Reading
-- --- ------- ------- ---- per ----.
(e.g. Phillips & Wong 2000)
109
Self-Paced Reading
-- --- ------- ------- ---- --- word.
(e.g. Phillips & Wong 2000)
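For concreteness, the logic of a self-paced, moving-window trial can be sketched as below. This is not the authors' software; real experiments use dedicated presentation packages with millisecond timing, and input() here merely stands in for the participant's key press.

```python
# A sketch of the trial logic only (not the authors' software). Each keypress
# reveals the next word, re-masks the previous one, and the latency between
# keypresses is recorded as that word's reading time.

import time

def self_paced_trial(sentence):
    words = sentence.split()
    reading_times = []
    for i, word in enumerate(words):
        display = " ".join(w if j == i else "-" * len(w) for j, w in enumerate(words))
        start = time.monotonic()
        input(display + "   [press Enter]")
        reading_times.append((word, time.monotonic() - start))
    return reading_times

if __name__ == "__main__":
    for word, rt in self_paced_trial("We can measure reading time per word."):
        print(f"{word:10s} {rt:.3f} s")
```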
110
Grammatical Accuracy in Parsing
Wh-Questions
(Phillips & Wong 2000)
111
Grammatical Accuracy in Parsing
Wh-Questions
Englishmen cook wonderful dinners.
(Phillips & Wong 2000)
112
Grammatical Accuracy in Parsing
Wh-Questions
Englishmen cook wonderful dinners.
(Phillips & Wong 2000)
113
Grammatical Accuracy in Parsing
Wh-Questions
Englishmen cook what
(Phillips & Wong 2000)
114
Grammatical Accuracy in Parsing
Wh-Questions
Englishmen cook what
(Phillips & Wong 2000)
115
Grammatical Accuracy in Parsing
Wh-Questions
What do Englishmen cook
(Phillips & Wong 2000)
116
Grammatical Accuracy in Parsing
Wh-Questions
What do Englishmen cook gap

(Phillips & Wong 2000)
117
Grammatical Accuracy in Parsing
Wh-Questions
?
What do Englishmen cook gap

(Phillips & Wong 2000)
118
Grammatical Accuracy in Parsing
Long-distance Wh-Questions
Few people think that anybody realizes that
Englishmen cook wonderful dinners
(Phillips & Wong 2000)
119
Grammatical Accuracy in Parsing
Long-distance Wh-Questions
Few people think that anybody realizes that
Englishmen cook what
(Phillips & Wong 2000)
120
Grammatical Accuracy in Parsing
Long-distance Wh-Questions
What do few people think that anybody realizes
that Englishmen cook gap
?
(Phillips & Wong 2000)
121
Grammatical Accuracy in Parsing
Parasitic Gaps
The plan to remove the equipment ultimately
destroyed the building.
(Phillips & Wong 2000)
122
Grammatical Accuracy in Parsing
Parasitic Gaps
The plan to remove the equipment ultimately
destroyed the building.
Direct Object NP
Direct Object NP
(Phillips & Wong 2000)
123
Grammatical Accuracy in Parsing
Parasitic Gaps
The plan to remove the equipment ultimately
destroyed the building.
Direct Object NP
Direct Object NP
Main Clause
(Phillips & Wong 2000)
124
Grammatical Accuracy in Parsing
Parasitic Gaps
Subject NP
The plan to remove the equipment ultimately
destroyed the building.
Direct Object NP
Direct Object NP
Main Clause
Embedded Clause
(Phillips & Wong 2000)
125
Grammatical Accuracy in Parsing
Parasitic Gaps
What did the plan to remove the equipment
ultimately destroy
(Phillips & Wong 2000)
126
Grammatical Accuracy in Parsing
Parasitic Gaps
?
What did the plan to remove the equipment
ultimately destroy gap
(Phillips & Wong 2000)
127
Grammatical Accuracy in Parsing
Parasitic Gaps
What did the plan to remove
ultimately destroy the building
(Phillips & Wong 2000)
128
Grammatical Accuracy in Parsing
Parasitic Gaps
?
What did the plan to remove gap
ultimately destroy the building
(Phillips & Wong 2000)
129
Grammatical Accuracy in Parsing
Parasitic Gaps
Subject
?
What did the plan to remove gap
ultimately destroy the building
Island Constraint: A wh-phrase cannot be moved out
of a subject.
(Phillips & Wong 2000)
130
Grammatical Accuracy in Parsing
Parasitic Gaps
What did the plan to remove
ultimately destroy
(Phillips & Wong 2000)
131
Grammatical Accuracy in Parsing
Parasitic Gaps
What did the plan to remove
ultimately destroy
(Phillips & Wong 2000)
132
Grammatical Accuracy in Parsing
Parasitic Gaps
?
?
What did the plan to remove
ultimately destroy
Parasitic Gap
Generalization: the good gap rescues the bad gap
133
Grammatical Accuracy in Parsing
Parasitic Gaps
Infinitive
?
?
What did the plan to remove
ultimately destroy
Generalization: the good gap rescues the bad gap
134
Grammatical Accuracy in Parsing
Parasitic Gaps
Finite
?
?
What did the plan that removed
ultimately destroy
Revised Generalization (informal): Only mildly bad
gaps can be rescued by good gaps.
135
Grammaticality Ratings
Ratings from 50 subjects
136
Grammatical Accuracy in Parsing
A Look-Ahead Problem
Infinitive
?
?
What did the plan to remove
ultimately destroy
The good gap rescues the bad gap, BUT the bad gap
appears before the good gap: a look-ahead problem
137
Grammatical Accuracy in Parsing
A Look-Ahead Problem
Infinitive
?
?
What did the plan to remove
ultimately destroy
Embedded Verb
Question: When the parser reaches the embedded
verb, does it construct a dependency - even
though the gap would be a bad gap?
138
Grammatical Accuracy in Parsing
A Look-Ahead Problem
Infinitive
?
?
What did the plan to remove
ultimately destroy
Risky
Finite
?
?
What did the plan that removed
ultimately destroy
Reckless
139
Grammatical Accuracy in Parsing
Question
What do speakers do when they get to the verb
embedded inside the subject NP?
(i) RISKY: create a gap in infinitival clauses
only - violates a constraint, but may be rescued
(ii) RECKLESS: create a gap in all clause types -
violates a constraint and cannot be rescued
(iii) CONSERVATIVE: do not create a gap
(a sketch of these three policies follows below)
(Phillips & Wong 2000)
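The three policies can each be stated as a one-line decision rule. The sketch below is my own paraphrase of the slide, not the authors' model.

```python
# A minimal sketch (my own paraphrase of the slide): the three gap-positing
# policies as a decision made at the embedded verb inside the subject NP,
# where 'clause' is the finiteness of that embedded clause.

def posit_gap(strategy, clause):
    if strategy == "CONSERVATIVE":
        return False                        # never gap inside the subject island
    if strategy == "RISKY":
        return clause == "infinitival"      # gap only where a parasitic gap could rescue it
    if strategy == "RECKLESS":
        return True                         # gap even where no rescue is possible
    raise ValueError(strategy)

for strategy in ("CONSERVATIVE", "RISKY", "RECKLESS"):
    print(strategy,
          {clause: posit_gap(strategy, clause) for clause in ("infinitival", "finite")})
```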
140
Grammatical Accuracy in Parsing
Materials
a. what ... infinitival verb ...    infinitive, gap ok
b. whether ... infinitival verb ...    infinitive, no gap
c. what ... finite verb ...    finite, gap not ok
d. whether ... finite verb ...    finite, no gap
(Phillips & Wong 2000)
141
Grammatical Accuracy in Parsing
Materials
a. what ... infinitival verb ...    infinitive, gap ok
b. whether ... infinitival verb ...    infinitive, no gap
c. what ... finite verb ...    finite, gap not ok
d. whether ... finite verb ...    finite, no gap
Gap here: RISKY
(Phillips & Wong 2000)
142
Grammatical Accuracy in Parsing
Materials
a. what ... infinitival verb ...    infinitive, gap ok
b. whether ... infinitival verb ...    infinitive, no gap
c. what ... finite verb ...    finite, gap not ok
d. whether ... finite verb ...    finite, no gap
Gap here: RECKLESS
(Phillips & Wong 2000)
143
Grammatical Accuracy in Parsing
Materials
a. The outspoken environmentalist worked to
investigate what the local campaign to preserve
the important habitats had actually harmed in the
area that the birds once used as a place for
resting while flying south.    infinitive, gap
b. ... whether the local campaign to preserve ...    infinitive, no gap
c. ... what the local campaign that preserved ...    finite, gap
d. ... whether the local campaign that preserved ...    finite, no gap
(Phillips & Wong 2000)
144
Grammatical Accuracy in Parsing
Infinitive
?
?
What did the plan to remove
ultimately destroy
(Phillips & Wong 2000)
Risky
145
Grammatical Accuracy in Parsing
Finite
?
?
What did the plan that removed
ultimately destroy
(Phillips & Wong 2000)
Reckless
146
Grammatical Accuracy in Parsing
Conclusion
  • Structure-building is extremely grammatically
    accurate, even when the word-order of a language
    is not cooperative
  • Constraints on movement are violated in exactly
    the environments where the grammar allows the
    violation to be forgiven (may help to explain
    discrepancies in past studies)
  • Such accuracy is required if grammatical
    computation is to be understood as real-time
    on-line computation

147
(No Transcript)
148
Arguments for Architecture
  • 1. Available grammars don't make good parsing
    devices
  • 2. Grammaticality ≠ Parsability
  • 3. Failure of DTC
  • 4. Evidence for parser-specific structure
  • 5. Parsing/production have distinct properties
  • 6. Possibility of independent damage to
    parsing/production
  • 7. Competence/performance distinction is
    necessary, right?

149
Derivational Theory of Complexity
  • The psychological plausibility of a
    transformational model of the language user would
    be strengthened, of course, if it could be shown
    that our performance on tasks requiring an
    appreciation of the structure of transformed
    sentences is some function of the nature, number
    and complexity of the grammatical transformations
involved. (Miller & Chomsky 1963, p. 481)

150
Miller (1962)
  • 1. Mary hit Mark.  K(ernel)
    2. Mary did not hit Mark.  N
    3. Mark was hit by Mary.  P
    4. Did Mary hit Mark?  Q
    5. Mark was not hit by Mary.  NP
    6. Didn't Mary hit Mark?  NQ
    7. Was Mark hit by Mary?  PQ
    8. Wasn't Mark hit by Mary?  PNQ

151
Miller (1962)
Transformational Cube
152
Derivational Theory of Complexity
  • Miller & McKean (1964): Matching sentences with
    the same meaning or kernel
  • Joe warned the old woman.  K
    The old woman was warned by Joe.  P    1.65s
  • Joe warned the old woman.  K
    Joe didn't warn the old woman.  N    1.40s
  • Joe warned the old woman.  K
    The old woman wasn't warned by Joe.  PN    3.12s
    (a small sketch of the cube-distance prediction follows below)
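The DTC arithmetic behind these numbers can be made explicit with a small sketch (my own illustration; the matching times are the ones quoted above): code each sentence type by the set of transformations applied to its kernel, and the predicted distance between two sentences is the number of transformations by which they differ, i.e., their distance on the transformational cube.

```python
# A minimal sketch of the DTC prediction (matching times as quoted on the slide).

TYPES = {
    "K":  set(),                  # kernel
    "N":  {"Negative"},
    "P":  {"Passive"},
    "PN": {"Passive", "Negative"},
}

def cube_distance(a, b):
    return len(TYPES[a] ^ TYPES[b])          # symmetric difference of the sets

for pair, seconds in [(("K", "P"), 1.65), (("K", "N"), 1.40), (("K", "PN"), 3.12)]:
    print(pair, "steps:", cube_distance(*pair), "matching time:", seconds, "s")
# K-P and K-N are one step apart; K-PN is two steps, and its matching time is
# roughly the sum of the two one-step times -- the additivity the DTC expected.
```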

153
McMahon (1963)
  • a. i. seven precedes thirteen K (true)
  • ii. thirteen precedes seven K (false)
  • b. i. thirteen is preceded by seven P (true)
  • ii. seven is preceded by thirteen P (false)
  • c. i. thirteen does not precede seven N (true)
  • ii. seven does not precede thirteen N (false)
  • d. i. seven is not preceded by thirteen PN (true)
  • ii. thirteen is not preceded by seven PN (false)

154
Easy Transformations
  • Passive
  • The first shot the tired soldier the mosquito bit
    fired missed.
  • The first shot fired by the tired soldier bitten
    by the mosquito missed.
  • Heavy NP Shift
  • I gave a complete set of the annotated works of
    H.H. Munro to Felix.
  • I gave to Felix a complete set of the annotated
    works of H.H. Munro.
  • Full Passives
  • Fido was kissed (by Tom).
  • Adjectives
  • The red house/house which is red is on fire.

155
Failure of DTC?
  • Any DTC-like prediction is contingent on a
    particular theory of grammar, which may be wrong
  • It's not surprising that transformations are not
    the only contributor to perceptual complexity
  • memory demands, which may increase or decrease
  • ambiguity, where grammar does not help
  • difficulty of access

156
(No Transcript)
157
Arguments for Architecture
  • 1. Available grammars don't make good parsing
    devices
  • 2. Grammaticality ≠ Parsability
  • 3. Failure of DTC
  • 4. Evidence for parser-specific structure
  • 5. Parsing/production have distinct properties
  • 6. Possibility of independent damage to
    parsing/production
  • 7. Competence/performance distinction is
    necessary, right?

158
Garden Paths & Temporary Ambiguity
  • The horse raced past the barn fell.
  • Weapons test scores a hit.
  • John gave the man the dog bit a sandwich.
  • Grammar can account for the existence of global
    ambiguities (e.g. Visiting relatives can be
    boring), but not local ambiguities since the
    grammar does not typically assemble structure
    incrementally

159
Garden Paths & Temporary Ambiguity
  • Ambiguity originally studied as test of solution
    to the incrementality problem
  • Heuristics & Strategies (e.g. Bever, 1970)
  • NP V → subject verb
  • V NP → verb object
  • V NP NP → verb object object
  • Garden paths used as evidence for effects of
    heuristics

160
Garden Paths & Temporary Ambiguity
  • Heuristics & Strategies
  • NP V → subject verb
    The horse raced past the barn fell
  • V NP → verb object
    The student knew the answer was wrong
  • V NP NP → verb object object
    John gave the man the dog bit a sandwich
    (a toy illustration of these strategies follows below)
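A toy rendering of these strategies (my own; the category tags are assigned by hand): each heuristic is a template over the category sequence, and a garden path arises when an early template match commits the parser to an analysis that the rest of the sentence then disallows.

```python
# A toy rendering of Bever-style strategies (my own sketch; categories are
# hand-assigned). Each heuristic is a template over the category sequence.

STRATEGIES = {
    ("NP", "V"):       "subject + verb",
    ("V", "NP"):       "verb + object",
    ("V", "NP", "NP"): "verb + object + object",
}

def triggered(categories):
    """Return each strategy whose template occurs somewhere in the input."""
    hits = []
    for template, analysis in STRATEGIES.items():
        for i in range(len(categories) - len(template) + 1):
            if tuple(categories[i:i + len(template)]) == template:
                hits.append((i, analysis))
                break
    return hits

# "The horse raced past the barn fell", with 'raced' tagged as a main verb:
print(triggered(["NP", "V", "P", "NP", "V"]))
# [(0, 'subject + verb')] -> 'the horse raced' taken as subject + main verb

# "John gave the man the dog bit a sandwich":
print(triggered(["NP", "V", "NP", "NP", "V", "NP"]))
# [(0, 'subject + verb'), (1, 'verb + object'), (1, 'verb + object + object')]
```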

161
Ambiguity Resolution
  • Observation: heuristics miss a generalization
    about how ambiguities are preferentially resolved
  • Kimball (1973): Seven principles of surface
    structure parsing (e.g. Right Association)
  • Frazier (1978), Fodor & Frazier (1978): Minimal
    Attachment, Late Closure
  • Various others, much controversy...

162
Ambiguity Resolution
  • Assumptions
  • grammatical parses are accessed (unclear how)
  • simplest analysis of ambiguity chosen
    (uncontroversial)
  • structural complexity affects simplicity (partly
    controversial)
  • structural complexity determines simplicity (most
    controversial)

163
Ambiguity Resolution
  • Relevance to architecture of language
  • Comprehension-specific heuristics which
    compensate for inadequacy of grammar imply
    independent system
  • Comprehension-specific notions of structural
    complexity compatible with independent system
  • If grammar says nothing about ambiguity, and
    structural complexity is irrelevant to ambiguity
    resolution, as some argue, then ambiguity is
    irrelevant to question of parser-grammar
    relations.

164
(No Transcript)
165
Arguments for Architecture
  • 1. Available grammars don't make good parsing
    devices
  • 2. Grammaticality ≠ Parsability
  • 3. Failure of DTC
  • 4. Evidence for parser-specific structure
  • 5. Parsing/production have distinct properties
  • 6. Possibility of independent damage to
    parsing/production
  • 7. Competence/performance distinction is
    necessary, right?

166
Parsing ≠ Production
  • Parsing generates meaning from form
  • Production generates form from meaning
  • Different bottlenecks in the two areas
  • garden paths in comprehension
  • word-category constraint in production errors
  • etc., etc.
  • Lexical access in speaking and recognizing words
    differs, but do we assume that this reflects
    different systems?
  • Contemporary production theories are now
    incremental structure-building systems, more
    similar to comprehension models

167
(No Transcript)
168
Arguments for Architecture
  • 1. Available grammars don't make good parsing
    devices
  • 2. Grammaticality ≠ Parsability
  • 3. Failure of DTC
  • 4. Evidence for parser-specific structure
  • 5. Parsing/production have distinct properties
  • 6. Possibility of independent damage to
    parsing/production
  • 7. Competence/performance distinction is
    necessary, right?

169
Competence & Performance
  • Different kinds of formal systems: Competence
    systems and Performance systems
  • The difference between what a system can generate
    given unbounded resources, and what it can
    generate given bounded resources
  • The difference between a cognitive system and its
    behavior

170
Competence & Performance
  • (1) It's impossible to deny the distinction
    between cognitive states and actions, the
    distinction between knowledge and its deployment.
  • (2) How to distinguish ungrammatical-but-
    comprehensible examples (e.g. John speaks fluently
    English) from hard-to-parse examples.
  • (3) How to distinguish garden-path sentences
    (e.g. The horse raced past the barn fell) from
    ungrammatical sentences.
  • (4) How to distinguish complexity overload
    sentences (e.g. The cat the dog the rat chased
    saw fled) from ungrammatical sentences.

171
Competence & Performance
  • It is straightforward enough to show that
    sentence parsing and grammaticality judgments are
    different. There are sentences which are easy to
    parse but ungrammatical (e.g. that-trace
    effects), and there are sentences which are
    extremely difficult to parse, but which may be
    judged grammatical given appropriate time for
    reflection (e.g. multiply center embedded
    sentences). This classic argument shows that
    parsing and grammar are not identical, but it
    tells us very little about just how much they
    have in common.
  • (Phillips, 1995)
  • This argument is spurious!

172
(No Transcript)
173
Summary
  • Motivation for combining learning theories with
    theories of adult knowledge is well-understood;
    much more evidence needed.
  • Theories of comprehension and production long
    thought to be independent of competence models.
    In fact, combination of these is quite feasible;
    if true, possible to investigate linguistic
    knowledge in real time.