On the Mathematical Properties of Linguistic Theories
1
On the Mathematical Properties of Linguistic
Theories

2
1. Introduction
  • The development of new formalisms for
    metatheories of linguistic theories
  • Decidability
  • Generative capacity
  • Recognition complexity
  • Linguistic theories
  • Context-free Grammar (CFG)
  • Transformational Grammar (TG)
  • Lexical-Functional Grammar (LFG)
  • Generalized Phrase Structure Grammar (GPSG)
  • Tree Adjunct Grammar (TAG)
  • Stratificational Grammar (SG)

3
2. Preliminary Definitions
  • Elementary definitions from complexity theory
  • If cw(n) is O(g), then the worst-case time
    complexity of M is O(g)
  • → almost all inputs to M of size n can be
    processed in time K·g(n)
  • If A1 and A2 are available algorithms for f, O(g1) and
    O(g2) are their worst-case complexities, and g2
    grows more slowly than g1
  • → A2 will be the preferable algorithm
    (regardless of the constants K1 and K2)

  CS          context-sensitive;  r.e.  recursively enumerable
  f(x)        a recognition function of a language L
  M           an algorithm for f
  c(x)        the cost (time and space) of executing M on a specific input x
  cw          a function whose argument is n (the size of the input to M)
  cw(n)       the maximum of c(x) over all inputs of size n; the worst-case complexity function for M
  ce(n)       the average of c(x) over all inputs of length n; the expected complexity function for M
  f is O(g)   for some constant K and all n > n0, f(n) < K·g(n)
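As a concrete illustration of these definitions, the sketch below (a hypothetical toy recognizer, not from the presentation) computes c(x), cw(n), and ce(n) by brute force over all inputs of a given size:

```python
from itertools import product

def recognize(x):
    """Toy recognizer for strings with equally many a's and b's.

    Returns (accepted, cost), where cost stands in for c(x),
    the cost of executing algorithm M on input x.
    """
    steps, balance = 0, 0
    for ch in x:
        steps += 1
        balance += 1 if ch == "a" else -1
    return balance == 0, steps

def cw(n):
    """Worst-case complexity: the maximum of c(x) over inputs of size n."""
    return max(recognize("".join(x))[1] for x in product("ab", repeat=n))

def ce(n):
    """Expected complexity: the average of c(x) over inputs of size n."""
    costs = [recognize("".join(x))[1] for x in product("ab", repeat=n)]
    return sum(costs) / len(costs)
```

For this recognizer every input of length n costs exactly n steps, so cw(n) = ce(n) = n and both are O(n).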
4
2. Preliminary Definitions
  • Two machine models
  • Sequential models (Aho et al. 1974)
  • Single-tape and multitape Turing machines (TM),
    random-access machines (RAM), random-access
    stored-program machines (RASP)
  • Polynomially related
  • Transforming a sequential algorithm into a parallel
    one yields at most a factor-K improvement in speed
  • Parallel models
  • Polynomial number of processors and circuit depth
    O(s²)

5
3. Context-Free Languages
  • Recognition techniques for CFLs
  • CKY, or dynamic programming (Hays, J. Cocke, Kasami,
    Younger)
  • Requires the grammar in Chomsky Normal Form
  • Space proportional to the square of the input
    length n
  • Earley's algorithm
  • Recognizes CFGs in time O(n³) and space O(n²), and
    unambiguous CFGs in time O(n²)
  • Ruzzo (1979)
  • Boolean circuits of depth O((log n)²)
  • Parallel recognition is accomplished in
    O((log n)²) time
  • Cf. the possible number of parses of some
    grammatical sentences of length n: 2ⁿ (Church and
    Patil 1982)
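The CKY technique above can be sketched as follows (a minimal illustration; the dictionary encoding of the CNF grammar is our own assumption):

```python
def cky_recognize(words, grammar, start="S"):
    """CKY recognition for a CFG in Chomsky Normal Form.

    grammar maps a right-hand side -- a terminal, or a pair of
    nonterminals -- to the set of left-hand-side nonterminals.
    Runs in O(n³) time and O(n²) space, as the slide states.
    """
    n = len(words)
    # chart[i][j]: the set of nonterminals deriving words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(grammar.get(w, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # split point
                for b in chart[i][k]:
                    for c in chart[k][j]:
                        chart[i][j] |= set(grammar.get((b, c), ()))
    return start in chart[0][n]

# Tiny hypothetical CNF grammar: S -> A B, A -> 'a', B -> 'b'
g = {"a": {"A"}, "b": {"B"}, ("A", "B"): {"S"}}
```

The three nested loops over span, start position, and split point give the cubic time bound; the chart itself is the quadratic space bound.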

6
4. Transformational Grammar
  • Peters and Ritchie (1973a)
  • Reflects transformations that move, add, and
    delete constituents, subject to recoverability of
    deletions
  • Every r.e. set can be generated by applying a
    set of transformations to a context-sensitive base.
  • The base grammar can be independent of the
    language being generated.
  • The universal base hypothesis is empirically
    vacuous.
  • If S is recursive in the CF base, then L is
    predictably enumerable and exponentially bounded.
  • If all recursion in the base grammar passes
    through S and all derivations satisfy the
    terminal-length-increasing condition, then the
    generated language is recursive.

7
4. Transformational Grammar
  • Rounds (1975)
  • Under the terminal-length-nondecreasing condition
    and recoverability of deletions, transformational
    languages can be recognized in exponential time,
    and every language recognizable in exponential
    time can be generated
  • NP-complete problems

8
4. Transformational Grammar
  • Berwick
  • A formalization that reduces grammaticality to
    well-formedness conditions on the surface
    structure is unusual.
  • In a GB grammar G, with surface structure s, yield of s
    w, and a constant K
  • → the number of nodes in s ≤ K·length(w)
  • GB languages have the linear-growth, or arithmetic-
    growth, property
  • Problems in Berwick's formalization
  • The formalization is a radical simplification
  • Recognition complexity under other constraints
  • No immediate consequences for complexity or for
    weak generative capacity.

9
5. Lexical-Functional Grammar
  • Kaplan and Bresnan (1982)
  • Without making use of transformations
  • Two levels of syntactic structure: constituent
    structure and functional structure
  • Berwick (1982)
  • A set of strings whose recognition problem is
    NP-complete is an LFL.
  • The complexity of LFG comes in finding the
    assignment of truth values to the variables.

10
6. Generalized Phrase Structure Grammar
  • Gerald Gazdar (1982)
  • Unification
  • Universal principles as formal constraints

11
6.1. Node admissibility
  • Interpretation of context-free rules
  • Rewriting rules
  • Constraints
  • Rounds (1970)
  • Top-down FSTA (finite-state tree automaton)
  • (q, a, n) → (q1, …, qn)
  • Bottom-up FSTA
  • (a, n, (q1, …, qn)) → q
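A bottom-up FSTA of the kind just described can be sketched as follows (the tree and rule encodings are our own assumptions, not Rounds' notation):

```python
def run_bottom_up_fsta(tree, rules):
    """Run a bottom-up finite-state tree automaton.

    tree: a (label, children) pair; rules maps
    (label, arity, tuple_of_child_states) -> state, mirroring the
    transition form (a, n, (q1, ..., qn)) -> q.  Raises KeyError
    when no transition applies, i.e. the tree is not admitted.
    """
    label, children = tree
    child_states = tuple(run_bottom_up_fsta(c, rules) for c in children)
    return rules[(label, len(children), child_states)]

# Hypothetical automaton admitting the single tree S(a, b)
rules = {
    ("a", 0, ()): "qa",
    ("b", 0, ()): "qb",
    ("S", 2, ("qa", "qb")): "qS",
}
```

A tree is admitted exactly when the automaton assigns a final state to its root, which is how CF rules become node-admissibility conditions.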

12
6.2. Metarules
  • Gazdar (1982)
  • Rules that apply to rules to produce other rules
  • E.g. the passive metarule
  • W: a multiple variable
  • VP → H[2], NP : "The beast ate the meat."
  • VP → H[3], NP, PP[to] : "Lee gave this to Sandy."
  • VP[PAS] → H[2], (PP[by]) : "The meat was eaten by
    the beast."
  • VP[PAS] → H[3], PP[to], (PP[by]) : "This was given
    to Sandy by Lee."

VP → W, NP  ⇒  VP[PAS] → W, (PP[by])
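The passive metarule can be sketched as a function from rules to rules (the (lhs, rhs-list) representation is a hypothetical encoding of ours, not Gazdar's notation):

```python
def passive_metarule(rule):
    """Apply the passive metarule  VP -> W, NP  =>  VP[PAS] -> W, (PP[by]):
    remove the NP from a VP rule's right-hand side and append an
    optional PP[by].  Returns None when the metarule does not apply.
    """
    lhs, rhs = rule
    if lhs != "VP" or "NP" not in rhs:
        return None
    i = rhs.index("NP")
    return ("VP[PAS]", rhs[:i] + rhs[i + 1:] + ["(PP[by])"])
```

Applied to the two active rules on this slide, it yields exactly the two passive rules listed above.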
13
6.2. Metarules
  • Two devices (or constraints) in metarules
  • Essential variables
  • Phantom categories

14
7. Tree Adjunct Grammar
  • Joshi (1982, 1984)
  • A TAG consists of two finite sets of finite
    trees, the center trees and the adjunct trees.
  • Adjunction operation
  • CFLs ⊂ TALs ⊂ indexed languages ⊂ CSLs

(Diagram: an adjunct tree rooted in A, with an A foot node, adjoined at an A node of a center tree)
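The adjunction operation can be sketched on trees encoded as (label, children) pairs (a minimal sketch of ours; marking the foot node with a trailing '*' is our convention):

```python
def splice(aux, subtree, label):
    """Replace the foot node (written label + '*') of aux with subtree."""
    node_label, children = aux
    if node_label == label + "*":
        return subtree
    return (node_label, [splice(c, subtree, label) for c in children])

def adjoin(tree, aux, label):
    """Adjoin the auxiliary tree aux at each topmost node of tree
    labeled `label`: the subtree rooted there is excised and
    reattached at the foot node of aux."""
    node_label, children = tree
    if node_label == label:
        return splice(aux, tree, label)
    return (node_label, [adjoin(c, aux, label) for c in children])

# Hypothetical example: adjoin A(a, A*, c) at the A node of S(A(t))
center = ("S", [("A", [("t", [])])])
aux = ("A", [("a", []), ("A*", []), ("c", [])])
```

Iterating adjunction is what lifts TAGs strictly beyond context-free power while staying within the indexed languages.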
15
8. Stratificational Grammar
  • Stratificational Grammar (Lamb 1966, Gleason
    1964)
  • Strata
  • Linearly ordered and constrained by a
    realization relation
  • Realization relation
  • Pairing of the products of the different grammars
    (e.g. the pairing of syntactic and semantic rules
    in Montague grammar)
  • Two-level stratificational grammar
  • Rewriting grammars G1 and G2
  • Relation R: a finite set of pairs of strings
    (p1, p2)
  • A derivation D1 in G1 yielding s1 is realized by a
    derivation D2 in G2 yielding s2
  • if s1 and s2 can be decomposed into substrings
    s1 = u1…un, s2 = v1…vn with R(ui, vi) for each i
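The realization condition above can be checked directly (a minimal sketch assuming R is given as a finite set of string pairs; pairs of two empty strings are excluded to guarantee termination):

```python
from functools import lru_cache

def realizes(s1, s2, R):
    """Decide whether s1 and s2 decompose as s1 = u1...un,
    s2 = v1...vn with every pair (ui, vi) in R -- the realization
    condition between the two levels of the grammar."""
    pairs = [(u, v) for u, v in R if u or v]  # skip ("", "")

    @lru_cache(maxsize=None)
    def match(i, j):
        # both strings fully consumed: a valid decomposition exists
        if i == len(s1) and j == len(s2):
            return True
        return any(s1.startswith(u, i) and s2.startswith(v, j)
                   and match(i + len(u), j + len(v))
                   for u, v in pairs)

    return match(0, 0)
```

With memoization over the pair of positions (i, j), the check runs in polynomial time for a fixed finite R.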

16
9. Seeking Significance
  • How do we select the most useful metatheoretical
    results among syntactic theories?
  • → Claim that the computationally most
    restrictive theory is preferable!

17
9.1. Coverage
  • Scope (Linebarger 1980)
  • An item is in the immediate scope of NOT if
  • (1) it occurs only in the proposition which is
    the entire scope of NOT
  • (2) within the proposition there are no logical
    elements intervening between it and NOT
  • Polarity reverser (Ladusaw 1979)
  • 1. A negative polarity item will be acceptable
    only if it is in the scope of a
    polarity-reversing expression
  • 2. For any two expressions α and β, constituents
    of a sentence S, α is in the scope of β with
    respect to a composition structure of S iff
    the interpretation of α is used in the
    formulation of β's interpretation in S
  • 3. An expression D is a polarity reverser with
    respect to an interpretation function F if and
    only if, for all expressions X and Y,
  • F(X) ⊆ F(Y) → F(D(Y)) ⊆ F(D(X))

18
9.1. Coverage
  • Constraint separation
  • Syntax-semantics boundary (e.g.
    polarity-sensitive items)
  • Syntax (e.g. GB, LFG)
  • Separation sometimes has a beneficial computational
    effect.
  • E.g. separating the constraints imposed by CFGs from
    the constraints imposed by indexed grammars
  • → recognition complexity remains low-order
    polynomial

19
9.2. Metatheoretical results as lower bounds
  • What are the minimal generative capacity and
    recognition complexity of actual languages?

20
9.3. Metatheoretical results as upper bounds
  • The class of possible languages could contain
    languages that are not recursive.
  • Putnam (1961)
  • Languages might just happen to be recursive.
  • Peters and Ritchie (1973)
  • 1. Every TG has an exponentially bounded cycling
    function, and thus generates only recursive
    languages
  • 2. Every natural language has a descriptively
    adequate TG
  • 3. The complexity of the languages investigated so
    far is typical of the class

21
9.3. Metatheoretical results as upper bounds
  • O(g) results
  • Are asymptotic worst-case measures.
  • Depend on the machine model (e.g. TMs and RAMs).