1
Phonological constraints as filters in SLA
  • Raung-fu Chung
  • rfchung@mail.nsysu.edu.tw

2
1. Introduction
  • The main components of this article are
  • The framework of Optimality Theory
  • Acquisition and learnability in OT
  • Our model
  • Concluding remarks

3
2. The framework of Optimality Theory
  • (1) The model of OT

4
  • For instance, the English plural morpheme /z/
    can be realized as either [s] or [z], depending
    on the final sound of the stem.

5
  • (2)
  • cat  kæt    cats  kæts
  • dog  dɔg    dogs  dɔgz
  • hen  hɛn    hens  hɛnz
  • The input form of the plural is taken to be /z/.
    We then propose the following constraint, the
    Voiced Obstruent Prohibition (VOP).

6
  • (3) Voiced Obstruent Prohibition, VOP
  • No obstruents can be voiced.

7
  • Another constraint called for is
  • (4) Obstruent Voicing Harmony, OVH
  • Adjacent obstruents should share the same
    value for [voice].
  • The third constraint is a universal faithfulness
    constraint, here referred to as
    Ident-IO(voice) (Ident = identical; IO = Input and
    Output).
  • (5) Ident-IO(voice)
  • The value of the [voice] feature of the Output
    should be identical with that of the Input.

8
  • As for the ranking, it is obvious, as shown
    below.
  • (6) OVH gtgt VOP (?gtgt? be preceded or be prior to)
  • Adding to IO, we have the following raking for
    all the three constraints we just proposed.
  • (54) OVHgtgt Id-IO(voice)gtgt VOP

9
  • (55)

/dɔg-z/         OVH   Id-IO(voice)   VOP
☞ a. dɔg-z                           ***
   b. dɔg-s     *!         *         **
   c. dɔk-s               **!        *
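The evaluation in tableau (55) can be sketched procedurally: each constraint assigns violation marks, and the winner is the candidate whose marks are lexicographically smallest under the ranking. A minimal Python sketch (the segment inventory, the ASCII transcription "dogz" for [dɔgz], and all function names are illustrative assumptions, not the article's code):

```python
# Sketch of OT evaluation (EVAL) for the /dɔg-z/ tableau.
# A constraint maps a candidate to a violation count; the ranking is the
# order of the constraint list.

OBSTRUENTS = set("pbtdkgszfv")
VOICED = set("bdgzv")          # voiced obstruents (simplified inventory)

def ovh(cand):
    """Obstruent Voicing Harmony: adjacent obstruents agree in [voice]."""
    return sum(1 for a, b in zip(cand, cand[1:])
               if a in OBSTRUENTS and b in OBSTRUENTS
               and (a in VOICED) != (b in VOICED))

def ident_io_voice(inp):
    """Ident-IO(voice): output keeps each input segment's [voice] value."""
    def check(cand):
        return sum(1 for i, o in zip(inp, cand)
                   if i in OBSTRUENTS and (i in VOICED) != (o in VOICED))
    return check

def vop(cand):
    """Voiced Obstruent Prohibition: one mark per voiced obstruent."""
    return sum(1 for c in cand if c in VOICED)

def eval_ot(candidates, ranking):
    # Winner: lexicographically smallest violation profile.
    return min(candidates, key=lambda c: tuple(con(c) for con in ranking))

inp = "dogz"                                   # input /dɔg-z/
ranking = [ovh, ident_io_voice(inp), vop]      # OVH >> Id-IO(voice) >> VOP
winner = eval_ot(["dogz", "dogs", "doks"], ranking)
print(winner)  # dogz -- candidate (a), as in the tableau
```

Because OVH and Ident-IO dominate VOP, the fully faithful candidate survives despite its three VOP marks.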
10
3. Acquisition and learnability in OT
11
3.1 The notion of learnability
  • a. Formal learnability in the sense of Tesar &
    Smolensky (1993) assumed that all constraints
    start out unranked. Later empirical
    studies (e.g. Gnanadesikan, 1996; Levelt, 1995)
    pointed out that outputs are initially
    governed by markedness constraints, rather than
    by faithfulness constraints. This leads to the
    proposal that in the initial state of the
    grammar, all markedness constraints outrank all
    faithfulness constraints, or M >> F for short
    (Kager, Pater & Zonneveld, 2004; Hayes, 2004;
    Prince & Tesar, 2004).

12
  • b. There are two algorithms accounting for the
    learnability of constraint rankings: the Constraint
    Demotion Algorithm (CDA) and the Gradual Learning
    Algorithm (GLA). The Constraint Demotion Algorithm,
    proposed by Tesar & Smolensky (1993, 1998,
    2000), ranks a set of constraints based on
    positive evidence. For example, L1 acquisition can
    be interpreted as constraint demotion (Tesar &
    Smolensky, 1996).
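A single demotion step of the CDA can be sketched as follows; the stratified-hierarchy representation and the example violation counts (chosen to mirror the earlier /dɔg-z/ tableau) are illustrative assumptions, not the article's implementation:

```python
# One Constraint Demotion step (after Tesar & Smolensky's CDA).
# A grammar is a list of strata (highest-ranked first). Given a
# winner-loser pair, each loser-preferring constraint ranked at or above
# the highest winner-preferring one is demoted to just below it.

def demote(strata, winner_viols, loser_viols):
    prefers_winner = {c for c in winner_viols
                      if winner_viols[c] < loser_viols[c]}
    prefers_loser = {c for c in winner_viols
                     if winner_viols[c] > loser_viols[c]}
    # Highest stratum containing a winner-preferring constraint.
    top = min(i for i, s in enumerate(strata) if set(s) & prefers_winner)
    moved = [c for i, s in enumerate(strata) if i <= top
             for c in s if c in prefers_loser]
    new = [[c for c in s if not (i <= top and c in prefers_loser)]
           for i, s in enumerate(strata)]
    if top + 1 >= len(new):
        new.append([])
    new[top + 1].extend(moved)
    return [s for s in new if s]

# From a flat initial state, the pair winner [dɔgz] / loser [dɔks]
# demotes VOP below Ident-IO(voice); violation counts are illustrative.
strata = demote([["OVH", "Ident-IO", "VOP"]],
                {"OVH": 0, "Ident-IO": 0, "VOP": 3},
                {"OVH": 0, "Ident-IO": 2, "VOP": 1})
print(strata)  # [['OVH', 'Ident-IO'], ['VOP']]
```

Iterating this step over all winner-loser pairs in the positive data converges on a ranking consistent with the target grammar.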

13
  • c. The Gradual Learning Algorithm (GLA), developed in
    Boersma (1997, 1998) and Boersma & Hayes (2001),
    handles variation in the input and accounts for
    gradient well-formedness. The GLA is helpful in
    accounting for categorization errors a learner
    makes in both production and perception. L2
    learners with restricted constraint sets have to
    gradually rerank the constraints by
    raising or lowering the existing ones.
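The GLA's stochastic evaluation and its gradual reranking can be sketched as follows; the ranking values, noise level, and plasticity step are illustrative figures, not Boersma's published settings:

```python
import random

# Sketch of the Gradual Learning Algorithm (Boersma 1997; Boersma &
# Hayes 2001). Each constraint carries a real-valued ranking; evaluation
# adds Gaussian noise, and each learning error nudges the rankings.

def sample_ranking(values, noise=2.0, rng=random):
    """Stochastic evaluation: perturb the ranking values, then return
    the constraints sorted from highest (dominant) to lowest."""
    noisy = {c: v + rng.gauss(0, noise) for c, v in values.items()}
    return sorted(values, key=lambda c: -noisy[c])

def gla_update(values, prefers_winner, prefers_loser, plasticity=0.1):
    """On an error, promote constraints favouring the observed datum
    and demote those favouring the learner's wrong output."""
    for c in prefers_winner:
        values[c] += plasticity
    for c in prefers_loser:
        values[c] -= plasticity
    return values

# Gradual reranking: repeated errors slowly move F above M, modelling a
# learner undoing the initial M >> F state.
values = {"M": 100.0, "F": 90.0}
for _ in range(200):
    gla_update(values, prefers_winner=["F"], prefers_loser=["M"])
print(values["F"] > values["M"])  # True
```

Because each update is small and evaluation is noisy, intermediate stages produce variable outputs, which is what makes the GLA suitable for modelling learners' categorization errors.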

14
4. Our model
  • L1 Filter Hypothesis in OT

L1 filter
Interlanguage ranking
15
  • Empirical arguments
  • An OT-based analysis of VOT production by
    Taiwanese EFL learners
  • The diphthong construction of Mandarin and English
    for Taiwanese learners
  • Errors in the production of yi and wu by
    Mandarin EFL learners

16
An OT-based Analysis of VOT Production by
Taiwanese EFL Learners
  • Acoustic values of VOT (Liou, 2005)
  • Note: NSE = native speakers of English; HEFL =
    high-proficiency EFL learners; LEFL = low-proficiency
    EFL learners; MAN = Mandarin; SM = Southern Min

17
Constraints for VOT
  • 1. The CATEG(ORIZE) family, which punishes
    productive categories with certain acoustic
    values. For example, CATEG(VOT /91.5ms/) is
    against producing /91.5ms/ as a particular
    category.
  • 2. The WARP family, which demands that every segment
    be produced as a member of the most similar
    available category. For instance, WARP(VOT
    9.3ms) requires that an acoustic segment with a
    VOT of 91.5ms not be produced as any VOT
    category that is 9.3ms off (or more), i.e. as
    /82.2ms/ or /100.8ms/ or anything even farther
    away.
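The interaction of the two families can be sketched as a ranked evaluation over acoustic candidates; the ranking below is read off Tableau 2, while the helper name and encoding are assumptions for illustration:

```python
# Sketch of CATEG/WARP evaluation for VOT categories.
# CATEG(/v/) is violated by producing category v; WARP(d) is violated by
# producing a category d ms or more away from the intended target.

def vot_winner(target, candidates, ranking):
    """ranking: list of ("CATEG", value) / ("WARP", distance) pairs,
    highest-ranked first. Returns the optimal VOT category."""
    def profile(cand):
        marks = []
        for kind, val in ranking:
            if kind == "CATEG":
                marks.append(1 if cand == val else 0)
            else:  # "WARP"
                marks.append(1 if abs(cand - target) >= val else 0)
        return tuple(marks)
    return min(candidates, key=profile)

# Ranking as in Tableau 2 (Mandarin aspirated stop, target VOT 75.4 ms):
ranking = [("CATEG", 82.2), ("CATEG", 91.5), ("WARP", 16.1),
           ("WARP", 6.8), ("CATEG", 75.4)]
print(vot_winner(75.4, [91.5, 82.2, 75.4], ranking))  # 75.4
```

With the two high-ranked CATEG constraints ruling out the English-like categories, the learner's output stays at the native Mandarin VOT value.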

18
Constraint-ranking for English pʰ by NSE
  • Tableau 1: English pʰ of NSE

19
Constraint-ranking for Mandarin pʰ by Taiwanese EFL
learners
  • Tableau 2: Mandarin pʰ by Taiwanese EFL learners

75.4 ms, intended pʰ   CATEG (/82.2/)   CATEG (/91.5/)   WARP (16.1)   WARP (6.8)   CATEG (/75.4/)
   a. 91.5 ms                                 *!              *             *
   b. 82.2 ms                *!                                             *
☞ c. 75.4 ms                                                                             *
20
Constraint-ranking for Interlanguage pʰ by
Taiwanese EFL learners
  • Tableau 3: Interlanguage pʰ by HEFL

91.5 ms, intended pʰ   WARP (16.1)   CATEG (/91.5/)   CATEG (/75.4/)   CATEG (/82.2/)   WARP (9.3)
☞ a. 91.5 ms                              *
☞ b. 82.2 ms                                                                *                *
   c. 75.4 ms              *!                               *                                *
21
Tableau 4: Interlanguage pʰ by LEFL

91.5 ms, intended pʰ   WARP (16.1)   CATEG (/91.5/)   CATEG (/75.4/)   CATEG (/78.7/)   WARP (12.8)
☞ a. 91.5 ms                              *
☞ b. 78.7 ms                                                                *                *
   c. 75.4 ms              *!                               *                                *
22
An OT-based Analysis of Mandarin and English
diphthongs for Taiwanese EFL (MSL) learners
23
  • (1)

Mandarin vowels        SM vowels
i    ü    u            i         u
e         o            e         o
     a                      a
24
  • (2)

Mandarin diphthong construction principle
front        back
i            u
e            o

[-back] vowels:
  • (3)

ie    ie (?), t?ie (?)
ei    pei (?), kei (?)

[+back] vowels:
  • (4)

uo    uo (?), kuo (?)
ou    ou (?), kou (?)
25
  • (4)
  • N (N??,????)
  • ? ? -? ?

26
  • (5)

SM vowels
front        back
i            u
e            o
      a

Different [back] features:
  • (6)

iu    iu (?), kiu (?)
ui    ui (?), kui (?)

  • (7)

io    ??    kio (?), io (?)

  • (8)

ue    ??    kue (?), hue (?)
27
  • (9)
  • N (N = nucleus; ? = same; * = ungrammatical)
  • ? ? ? ?

28
Input     Violation              Adjustment   Result   Samples
a. /ie/   same front features    i deleted    e        ?????en, ????? e
b. /ei/   same front features    i deleted    e        ????? pe, ????? ke
c. /uo/   same back features     u deleted    o        ????? o, ????? to
d. /ou/   same back features     u deleted    o        ????? o, ????? to
29
5. Concluding remarks
  1. Theoretical implications
  2. Empirical support

30
The end