Title: Phonological constraints as filters in SLA
Phonological constraints as filters in SLA
- Raung-fu Chung
- rfchung_at_mail.nsysu.edu.tw
1. Introduction
- The main components of this article are
- The framework of Optimality Theory
- Acquisition and learnability in OT
- Our model
- Concluding remarks
2. The framework of Optimality Theory
- For instance, the English plural morpheme (orthographic -s) can be realized as either [s] or [z], depending on the preceding sound of the stem
- (2)
- cat [kæt], cats [kæts]
- dog [dɔg], dogs [dɔgz]
- hen [hɛn], hens [hɛnz]
- The input form is taken to be /z/ (cf. the tableau below). Then we propose the following constraint, Voiced Obstruent Prohibition (VOP).
- (3) Voiced Obstruent Prohibition, VOP
- No obstruents can be voiced.
- Another constraint called for is
- (4) Obstruent Voicing Harmony, OVH
- Adjacent obstruents should share the same value for [voice].
- The third constraint is a universal faithfulness constraint of the Ident-IO family, here referred to as Ident-IO(voice) (Ident = identical, IO = Input and Output).
- (5) Ident-IO(voice)
- The value of the [voice] feature of the output should be identical with that of the input.
- As for the ranking, it is obvious, as shown below.
- (6) OVH >> VOP ('>>' = precedes, i.e. is ranked prior to)
- Adding Ident-IO(voice), we have the following ranking for all three constraints just proposed.
- (7) OVH >> Id-IO(voice) >> VOP
/dɔg-z/        OVH    Id-IO(voice)    VOP
☞ a. dɔg-z
  b. dɔg-s     *!
  c. dɔk-s            *!
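To make the tableau evaluation concrete, here is a minimal sketch of ranked-constraint evaluation in Python. The constraint definitions follow (3)-(5) and the ranking follows (7); the segment classes, function names and ASCII transcriptions are illustrative choices added here, not part of the original analysis.

# Minimal OT evaluation for the English plural example (illustrative sketch).
# Constraints are listed from highest- to lowest-ranked: OVH >> Id-IO(voice) >> VOP.
VOICED = set("bdgvz")          # rough voicing class for this toy example
OBSTRUENTS = set("pbtdkgszfv")

def ovh(inp, out):
    # Obstruent Voicing Harmony: adjacent obstruents must agree in [voice].
    return sum(1 for a, b in zip(out, out[1:])
               if a in OBSTRUENTS and b in OBSTRUENTS and (a in VOICED) != (b in VOICED))

def ident_io_voice(inp, out):
    # Ident-IO(voice): output segments keep the [voice] value of the input.
    return sum(1 for i, o in zip(inp, out) if (i in VOICED) != (o in VOICED))

def vop(inp, out):
    # Voiced Obstruent Prohibition: no voiced obstruents in the output.
    return sum(1 for o in out if o in OBSTRUENTS and o in VOICED)

RANKING = [ovh, ident_io_voice, vop]   # OVH >> Id-IO(voice) >> VOP

def evaluate(inp, candidates):
    # The optimal candidate has the lexicographically smallest violation profile.
    return min(candidates, key=lambda c: tuple(con(inp, c) for con in RANKING))

print(evaluate("dogz", ["dogz", "dogs", "doks"]))   # -> dogz, as in the tableau above
print(evaluate("katz", ["katz", "kats", "kadz"]))   # -> kats: the suffix devoices after [t]

Lexicographic comparison of violation profiles is exactly what the tableau does by hand: a candidate is eliminated as soon as it fares worse on a higher-ranked constraint.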
3. Acquisition and learnability in OT
3.1 The notion of learnability
- a. The formal learnability in the sense of Tesar & Smolensky (1993) assumed that all constraints started out being unranked. In later empirical studies (e.g. Gnanadesikan, 1996; Levelt, 1995), it is pointed out that outputs are initially governed by markedness constraints, rather than by faithfulness constraints. This leads to the proposal that in the initial state of the grammar, all markedness constraints outrank all faithfulness constraints, or M >> F for short (Kager, Pater & Zonneveld, 2004; Hayes, 2004; Prince & Tesar, 2004).
- b. There are two algorithms accounting for the learnability of constraint rankings: the Constraint Demotion Algorithm (CDA) and the Gradual Learning Algorithm (GLA). The Constraint Demotion Algorithm (CDA), proposed by Tesar & Smolensky (1993, 1998, 2000), ranks a set of constraints on the basis of positive input data. For example, L1 acquisition can be interpreted as constraint demotion (Tesar & Smolensky, 1996); a sketch of the demotion step is given after this list.
- c. The Gradual Learning Algorithm (GLA), developed in Boersma (1997, 1998) and Boersma & Hayes (2001), handles variation in the input and explains gradient well-formedness. The GLA is helpful in accounting for categorization errors a learner makes in both production and perception. L2 learners with restricted constraint sets have to gradually learn to rerank the constraints by raising or lowering the existing ones; one such update step is sketched after this list.
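As announced in (b), here is a compressed sketch of the demotion step behind the CDA, using the winner/loser comparison from the plural example in section 2. It is a simplified, single-pair version for illustration only; Tesar & Smolensky's full algorithm is error-driven and restratifies the whole constraint set recursively.

# Simplified Constraint Demotion (illustrative): given one winner/loser pair and each
# constraint's violation counts, demote every loser-preferring constraint to the stratum
# just below the highest-ranked winner-preferring constraint.
def demote(strata, winner_viol, loser_viol):
    winner_pref = {c for c in winner_viol if winner_viol[c] < loser_viol[c]}
    loser_pref = {c for c in winner_viol if winner_viol[c] > loser_viol[c]}
    pivot = min(i for i, s in enumerate(strata) if s & winner_pref)
    new_strata = [set(s) for s in strata] + [set()]
    for i in range(pivot + 1):
        moved = new_strata[i] & loser_pref
        new_strata[i] -= moved
        new_strata[pivot + 1] |= moved        # demoted to just below the pivot stratum
    return [s for s in new_strata if s]

# Toy step: the learner hears [dogz] (winner) but its M >> F grammar prefers *[doks] (loser).
strata = [{"VOP", "OVH"}, {"Ident-IO(voice)"}]            # markedness above faithfulness
winner = {"OVH": 0, "Ident-IO(voice)": 0, "VOP": 3}       # violations of [dogz]
loser = {"OVH": 0, "Ident-IO(voice)": 2, "VOP": 1}        # violations of *[doks]
print(demote(strata, winner, loser))
# -> [{'OVH'}, {'Ident-IO(voice)'}, {'VOP'}]: VOP is demoted below Ident-IO(voice)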
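In the same spirit, a sketch of one GLA update step (stochastic OT in the style of Boersma): every constraint carries a real-valued ranking, evaluation adds Gaussian noise, and on an error the learner promotes the constraints that favour the observed form and demotes those that favour its own erroneous output. The plasticity and noise values below are placeholders, not figures from the cited work.

import random

ranking = {"OVH": 100.0, "Ident-IO(voice)": 100.0, "VOP": 100.0}   # initial ranking values
PLASTICITY, NOISE = 0.1, 2.0                                        # placeholder settings

def eval_order(ranking):
    # Constraints ordered highest to lowest after adding evaluation noise.
    noisy = {c: v + random.gauss(0, NOISE) for c, v in ranking.items()}
    return sorted(noisy, key=noisy.get, reverse=True)

def gla_update(ranking, winner_viol, learner_viol):
    # Promote constraints favouring the adult form; demote those favouring the error.
    for c in ranking:
        if winner_viol[c] < learner_viol[c]:
            ranking[c] += PLASTICITY
        elif winner_viol[c] > learner_viol[c]:
            ranking[c] -= PLASTICITY

# Toy step: the adult form [dogz] beats the learner's current output *[doks].
gla_update(ranking,
           winner_viol={"OVH": 0, "Ident-IO(voice)": 0, "VOP": 3},
           learner_viol={"OVH": 0, "Ident-IO(voice)": 2, "VOP": 1})
print(ranking)            # Ident-IO(voice) promoted, VOP demoted, OVH unchanged
print(eval_order(ranking))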
4. Our model
- L1 filter
- Interlanguage ranking
- Empirical arguments
- An OT-based analysis of VOT production by Taiwanese EFL learners
- The diphthong construction of Mandarin and English for Taiwanese learners
- Errors in the production of yi and wu by Mandarin EFL learners
An OT-based Analysis of VOT Production by Taiwanese EFL Learners
- Acoustic values of VOT (Liou, 2005)
- Note: NSE = native speakers of English; HEFL = high-proficiency EFL learners; LEFL = low-proficiency EFL learners; MAN = Mandarin; SM = Southern Min
Constraints for VOT
- 1. The CATEG(ORIZE) family, which punishes productive categories with certain acoustic values. For example, CATEG(VOT /91.5ms/) militates against producing /91.5ms/ as a particular category.
- 2. The WARP family, which demands that every segment be produced as a member of the most similar available category. For instance, WARP(VOT 9.3ms) requires that an acoustic segment with a VOT of 91.5ms not be produced as any VOT category that is 9.3ms off (or more), i.e. as /82.2ms/ or /100.8ms/ or anything even farther away. A sketch of how these two families assign violations is given below.
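A small sketch of how these two constraint families assign violations to a candidate production; the numeric values are the ones cited above, while the function names are illustrative only.

def categ_violation(produced_category, banned_category):
    # CATEG(/x/): one violation if the candidate uses the banned category /x/ at all.
    return 1 if produced_category == banned_category else 0

def warp_violation(intended_vot, produced_category, distance):
    # WARP(d): one violation if the produced category is d ms (or more) away from the intended VOT.
    return 1 if abs(intended_vot - produced_category) >= distance else 0

# An intended VOT of 91.5 ms produced as the /82.2/ category:
print(categ_violation(82.2, 91.5))        # 0: the banned /91.5/ category is not used
print(warp_violation(91.5, 82.2, 9.3))    # 1: /82.2/ is 9.3 ms off, so WARP(9.3) is violated
print(warp_violation(91.5, 82.2, 16.1))   # 0: the mismatch is smaller than 16.1 ms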
Constraint ranking for English pʰ by NSE
- Tableau 1: English pʰ of NSE
Constraint ranking for Mandarin pʰ by Taiwanese EFL learners
- Tableau 2: Mandarin pʰ by Taiwanese EFL learners
75.4 ms, intended pʰ    CATEG(/82.2/)   CATEG(/91.5/)   WARP(16.1)   WARP(6.8)   CATEG(/75.4/)
  a. 91.5 ms                            *!
  b. 82.2 ms            *!
☞ c. 75.4 ms                                                                      *
Constraint ranking for interlanguage pʰ by Taiwanese EFL learners
- Tableau 3: Interlanguage pʰ by HEFL
91.5 ms, intended pʰ    WARP(16.1)   CATEG(/91.5/)   CATEG(/75.4/)   CATEG(/82.2/)   WARP(9.3)
  a. 91.5 ms                         *!
☞ b. 82.2 ms                                                         *               *
  c. 75.4 ms            *!                           *
Tableau 4: Interlanguage pʰ by LEFL
91.5 ms, intended pʰ    WARP(16.1)   CATEG(/91.5/)   CATEG(/75.4/)   CATEG(/78.7/)   WARP(12.8)
  a. 91.5 ms                         *!
☞ b. 78.7 ms                                                         *               *
  c. 75.4 ms            *!                           *
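The same lexicographic evaluation used for the plural tableau reproduces the winners of Tableaux 2-4. The sketch below assumes the rankings, candidate sets and numeric categories shown in those tableaux; the Python encoding itself is only an illustration.

# Choose the VOT category with the best violation profile under a given ranking
# (constraints listed from highest- to lowest-ranked).
def warp(d):
    return lambda intended, cat: 1 if abs(intended - cat) >= d else 0

def categ(banned):
    return lambda intended, cat: 1 if cat == banned else 0

def winner(intended, candidates, ranking):
    return min(candidates, key=lambda cat: tuple(con(intended, cat) for con in ranking))

man = [categ(82.2), categ(91.5), warp(16.1), warp(6.8), categ(75.4)]    # Tableau 2
hefl = [warp(16.1), categ(91.5), categ(75.4), categ(82.2), warp(9.3)]   # Tableau 3
lefl = [warp(16.1), categ(91.5), categ(75.4), categ(78.7), warp(12.8)]  # Tableau 4

print(winner(75.4, [91.5, 82.2, 75.4], man))    # -> 75.4, as in Tableau 2
print(winner(91.5, [91.5, 82.2, 75.4], hefl))   # -> 82.2, as in Tableau 3
print(winner(91.5, [91.5, 78.7, 75.4], lefl))   # -> 78.7, as in Tableau 4

On this view the interlanguage ranking behaves like a filter: the intended English VOT of 91.5 ms is mapped onto the nearest category the learner's grammar makes available.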
An OT-based Analysis of Mandarin and English Diphthongs for Taiwanese EFL (MSL) Learners
Mandarin vowels          SM vowels
  i    ü    u              i         u
  e         o              e         o
       a                        a
Mandarin diphthong construction principle
  front   back
  i       u
  e       o
- [-back] vowels form:
  ie: ie (?), t?ie (?)
  ei: pei (?), kei (?)
- [+back] vowels form:
  uo: uo (?), kuo (?)
  ou: ou (?), kou (?)
- (4) *NN (N = nucleus; two nuclei sharing the same feature value are ungrammatical)
SM vowels
  front   back
  i       u
  e       o
     a
- Different [back] features form:
  iu: iu (?), kiu (?)
  ui: ui (?), kui (?)
  io: kio (?), io (?)
  ue: kue (?), hue (?)
- (9) *NN (N = nucleus; two nuclei sharing the same value are ungrammatical)
Input      Violation               Adjustment    Result    Samples
a. /ie/    same [front] features   i deleted     e         ...en; e
b. /ei/    same [front] features   i deleted     e         pe; ke
c. /uo/    same [back] features    u deleted     o         o; to
d. /ou/    same [back] features    u deleted     o         o; to
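A minimal sketch of the adjustment summarized in the table above: when the two nuclei of a Mandarin diphthong share the same [back] value, the high vowel is deleted, while diphthongs whose nuclei differ in [back] (the Southern Min pattern) survive intact. The feature classification and the deletion repair follow the slides; the code is an added illustration.

# Same-[back] nuclei are banned (constraints (4)/(9)); the repair deletes the high vowel.
BACK = {"i": False, "e": False, "u": True, "o": True}   # [back] values of the vowels above
HIGH = {"i", "u"}

def adjust(diphthong):
    v1, v2 = diphthong
    if BACK[v1] == BACK[v2]:                                    # violates the same-[back] ban
        return "".join(v for v in diphthong if v not in HIGH)   # repair: delete the high vowel
    return diphthong                                            # different [back] values: kept

for d in ["ie", "ei", "uo", "ou", "iu", "ui"]:
    print(d, "->", adjust(d))
# ie -> e, ei -> e, uo -> o, ou -> o, iu -> iu, ui -> ui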
5. Concluding remarks
- Theoretical implications
- Empirical support
The end