1
In Defense of Contextual Vocabulary Acquisition
  • How to Do Things with Words in Context
  • William J. Rapaport
  • Department of Computer Science & Engineering,
  • Department of Philosophy, and Center for
    Cognitive Science
  • State University of New York at Buffalo
  • Buffalo, NY 14260
  • http://www.cse.buffalo.edu/~rapaport/CVA/

2
"The meaning of things lies not in themselves but
in our attitudes toward them."
  - Antoine de Saint-Exupéry, Wisdom of the Sands (1948)
3
"The meaning of things lies not in themselves but
in our attitudes toward them."
[with "things" replaced by "words"]
4
Terminology: Meaning of Meaning
  • "the meaning of" a word vs. "a meaning for" a
    word
  • "the" → single meaning
  • "of" → meaning belongs to word
  • "a" → many possible meanings, depending on
    textual context,
  • reader's prior knowledge, etc.
  • "for" → reader constructs meaning, gives it
    to word

5
Contextual Vocabulary Acquisition
  • CVA = active, deliberate acquisition of a meaning
    for a word in a text by reasoning from "context"
  • context ≠ textual context
  • surrounding words = "co-text"
  • context = wide context
  • = internalized co-text
  • = reader's interpretive mental model of textual
    co-text
  • integrated via belief revision
  • infer new beliefs from internalized co-text +
    prior knowledge
  • remove inconsistent beliefs
  • with reader's prior knowledge
  • including language knowledge
  • including previous hypotheses about the word's
    meaning
  • but not including external sources (dictionary,
    humans)
  • ∴ context for CVA is in the reader's mind, not in
    the text

6
Overview
  • CVA project:
  • computational theory of how to figure out
    (compute) a meaning for an unfamiliar word from
    wide context
  • convert algorithms to a teachable curriculum
  • Current status:
  • Have theory
  • Have computational implementation
  • Know that people do incidental CVA
  • Possibly best explanation of how we learn
    vocabulary:
  • given # of words a high-school grad knows (~45K)
    and # of years to learn them (18): ~2.5K words/year
  • but only ~10% of them are taught in 12 school years
  • 2 groups of researchers say CVA can't be done
    (well)
  • This talk: Why they're wrong.

7
Note: All contextual reasoning is done in this
context:
[Diagram: the belief-revised (B-R) Integrated KB, containing prior knowledge
PK1-PK4, the internalized text sentences I(T1)-I(T3) from T1-T3, and the
inferred propositions P5-P7.]
8
What does "brachet" mean?

9
What Does "Brachet" Mean? (From Malory's Morte
D'Arthur; page numbers in brackets)
  • 1. There came a white hart running into the
    hall with a white brachet next to him, and thirty
    couples of black hounds came running after them.
    [66]
  • As the hart went by the sideboard, the white
    brachet bit him. [66]
  • The knight arose, took up the brachet and rode
    away with the brachet. [66]
  • A lady came in and cried aloud to King Arthur,
    "Sire, the brachet is mine." [66]
  • There was the white brachet which bayed at him
    fast. [72]
  • 18. The hart lay dead; a brachet was biting on
    his throat, and other hounds came behind. [86]

10
Computational CVA
  • Based on Karen Ehrlich's CS Ph.D. dissertation
    (1995)
  • Implemented in the SNePS knowledge-representation,
    reasoning, and acting (KRRA) system
  • KB: SNePS representation of reader's prior
    knowledge
  • I/P: SNePS representation of word + co-text
  • Processing:
  • Inferences drawn / belief revision during text
    input
  • Simulates reading
  • N & V definition algorithms deductively search
    this belief-revised, integrated KB (the
    "context") for definitional information
  • O/P: Definition frame (see sketch below)
  • slots (features): classes, structure, actions,
    properties, etc.
  • fillers (values): info gleaned from context (=
    integrated KB)
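
As a rough illustration of the O/P above, here is a minimal Python sketch of
what such a definition frame might look like as a data structure (the real
system builds this inside SNePS, not Python; the slot names come from this
slide, everything else is assumed for illustration):

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DefinitionFrame:
        """Illustrative O/P definition frame; slot names follow the slide above."""
        word: str
        class_inclusions: List[str] = field(default_factory=list)       # e.g., "animal"
        possible_actions: List[str] = field(default_factory=list)       # e.g., "bite buttock"
        possible_properties: List[str] = field(default_factory=list)    # e.g., "white"
        possibly_similar_items: List[str] = field(default_factory=list) # e.g., "mammal"

    # Example mirroring the first "brachet" output shown two slides later:
    brachet = DefinitionFrame(
        word="brachet",
        class_inclusions=["phys obj"],
        possible_properties=["white"],
        possibly_similar_items=["animal", "mammal", "deer", "horse", "pony", "dog"],
    )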

11
Cassie learns what "brachet" means.
Background info about harts, animals, King Arthur, etc.
No info about brachets.
Input: formal-language (SNePS) version of simplified English.

A hart runs into King Arthur's hall.
  In the story, B12 is a hart.
  In the story, B13 is a hall.
  In the story, B13 is King Arthur's.
  In the story, B12 runs into B13.
A white brachet is next to the hart.
  In the story, B14 is a brachet.
  In the story, B14 has the property "white".
Therefore, brachets are physical objects.
  (deduced while reading; Cassie believes that
  only physical objects have color)
12
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: phys obj,
  Possible Properties: white,
  Possibly Similar Items: animal, mammal, deer, horse, pony, dog,
I.e., a brachet is a physical object that can be
white and that might be like an animal, mammal,
deer, horse, pony, or dog.
13
A hart runs into King Arthur's hall.
A white brachet is next to the hart.
The brachet bites the hart's buttock.
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: white,
  Possibly Similar Items: mammal, pony,
14
A hart runs into King Arthur's hall.
A white brachet is next to the hart.
The brachet bites the hart's buttock.
The knight picks up the brachet.
The knight carries the brachet.
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: small, white,
  Possibly Similar Items: mammal, pony,
15
A hart runs into King Arthur's hall.
A white brachet is next to the hart.
The brachet bites the hart's buttock.
The knight picks up the brachet.
The knight carries the brachet.
The lady says that she wants the brachet.
--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: animal,
  Possible Actions: bite buttock,
  Possible Properties: valuable, small, white,
  Possibly Similar Items: mammal, pony,
16
  • A hart runs into King Arthur's hall.
    A white brachet is next to the hart.
    The brachet bites the hart's buttock.
    The knight picks up the brachet.
    The knight carries the brachet.
    The lady says that she wants the brachet.
  • The brachet bays at Sir Tor. [background
    knowledge: only hunting dogs bay]
  • --> (defineNoun "brachet")
  • Definition of brachet:
  • Class Inclusions: hound, dog,
  • Possible Actions: bite buttock, bay, hunt,
  • Possible Properties: valuable, small, white,
  • I.e., a brachet is a hound (a kind of dog) that
    can bite, bay, and hunt,
  • and that may be valuable, small, and white.

17
General Comments
  • System's behavior ≈ human protocols
  • System's definition ≈ OED's definition:
  • "A brachet is a kind of hound which hunts by
    scent"

18
How Does Our System Work?
  • Uses a semantic-network computer system
  • semantic networks ≈ concept maps
  • serves as a model of the reader
  • represents:
  • reader's prior knowledge
  • the text being read
  • can reason about the text and the reader's
    knowledge

19
Fragment of reader's prior knowledge:
  m3:  In real life, white is a color
  m6:  In real life, harts are deer
  m8:  In real life, deer are mammals
  m11: In real life, halls are buildings
  m12: In real life, b1 is named "King Arthur"
  m14: In real life, b1 is a king (etc.)
20
m16: if v3 has property v2,
     and v2 is a color,
     and v3 is a member of class v1,
     then v1 is a kind of physical object
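
A hedged Python sketch (not SNePS code) of the kind of forward inference m16
licenses, using invented predicate names; it derives the "brachets are
physical objects" belief shown on slide 11:

    # Rule (illustrative): if v3 has property v2, v2 is a color, and v3 is a
    # member of class v1, then v1 is a kind of physical object.
    prior_knowledge = {("is-color", "white")}
    story = {("has-property", "B14", "white"), ("member-of", "B14", "brachet")}
    kb = prior_knowledge | story

    derived = set()
    for (_, x, prop) in (f for f in kb if f[0] == "has-property"):
        if ("is-color", prop) in kb:
            for (_, y, cls) in (f for f in kb if f[0] == "member-of"):
                if y == x:
                    derived.add(("subclass-of", cls, "physical object"))

    print(derived)  # {('subclass-of', 'brachet', 'physical object')}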
21
Reading the story:
  m17: In the story, b2 is a hart
  m24: In the story, the hart runs into b3
       (b3 is King Arthur's hall: not shown)
       (harts are deer: not shown)
22
The entire network, showing the reader's mental
context consisting of prior knowledge, the story,
and inferences. The definition algorithm searches
this network and abstracts parts of it to produce
a (preliminary) definition of "brachet".
23
Implementation
  • SNePS (Stuart C. Shapiro & SNeRG):
  • Intensional, propositional semantic-network
    knowledge-representation and reasoning system
  • Formula-based and path-based reasoning
  • I.e., logical inference and generalized inheritance
  • SNeBR: belief-revision system
  • Used for revision of definitions
  • SNaLPS: natural-language input/output
  • Cassie: computational cognitive agent

24
How It Works
  • SNePS represents
  • background knowledge + text information
  • in a single, consolidated semantic network
  • Algorithms deductively search the network for
    slot-fillers for the definition frame
  • Search is guided by desired slots
  • E.g., prefers general info over particular info,
    but takes what it can get

25
Noun Algorithm
  • Find or infer (see the sketch after this list):
  • Basic-level class memberships (e.g., "dog",
    rather than "animal")
  • else most-specific-level class memberships
  • else names of individuals
  • Properties of Ns (else, of individual Ns)
  • Structure of Ns (else ...)
  • Functions of Ns (else ...)
  • Acts that Ns perform (else ...)
  • Agents that perform acts w.r.t. Ns
  • and the acts they perform (else ...)
  • Ownership
  • Synonyms
  • Else do syntactic/algebraic manipulation:
  • "Al broke a vase" → a vase is something Al broke
  • Or: a vase is a breakable physical object
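
As mentioned above, a minimal Python sketch of the slot-preference idea behind
the noun algorithm (basic-level classes, else any known classes, else names of
individuals); the KB format and function name are invented for illustration
and are not the actual SNePS search:

    def class_inclusions(noun, kb, basic_level):
        """Illustrative preference order for the class-membership slot."""
        classes = [c for (n, c) in kb.get("member-of", []) if n == noun]
        basic = [c for c in classes if c in basic_level]
        if basic:
            return basic        # prefer basic-level classes (e.g., "dog")
        if classes:
            return classes      # else whatever (most-specific) classes are known
        return [nm for (x, nm) in kb.get("named", []) if x == noun]  # else names

    kb = {"member-of": [("brachet", "hound"), ("brachet", "dog")], "named": []}
    print(class_inclusions("brachet", kb, basic_level={"dog"}))      # ['dog']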

26
Verb Algorithm
  • Infer (see the sketch after this list):
  • properties of V
  • superclasses of V
  • transitivity information
  • similar actions (and delete dissimilar actions)
  • Conceptual-Dependency category
  • info about manner of V (from/to, transfer
    kind, instrument)
  • causes and effects
  • Also return class membership of:
  • agent
  • object
  • indirect object
  • instrument
  • Also: preliminary work on an adjective algorithm
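
A similarly hedged Python sketch of the verb algorithm's output slots (the
field names mirror the list above; the data structure itself is an assumption,
not the system's actual representation):

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class VerbDefinitionFrame:
        """Illustrative verb-definition frame mirroring the slots listed above."""
        verb: str
        properties: List[str] = field(default_factory=list)
        superclasses: List[str] = field(default_factory=list)    # e.g., "hit" for "smite"
        transitivity: Optional[str] = None
        similar_actions: List[str] = field(default_factory=list)
        cd_category: Optional[str] = None                        # Conceptual-Dependency category
        causes: List[str] = field(default_factory=list)
        effects: List[str] = field(default_factory=list)
        agent_class: Optional[str] = None                        # class membership of agent
        object_class: Optional[str] = None                       # class membership of object

    # Example loosely based on the later "smite" slides:
    smite = VerbDefinitionFrame(verb="smite", superclasses=["hit"],
                                transitivity="transitive",
                                effects=["object may be dead"],
                                agent_class="person", object_class="person")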

27
Belief Revision
  • Used to revise definitions of words used with a
    different sense from the current meaning hypothesis
  • SNeBR (ATMS; Martins & Shapiro 88):
  • If inference leads to a contradiction, then
  • SNeBR asks user to remove culprit(s)
  • and automatically removes consequences inferred
    from the culprit
  • SNePSwD (SNePS w/ Defaults; Martins & Cravo 91)
  • Previously used to automate step 1, above
  • Now, legacy code
  • AutoBR (Johnson & Shapiro, in progress)
  • and new default reasoner (Bhushan & Shapiro)
  • Will replace SNePSwD

28
Revision & Expansion
  • Removal & revision being automated via SNePSwD by
    ranking all propositions with kn_cat, from most
    certain to least certain (see the sketch after
    this list):
  • intrinsic: info re: language and fundamental
    background info ("before" is transitive)
  • story: info in the text ("King Lot rode to town")
  • life: background info w/o variables or inference
    ("dogs are animals")
  • story-comp: info inferred from the text ("King
    Lot is a king", "rode on a horse")
  • life-rule.1: everyday commonsense background info
    (BearsLiveYoung(x) → Mammal(x))
  • life-rule.2: specialized background info
    (x smites y → x kills y by hitting y)
  • questionable: already-revised life-rule.2;
    not part of input
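
A hedged Python sketch of how a kn_cat-style ranking could be used to choose
the least-certain culprit when a contradiction is detected (the ordering comes
from this slide; the selection code itself is an illustration, not SNePSwD):

    KN_CAT_RANK = ["intrinsic", "story", "life", "story-comp",
                   "life-rule.1", "life-rule.2", "questionable"]

    def pick_culprit(contradictory_beliefs):
        """Retract the belief whose kn_cat is least certain (latest in the ranking)."""
        return max(contradictory_beliefs,
                   key=lambda b: KN_CAT_RANK.index(b["kn_cat"]))

    beliefs = [
        {"prop": "King Arthur drew Excalibur (so was alive)", "kn_cat": "story-comp"},
        {"prop": "smite(x,y,t) => hit & dead & cause",        "kn_cat": "life-rule.2"},
    ]
    print(pick_culprit(beliefs)["prop"])  # the life-rule.2 rule is chosen for revision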

29
Belief Revision: "smite"
  • Misunderstood word: 2-stage subtractive
    revision
  • Background knowledge includes:
  • (*) smite(x,y,t) → hit(x,y,t) & dead(y,t) &
    cause(hit(x,y,t), dead(y,t))
  • P1: King Lot smote down King Arthur
  • D1: If person x smites person y at time t, then x
    hits y at t, and y is dead at t
  • Q1: What properties does King Arthur have?
  • R1: King Arthur is dead.
  • P2: King Arthur drew Excalibur.
  • Q2: When did King Arthur do this?
  • SNeBR is invoked:
  • KA's drawing E is inconsistent with his being dead
  • (*) replaced by: smite(x,y,t) → hit(x,y,t) &
    possibly dead(y,t) & [dead(y,t) → cause(hit, dead)]
  • D2: If person x smites person y at time t, then
    x hits y at t, and possibly y is dead at t
  • P3: another passage, in which smiting occurs
    without death
  • D3: If person x smites person y at time t, then
    x hits y at t

30
Belief Revision: "dress"
  • additive revision
  • Background info includes:
  • (1) dresses(x,y) → ∃z[clothing(z) & wears(y,z)]
  • (2) Spears don't wear clothing
    (both kn_cat = life-rule.1)
  • P1: King Arthur dressed himself.
  • D1: A person can dress itself; result: it wears
    clothing.
  • P2: King Claudius dressed his spear.
  • Cassie infers: King Claudius's spear wears
    clothing.
  • Q2: What wears clothing?
  • SNeBR is invoked:
  • KC's spear wears clothing is inconsistent with (2).
  • (1) replaced by: dresses(x,y) → ∃z[clothing(z) &
    wears(y,z)] ∨ NEWDEF
  • Replace (1), not (2), because of the verb in the
    antecedent of (1) (Gentner)
  • P3: other passages in which dressing spears
    precedes fighting
  • D2: A person can dress a spear or a person;
  • result: person wears clothing or person
    is enabled to fight

31
Ongoing Research: From Algorithm to Curriculum
  • more robust algorithms
  • better N coverage needed
  • much better V coverage needed
  • no general adjective/adverb coverage yet
  • need internal context (morphology, etc.)
  • need NL interface
  • need acting component
  • need curriculum
  • CVA is taught, but not well (emphasis on "guessing")
  • we have an explicit, teachable theory of how to do
    CVA
  • joint work w/ Michael Kibby, UB/LAI/Reading
    Clinic

32
State of the Art: Vocabulary Learning
  • Some dubious contributions:
  • Mueser 1984: Practicing Vocabulary in Context
  • BUT: "context" = definition!!
  • Clarke & Nation 1980: a strategy (algorithm?):
  • Look at word & context; determine POS
  • Look at grammatical context
  • E.g., who does what to whom?
  • Look at wider context
  • E.g., search for Sternberg-like clues
  • Guess the word & check your guess

33
CVA: From Algorithm to Curriculum
  • "guess the word"
  • "then a miracle occurs"
  • Surely,
  • we computer scientists
  • can be more explicit!

34
Terminology: "Guessing"?
  • Does the reader
  • guess a meaning?!
  • not computational!
  • deduce a meaning?
  • too restrictive; ignores other kinds of inference
  • infer a meaning?
  • too vague; ignores other kinds of reasoning (cf.
    Herbert Simon)
  • figure out a meaning?
  • just vague enough?
  • My preference:
  • The reader computes a meaning!

35
Terminology: Co(n)text
  • "co-text" or "textual context" = surrounding
    words
  • "context" = wide context =
  • internalized co-text
  • integrated with
  • reader's prior knowledge
  • = reader's internalized mental model of the text
  • involves local interpretation (cf. McKoon &
    Ratcliff)
  • pronoun resolution, simple inferences (e.g.,
    proper names)
  • and global interpretation (full use of available
    PK)
  • can involve misinterpretation (see later slide)
  • integrated via belief revision:
  • new beliefs added by inference from text + prior
    knowledge
  • old beliefs removed (usually from prior knowledge
    base)

36
[Diagram: reader's prior knowledge PK1-PK4; the text, not yet read.]
37
[Diagram: text sentence T1 appears alongside prior knowledge PK1-PK4.]
38
[Diagram: T1 is internalized into the Integrated KB as I(T1), joining PK1-PK4.]
39
[Diagram: within the belief-revised (B-R) Integrated KB, inference from I(T1)
and prior knowledge yields P5.]
40
[Diagram: T2 is internalized as I(T2); inference adds P6.]
41
[Diagram: T3 is internalized as I(T3); the B-R Integrated KB now contains
PK1-PK4, I(T1)-I(T3), P5, and P6.]
42
[Diagram: same as the previous slide.]
43
Note: All contextual reasoning is done in this
context:
[Diagram: the belief-revised (B-R) Integrated KB, containing prior knowledge
PK1-PK4, internalized text I(T1)-I(T3), and inferred propositions P5-P7.]
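
A minimal Python sketch of the reading loop these diagrams depict (function
names are invented; the real system does this via SNePS representation,
inference, and belief revision):

    def read(text_sentences, prior_knowledge, internalize, infer):
        """Internalize each Ti as I(Ti), then infer new propositions (P5, P6, ...)
        from prior knowledge + internalized co-text."""
        kb = set(prior_knowledge)      # PK1, PK2, PK3, PK4, ...
        for t in text_sentences:       # T1, T2, T3, ...
            kb.add(internalize(t))     # I(T1), I(T2), I(T3), ...
            kb |= infer(kb)            # P5, P6, P7, ... (belief revision may also retract)
        return kb                      # the "wide context" that CVA reasons over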
44
On Misinterpretation
  • Sign seen on a truck parked outside the cafeteria
    at the Student Union:
  • "Mills Wedding and Specialty Cakes"

45
On Misinterpretation
  • Sign seen on a truck parked outside the cafeteria
    at the Student Union:
  • "Mills Welding and Specialty Gases"

46
CVA as Science & Detection
  • CVA = hypothesis generation & testing
  • a scientific task:
  • develop a theory of the word's meaning
  • not guessing, but:
  • "In science, guessing is called hypothesis
    formation" (Loui)
  • detective work:
  • finding clues
  • not "who done it?", but "what does it mean?"
  • susceptible to revision upon further evidence

47
2 Problematic Assumptions
  • CVA assumes that
  • reader is consciously aware of the unfamiliar
    word
  • reader notes its unfamiliarity
  • CVA assumes that, between encounters,
  • reader remembers the word
  • reader remembers the hypothesized meaning

48
I. Are All Contexts Created Equal?
  • Beck, McKeown, & McCaslin (1983),
  • "Vocabulary Development: Not All Contexts Are
    Created Equal",
  • Elementary School Journal 83(3): 177-181.
  • "it is not true that every context is an
    appropriate or effective instructional means for
    vocabulary development"

49
Role of Prior Knowledge
  • Beck et al.:
  • co-text can give clues to the word's meaning
  • But "clue" is relative:
  • clues need other info to be seen as clues
  • Implication A1:
  • textual clues need to be supplemented with other
    information to compute a meaning
  • Supplemental info = reader's prior knowledge
  • has to be accessible to the reader
  • will be idiosyncratic
  • ∴ Co-text doesn't suffice; prior knowledge is needed

50
Do Words Have Unique, Correct Meanings?
  • Beck et al. (& others) assume:
  • A2: A word has a unique meaning
  • A3: A word has a correct meaning
  • Contra "unique": a word's meaning varies with
  • co-text
  • reader(s' prior knowledge)
  • time of reading
  • "Correct" is a red herring (in any case, it's
    fishy)
  • Possibly, words have author-intended meanings,
  • but these need not be determined by context
    (textual or wide)
  • Misunderstandings are universally unavoidable
  • "Perfect" understanding / dictionary definition
    not needed:
  • understanding for passage comprehension suffices
  • reader can always revise the definition hypothesis

51
Beck et al.'s Categories of Textual Contexts
  • What kinds of co-texts are helpful?
  • But keep in mind that we have different goals:
  • Beck et al.:
  • use co-text to teach correct word meanings
  • CCVA:
  • use context to compute a word meaning for
    understanding

52
Beck et al.'s Textual Context Categories:
Top-Level Kinds of Co-Text
  • Pedagogical co-texts
  • artificially constructed, designed for teaching
  • Natural co-texts
  • not intended to convey the meaning of a word
  • 4 kinds (actually, a continuum)

53
Beck et al.'s Textual Context Categories:
Top-Level Kinds of Co-Text
  • Pedagogical co-texts
  • artificially constructed, designed for teaching
  • only example is for a verb:
  • "All the students made very good grades on the
    tests, so their teacher commended them for doing
    so well."
  • Natural co-texts
  • not intended to convey the meaning of a word
  • 4 kinds (actually, a continuum)

54
Beck et al.'s Textual Context Categories:
1. Misdirective (Natural) Co-Texts
  • "seem to direct student to incorrect meaning for
    a word"
  • sole example:
  • "Sandra had won the dance contest and the
    audience's cheers brought her to the stage for an
    encore. 'Every step she takes is so perfect and
    graceful,' Ginny said grudgingly, as she watched
    Sandra dance."
  • "grudgingly" gets misread as ≈ "admiringly"
  • Is this a natural context?
  • Is this all there is to it?
  • A4: Co-texts have a fixed, usually small size
  • But a larger co-text might add information
  • Prior knowledge can widen the context:
  • "grudgingly" is an adverb!
  • A5: All words are equally easy to learn
  • But N easier than V, V easier than Adj/Adv!
    (Granger/Gentner/...Gleitman...)
  • A6: Only 1 co-text can be used
  • But later co-texts can assist in refining meaning

55
Beck et al.'s Textual Context Categories:
2. Nondirective (Natural) Co-Texts
  • "of no assistance in directing the reader toward
    any particular meaning for a word"
  • sole example is for an adjective:
  • "Dan heard the door open and wondered who had
    arrived. He couldn't make out the voices. Then
    he recognized the lumbering footsteps on the
    stairs and knew it was Aunt Grace."
  • But:
  • Is it natural?
  • What about a larger co-text?
  • An adjective!
  • Of no assistance? (see next slide)

56
Syntactic Manipulation
  • Misdirective & nondirective contexts can yield
    correct information!
  • Cf. algebraic manipulation (brings x into focus):
  • 2x + 1 = 7 ∴ x = (7 - 1)/2 = 6/2 = 3
  • Syntactic manipulation (brings the hard word into
    focus):
  • "Every step she takes is so perfect and
    graceful," Ginny said grudgingly.
  • Grudgingly is the way that Ginny said it
  • So, grudgingly is a way of saying something
  • In particular, grudgingly is a way of
    (apparently) praising someone's performance
  • "he recognized the lumbering footsteps on the
    stairs"
  • lumbering is a property of footsteps on stairs
  • Generates an initial hypothesis for later
    refinement (see the sketch below)
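
A toy Python sketch of the syntactic-manipulation move for an adverb,
mirroring the paraphrase above (the parse format and the naive "+ing" rule are
invented for illustration; this is not the project's parser):

    def adverb_hypothesis(parsed_clause):
        """Bring the hard adverb into focus: paraphrase
        '<agent> <verb>-ed ... <adverb>' as '<adverb> is a way of <verb>-ing'."""
        verb = parsed_clause["verb"]      # e.g., "say"
        adverb = parsed_clause["manner"]  # e.g., "grudgingly"
        return f"{adverb} is a way of {verb}ing something"

    clause = {"verb": "say", "agent": "Ginny", "manner": "grudgingly"}
    print(adverb_hypothesis(clause))      # grudgingly is a way of saying something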

57
Beck et al.'s Textual Context Categories:
3. General (Natural) Co-Texts
  • "provide enough information for reader to place
    word in a general category"
  • sole example is for an adjective:
  • "Joe and Stan arrived at the party at 7:00. By
    9:30 the evening seemed to drag for Stan. But
    Joe really seemed to be having a good time at the
    party. 'I wish I could be as gregarious as he
    is,' thought Stan."
  • Same problems, but:
  • adjective is contrasted with Stan's attitude
  • contrasts are good (so are parallel
    constructions)

58
Beck et al.'s Textual Context Categories:
4. Directive (Natural) Co-Texts
  • "seem likely to lead the student to a specific,
    correct meaning for a word"
  • sole example is for a noun:
  • "When the cat pounced on the dog, he leapt up,
    yelping, and knocked over a shelf of books. The
    animals ran past Wendy, tripping her. She cried
    out and fell to the floor. As the noise and
    confusion mounted, Mother hollered upstairs,
    'What's all the commotion?'"
  • Natural? Long!
  • Noun!
  • note that the sole example of a directive context
    is a noun, suggesting that it might be the word
    that makes a context directive

59
Beck et al.'s Experiment
  • Ss given passages from basal readers (reading
    textbooks)
  • Researchers categorized the co-texts & blacked out
    words
  • Ss asked to fill in the blanks with the missing
    words or reasonable synonyms
  • Results confirm the 4 co-text types
  • Independently of the results, there are
    methodological questions:
  • Are basal readers natural contexts?
  • How large were the co-texts?
  • Instruction on how to do CVA?
  • A7: CVA comes naturally, so needs no training
  • A8: Fill-in-the-blank tasks are a form of CVA
  • No, they're not! (see next slide)

60
Beck et al.'s Experiment: CVA, Neologisms, &
Fill-in-the-Blank
  • Serious methodological problem for all of us:
  • Replacing a word with a blank or a neologism
    misleads Ss to find the "correct" missing/hidden word
  • ≠ CVA!
  • Our (imperfect) solution:
  • use a plausible-sounding neologism
  • tell S it's like a foreign word with no English
    equivalent, hence needs a descriptive phrase

61
Beck et al.'s Conclusion
  • "less skilled readers receive little benefit
    from CVA"
  • A9: CVA can only help in learning correct
    meanings.
  • But:
  • CVA uses the same techniques as general reading
    comprehension:
  • careful, slow reading
  • careful analysis of the text
  • directed search for information useful for
    computing a meaning
  • application of relevant prior knowledge
  • application of reasoning for the purpose of
    extracting information from the text
  • ∴ CVA, if properly taught & practiced, can improve
    general reading comprehension

62
II. Are Context Clues Unreliable Predictors of
Word Meanings?
  • Schatz & Baldwin (1986),
  • "Context Clues Are Unreliable Predictors of Word
    Meanings",
  • Reading Research Quarterly 21(4): 439-453.
  • "context does not usually provide clues to the
    meanings of low-frequency words"
  • "context clues inhibit the correct prediction of
    word meanings just as often as they facilitate
    them"

63
S&B's Argument
  • A10: CVA is not an efficient mechanism for
    inferring word meanings.
  • Because:
  • Co-text can't help you figure out the correct
    meaning of an unfamiliar word.
  • (the uniqueness & correctness assumptions again!)
  • But, we argue:
  • Wide context can help you figure out a meaning
    for an unfamiliar word.
  • So, context (& CVA) are efficient mechanisms for
    inferring (better: computing) word meanings.

64
Incidental vs. Deliberate CVA
  • S&B:
  • "context clues should help readers to infer
    meanings of words without the need for readers to
    interrupt the reading act with diversions to
    external sources"
  • true for incidental CVA,
  • but not for deliberate CVA
  • External sources are no solution anyway:
  • Dictionary definitions are just more co-texts!
    (Schwartz 1988)
  • CVA is the base case of a recursion, one of whose
    recursive clauses is "Look it up in a
    dictionary"

65
  • Why not use a dictionary?
  • Because:
  • People are lazy (!)
  • Dictionaries are not always available
  • Dictionaries are always incomplete
  • Dictionary definitions are not always useful:
  • chaste =df clean, spotless →? "new dishes are
    chaste"
  • college =df a body of clergy living together
    and supported by a foundation
  • Most words are learned via incidental CVA,
  • not via dictionaries
  • Most importantly:
  • Dictionary definitions are just more contexts!

66
Why not use a dictionary?
  • Merriam-Webster New Collegiate Dictionary:
  • chaste:
  • innocent of unlawful sexual intercourse
  • student: stay away from that one!
  • celibate
  • student: huh?
  • pure in thought and act; modest
  • student: I have to find a sentence for that?
  • a: severely simple in design or execution;
    austere
  • student: huh? severely? austere?
  • b: clean, spotless
  • student: all right! "The plates were still
    chaste after much use."
  • (Deese 1967 / Miller 1985)

67
Why not use a dictionary?
  • Merriam-Webster (continued):
  • college:
  • a body of clergy living together and supported by
    a foundation
  • a building used for an educational or religious
    purpose
  • a: a self-governing constituent body of a
    university offering living quarters and
    instruction but not granting degrees
  • b: a preparatory or high school
  • c: an independent institution of higher
    learning offering a course of general
    studies leading to a bachelor's degree
  • Problem: the ordering is historical!

68
Why not use a dictionary?
  • Merriam-Webster (continued):
  • infract: infringe
  • infringe: encroach
  • encroach:
  • to enter by gradual steps or by stealth into the
    possessions or rights of another
  • to advance beyond the usual or proper limits;
    trespass

69
Why not use a dictionary?
  • Collins COBUILD Dictionary
  • ("Helping Learners with Real English"):
  • chaste:
  • Someone who is chaste does not have sex with
    anyone, or only has sex with their husband or
    wife; an old-fashioned use, used showing
    approval. EG: She was a holy woman, innocent and
    chaste.
  • Something that is chaste is very simple in style,
    without much decoration. EG: chaste houses built
    in 1732

70
Why not use a dictionary?
  • Collins COBUILD Dictionary:
  • college:
  • A college is 1.1 an institution where students
    study for qualifications or do training courses
    after they have left school.
  • infract: not in the dictionary
  • infringe:
  • If you infringe a law or an agreement, you break
    it.
  • encroach:
  • To encroach on or upon something means to slowly
    take possession or control of it, so that someone
    else loses it bit by bit.

71
S&B's Experiments
  • 25 natural passages from novels
  • words chosen (the only cited examples):
  • Adj/Adv: 67%
  • N: 27%
  • V: 6%
  • But:
  • what are the actual #s?
  • which lexical categories were hardest?
  • how do facilitative/confounding co-texts
    correlate with lexical category?
  • should have had a representative sample of 4
    co-text categories × 3 or 4 lexical categories

72
S&B's Experiments: CVA vs. Word-Sense
Disambiguation
  • 2 experiments:
  • Ss chose the "correct" meaning from a list of 5
    possible meanings
  • This is WSD, not CVA!
  • WSD = multiple choice
  • CVA = essay question
  • 3rd experiment:
  • real CVA, but interested only in "full denotative
    meanings or accurate synonyms"
  • cf. assumption A3 about correct meanings!

73
S&B's Experiments: Space & Time Limits
  • Space limits on size of co-text?
  • S&B: 3 sentences
  • CCVA: start small, work outward
  • Time limits on size of co-text?
  • S&B: all students finished in the allotted time
  • CCVA: no time limits

74
S&B's Experiments: Teaching CVA
  • S&B "did not control for Ss' knowledge of how
    to use context clues"
  • CCVA:
  • deliberate CVA is a skill
  • needs to be taught, modeled, & practiced
  • there is other (later) evidence that such
    training works,
  • including critical-thinking education

75
S&B's 3 Questions (answered in the negative)
  • Do context clues occur with sufficient frequency
    to justify them as a major element of reading
    instruction?
  • Context clues do occur, & teaching them is
    justified, if augmented by the reader's prior
    knowledge & knowledge of CVA skills.
  • Does context usually provide accurate clues to
    the denotations & connotations of low-frequency
    words?
  • CVA can provide clues to revisable hypotheses
    about an unfamiliar word's meaning.
  • Are difficult words in natural co-texts
    usually amenable to such analysis?
  • Such words are always amenable to yielding at
    least some information about their meaning.

76
Our CVA Theory
  • Every co-text can give some clue to a word's
    meaning.
  • Co-text clues must be supplemented by the reader's
    prior knowledge.
  • Value of co-text depends on the reader's prior
    knowledge & ability to integrate them.
  • CVA ≠ fill-in-the-blank; CVA ≠ WSD
  • Co-text size has no arbitrary limits
  • May need lots of co-texts before CVA can
    asymptotically approach a stable meaning.

77
Our CVA Theory (continued)
  • A word does not have a unique meaning
  • A word does not have a correct meaning
  • A word's "correct" (intended) meaning does not
    need to be known in order for a reader to
    understand it in context
  • Even familiar/well-known words can acquire new
    meanings in new contexts.
  • Neologisms usually are learned from context.
  • Some words are easier to compute meanings for
    than others (N < V < Adj/Adv)
  • CVA is an efficient method for computing word
    meanings.
  • CVA can improve general reading comprehension

78
Our CVA Theory (continued)
  • CVA can (and should) be taught!
    using a curriculum based on our algorithms

79
Teaching Computers vs. Teaching Humans
  • Our goal:
  • Not to teach people to think like computers,
  • But to explicate computable & teachable methods
    for hypothesizing word meanings from context
  • AI as computational psychology:
  • Devise computer programs that are essentially
    faithful simulations of human cognitive behavior
  • Can tell us something about the human mind.
  • "We are teaching a machine, to see if what we
    learn in teaching it can help us teach students
    better."