Bayesian models of cross-situational word learning (presentation transcript)
1
Bayesian models of cross-situational word learning
Michael C. Frank, Noah Goodman, Josh Tenenbaum (MIT)
Thanks to Kathy Hirsh-Pasek and Roberta Golinkoff for valuable discussion. Also thanks to Vikash Mansinghka, Ted Gibson, tedlab, and cocosci for comments, and to the Jacob Javits Foundation for funding.
2
Word-learning in action
3
The problem of word learning
In any one situation, children hear many words
and see many objects
4
One possible solution
Apply a cross-situational strategy to learn
mappings (but this is harder than it looks)
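A toy illustration of the cross-situational idea (a hedged sketch of the general strategy, not the model presented later): for each word, keep only the objects that have been present every time the word was heard.

```python
# Toy cross-situational strategy: intersect, for each word, the sets of
# objects it has co-occurred with. Ambiguity often remains, which is why
# this is harder than it looks.
def cross_situational_candidates(situations):
    """situations: list of (words, objects) pairs observed together."""
    candidates = {}
    for words, objects in situations:
        for w in words:
            if w not in candidates:
                candidates[w] = set(objects)
            else:
                candidates[w] &= set(objects)  # keep only consistent referents
    return candidates

situations = [
    (["look", "ball"], ["ball", "kitty"]),
    (["the", "ball"], ["ball", "book"]),
]
print(cross_situational_candidates(situations)["ball"])  # {'ball'}
print(cross_situational_candidates(situations)["look"])  # still ambiguous
```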
5
The problem of word learning
  • Techniques for cross-situational word learning
  • Deductive inference: Siskind (1996)
  • Translation model: Yu, Ballard, & Aslin (2005); Yu & Ballard (in press)

6
Outline
  • Some facts of word learning
  • Mutual exclusivity
  • Fast-mapping
  • Use of social cues
  • Our model: a Bayesian word-learner
  • Extension: learning social cues
  • Experimental coverage

7
Three facts of word learning
8
Outline
  • The facts of word learning
  • Our model: a Bayesian word-learner
  • Model
  • Corpus
  • Comparison models
  • Results
  • Extension: learning social cues
  • Experimental coverage

9
Generative model
(Diagram: in each situation, the objects present (O) and the lexicon determine the things you intend to refer to (I), which in turn generate the words (W); this structure is repeated across situations.)
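As a rough illustration of this generative story, here is a minimal sketch (the parameter names, probabilities, and sampling choices are assumptions for illustration, not the authors' implementation):

```python
import random

def generate_situation(objects, lexicon, non_referential_words,
                       p_refer=0.8, p_nonref=0.3):
    """Sample one situation: intentions I from the objects O, then words W."""
    # The speaker intends to refer to some subset of the objects present.
    intentions = [o for o in objects if random.random() < p_refer]
    words = []
    for o in intentions:
        # A referential word is drawn from the lexicon entry for that object.
        if o in lexicon:
            words.append(random.choice(lexicon[o]))
    # Other words are not about any object (e.g. "look", "wow").
    words += [w for w in non_referential_words if random.random() < p_nonref]
    random.shuffle(words)
    return intentions, words

# Toy example
lexicon = {"ball": ["ball"], "kitty": ["kitty", "meow"]}
print(generate_situation(["ball", "kitty"], lexicon, ["look", "wow"]))
```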
10
Generative model example
11
Inference
Bayes' rule: P(lexicon | corpus) ∝ P(corpus | lexicon) · P(lexicon)
Parsimony prior on lexicons: smaller lexicons are preferred a priori
  • Inference technique (a simplified sketch is given below)
  • Stochastic search with simulated tempering
  • Data-driven proposals drawn from the mutual information of word-object pairings
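The sketch below is a hedged, simplified stand-in for that inference procedure: a plain Metropolis-style toggle search with a toy likelihood and a size-penalty prior, rather than the simulated-tempering, mutual-information-driven sampler described above.

```python
import math, random

def log_prior(lexicon, alpha=1.0):
    """Parsimony prior: each word-object pairing pays a constant cost."""
    return -alpha * sum(len(objs) for objs in lexicon.values())

def log_likelihood(lexicon, corpus, eps=1e-3):
    """Toy likelihood: a word is 'explained' if it maps to a present object."""
    ll = 0.0
    for words, objects in corpus:
        for w in words:
            explained = any(o in objects for o in lexicon.get(w, ()))
            ll += math.log(0.5 if explained else eps)
    return ll

def search(corpus, words, objects, iters=5000, temp=1.0):
    """Stochastic search over lexicons with Metropolis-style accept/reject."""
    lex = {}
    score = log_prior(lex) + log_likelihood(lex, corpus)
    for _ in range(iters):
        w, o = random.choice(words), random.choice(objects)
        prop = {k: set(v) for k, v in lex.items()}
        prop.setdefault(w, set()).symmetric_difference_update({o})  # toggle pair
        new = log_prior(prop) + log_likelihood(prop, corpus)
        if new >= score or random.random() < math.exp((new - score) / temp):
            lex, score = prop, new
    return {w: objs for w, objs in lex.items() if objs}
```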

12
Corpus
  • Two 10-minute clips from the CHILDES Rollins corpus
  • Interaction between a mother and her infant (6 months old)
  • 2,528 word tokens of 420 word types in 623 sentences
  • 24 objects, all toys

13
Model comparison
  • Co-occurrence frequency
  • Pointwise mutual information (toy versions of these two baselines are sketched below)
  • Translation model, based on IBM Model 1 (Yu & Ballard, in press)

14
Results: model comparison
(Figure: precision and recall of the lexicon learned by each model.)
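For concreteness, precision and recall over word-object pairings can be computed as below (a sketch; the gold lexicon shown is hypothetical):

```python
def precision_recall(learned, gold):
    """learned, gold: sets of (word, object) pairs."""
    true_positives = len(learned & gold)
    precision = true_positives / len(learned) if learned else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    return precision, recall

gold = {("ball", "ball"), ("kitty", "kitty"), ("book", "book")}
learned = {("ball", "ball"), ("kitty", "kitty"), ("on", "ring")}
print(precision_recall(learned, gold))  # (0.666..., 0.666...)
```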
15
Results: intuitive analysis
Best lexicon found by search:
Word      Object
baby      book
bigbird   bird
bird      rattle
birdie    duck
book      book
hand      hand
hat       hat
meow      kitty
moocow    cow
oink      pig
on        ring
ring      ring
sheep     sheep
(Figure: the most likely intentions inferred for individual situations.)
Also, unlike the baseline models, our model is extremely extensible.
16
Outline
  • The facts of word learning
  • Our model: a Bayesian word-learner
  • Extension: learning social cues
  • Corpus
  • Model
  • Preliminary results
  • Experimental coverage

17
Social corpus coding
  • Coded social cues for each utterance: infant's hands, eyes, mouth, and touch; mom's hands, eyes, and touch

18
How it works
          I'm looking   Mom looking
Ball           0             1
Bike           1             0
...
Bag            0             0
An observed cue could be caused by the cue's base rate or by the object's relevance (i.e., its being the intended referent).
Noisy-OR process with two parameters per cue: a base rate and a relevance.
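A minimal sketch of this noisy-OR, assuming one base rate b and one relevance r per cue (the parameterization here is an illustrative assumption):

```python
def p_cue_on(is_intended_referent, base_rate, relevance):
    """Probability that a social cue points at an object under a noisy-OR."""
    p_off = 1.0 - base_rate
    if is_intended_referent:
        p_off *= 1.0 - relevance  # relevance only applies to the referent
    return 1.0 - p_off

# Example: a cue like "Mom looking" with a low base rate but high relevance.
print(p_cue_on(True, base_rate=0.1, relevance=0.8))   # 0.82
print(p_cue_on(False, base_rate=0.1, relevance=0.8))  # 0.1
```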
19
Social model framework
(Diagram: as before, the objects (O) and lexicon determine the things you intend to refer to (I), which generate the words (W); in addition, the intended referents together with the relevance and base rate of each social cue (r, b) generate the observed social cues (S); repeated across situations.)
20
Preliminary results
The model finds the appropriate features.
Social features allow finding the speaker's intent in situations without referential words.
21
Outline
  • The facts of word learning
  • Our model: a Bayesian word-learner
  • Extension: learning social cues
  • Experimental coverage
  • Mutual exclusivity
  • Fast-mapping
  • Use of social cues

22
Mutual exclusivity
23
Fast-mapping
The model can fast-map, i.e., learn a word from a single instance.
24
Use of social cues
25
Conclusions
  • Bayesian model of cross-situational word learning
  • Performed best on the corpus
  • Allows parsing of sentences and interpretation of the speaker's intent
  • Social model
  • The model can learn which social cues are relevant to reference
  • Experimental coverage
  • Mutual exclusivity
  • Fast-mapping
  • Learning words for social cues