Logic, Language and Learning

Transcript and Presenter's Notes
1
Logic, Language and Learning
  • Chapter 12
  • Logic and Language
  • Wrapping up
  • Luc De Raedt

2
Language
  • What we have not done so far
  • Linguistic aspects have been simplified
  • Semantics / Reasoning
  • Translation
  • Disambiguation
  • Probabilistic/learning approaches
  • ...
  • Many other points

3
Linguistic issues
  • Developing proper NLP systems requires far more
    sophisticated grammars to model all aspects of
    language
  • An overview is given in Chapter 4 of Covington
  • Part of Speech Tagging
  • Grammatical Relations
  • Cases
  • Etc.

4
Part of speech tagging
  • What?
  • Given a sentence
  • Find the tags (word categories) of all its words
  • The approach is often statistical/probabilistic (a
    sketch follows below)
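  A minimal sketch of the statistical idea in Prolog: a unigram
  tagger that simply assigns each word its most frequent tag. The
  tiny lexicon and its probabilities are invented for illustration;
  real taggers estimate them from an annotated corpus and also use
  context (e.g. with hidden Markov models).

    % Assumed toy lexicon: tag_prob(Word, Tag, Probability).
    tag_prob(the,   det,  1.00).
    tag_prob(dog,   noun, 0.95).
    tag_prob(dog,   verb, 0.05).
    tag_prob(barks, verb, 0.80).
    tag_prob(barks, noun, 0.20).

    % Pick the most probable tag for a word (SWI-Prolog's sort/4
    % with @>= orders the Prob-Tag pairs by descending probability).
    most_likely_tag(Word, Tag) :-
        findall(P-T, tag_prob(Word, T, P), Pairs),
        sort(0, @>=, Pairs, [_-Tag|_]).

    % Tag every word of a sentence, ignoring context.
    tag_sentence([], []).
    tag_sentence([W|Ws], [W/T|Ts]) :-
        most_likely_tag(W, T),
        tag_sentence(Ws, Ts).

    % ?- tag_sentence([the, dog, barks], Tagged).
    % Tagged = [the/det, dog/noun, barks/verb]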

5
Grammatical Relations
  • Need to analyze relationships among constituents
  • Grammatical relations
  • Table 4.2.

6
Cases
  • Cases mark the grammatical function of a noun phrase
  • Cf. German: Nominativ, Dativ, Akkusativ (a toy
    grammar follows below)
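  A toy sketch of how case can be threaded through a grammar; the
  three determiner forms and the single noun are assumed for
  illustration (gender and number are ignored).

    np(Case) --> det(Case), n.
    det(nominativ) --> [der].    % der Hund
    det(dativ)     --> [dem].    % dem Hund
    det(akkusativ) --> [den].    % den Hund
    n --> [hund].

    % ?- phrase(np(Case), [den, hund]).
    % Case = akkusativ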

7
Semantics
  • Mapping: sentence -> logical form
  • Some (very simple) illustrations from Flach's book
  • Use of lambda abstraction (a sketch follows below)
  • Tables 7.2 and 7.3
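  A minimal sketch in the spirit of (but not copied from) the
  book's tables: a DCG that maps a sentence onto its logical form,
  with X^Formula as the usual Prolog encoding of lambda
  abstraction; the grammar and lexicon are assumed for
  illustration.

    s(LF)  --> np(VP^LF), vp(VP).
    np((X^Body)^all(X, Restr, Body)) --> [every], n(X^Restr).
    vp(X^barked(X)) --> [barked].
    n(X^dog(X))     --> [dog].

    % ?- phrase(s(LF), [every, dog, barked]).
    % LF = all(X, dog(X), barked(X))   (X a fresh variable)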

8
Semantics
  • Use the logical form for reasoning (a sketch follows
    at the end of this slide)
  • Assumes that domain/world/background knowledge exists
    in the knowledge base
  • NL understanding
  • requires a lot of domain knowledge
  • requires many artificial intelligence components
  • Planning
  • Learning
  • Reasoning
  • Knowledge representation
  • Etc.
  • Often the bottleneck and a cause of brittleness in
    NLP systems
  • Dependent on the goal of the NLP system
  • Information retrieval
  • Translation
  • Dialogues
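  A minimal sketch of the reasoning step, with the world knowledge
  assumed for illustration: the logical form produced above becomes
  an ordinary clause in the knowledge base, and questions are then
  answered by deduction.

    barked(X) :- dog(X).   % clausal form of all(X, dog(X), barked(X))
    dog(fido).             % assumed background/world knowledge

    % ?- barked(fido).     % "Did Fido bark?"
    % true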

9
Translation
  • Map a sentence in language 1 to a sentence in
    language 2
  • Two approaches
  • Direct mapping
  • Map to interlingua (e.g. logical form)
  • Advantage of Prolog/UBG (unification-based grammars)
    for the interlingua
  • Grammars are bi-directional
  • ?- s(What, [every, dog, barked])
  • What = all(X, dog(X), barked(X))
  • ?- s(all(X, dog(X), barked(X)), S)
  • S = [every, dog, barked]
  • Use two bi-directional grammars with the same logical
    form (a sketch follows below)
  • Table 8.1
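  A minimal sketch of translation via a logical-form interlingua:
  two bi-directional DCGs that build the same logical form. The
  German grammar and its vocabulary are assumed for illustration.

    % English grammar
    s_en(LF) --> np_en(VP^LF), vp_en(VP).
    np_en((X^B)^all(X, R, B)) --> [every], n_en(X^R).
    n_en(X^dog(X))     --> [dog].
    vp_en(X^barked(X)) --> [barked].

    % German grammar sharing the same logical form (toy vocabulary)
    s_de(LF) --> np_de(VP^LF), vp_de(VP).
    np_de((X^B)^all(X, R, B)) --> [jeder], n_de(X^R).
    n_de(X^dog(X))     --> [hund].
    vp_de(X^barked(X)) --> [bellte].

    % One grammar parses, the other generates from the same LF.
    translate(English, German) :-
        phrase(s_en(LF), English),
        phrase(s_de(LF), German).

    % ?- translate([every, dog, barked], German).
    % German = [jeder, hund, bellte]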

10
Disambiguation
  • One of the key problems in NLP
  • Structural disambiguation
  • Empirical approach
  • Learn probabilistic CFG grammars
  • VP -> V NP PP (0.3)
  • VP -> V NP (0.4)
  • VP -> V (0.3)
  • Focus on the most probable parses (a sketch follows
    at the end of this slide)
  • Word sense disambiguation
  • Fig 4.2
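  A minimal sketch of scoring parses with the probabilistic rules
  above. The VP probabilities come from the slide; the NP rules and
  the toy lexicon are assumptions made for illustration. A parse's
  probability is the product of the probabilities of the rules it
  uses.

    vp(P) --> v(P1), np(P2), pp(P3), { P is 0.3 * P1 * P2 * P3 }.
    vp(P) --> v(P1), np(P2),         { P is 0.4 * P1 * P2 }.
    vp(P) --> v(P1),                 { P is 0.3 * P1 }.
    np(P) --> np_base(P1), pp(P2),   { P is 0.2 * P1 * P2 }.  % low attachment (assumed prob.)
    np(P) --> np_base(P1),           { P is 0.8 * P1 }.       % (assumed prob.)
    np_base(1.0) --> [the, man].
    pp(1.0)      --> [with, the, telescope].
    v(1.0)       --> [saw].

    % The ambiguous "saw the man with the telescope" gets two parses;
    % the more probable one attaches the PP to the verb phrase.
    % ?- findall(P, phrase(vp(P), [saw, the, man, with, the, telescope]), Ps).
    % Ps = [0.24, 0.08]   (up to floating-point rounding)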

11
Word Sense Disambiguation
  • What?
  • Words do not always have a unique meaning.
  • E.g. bachelor = 58, run = 54, go = 38
  • Three types
  • Homonymy
  • Two words that have no relation but that sound
    alike
  • The bark of a tree versus the bark of a dog
  • Polysemy
  • Different senses but connected
  • The kernel of an OS versus the kernel of a nut
  • Argument ambiguity
  • Mary is cooking (Subject = agent)
  • The potatoes are cooking (Subject = theme)

12
Empirical NLP
  • Combining Machine Learning with NLP
  • The linguist (Noam Chomsky, 1969)
  • "It must be recognized that the notion of
    'probability of a sentence' is an entirely useless
    one, under any known interpretation of this term"
  • The statistician (Fred Jelinek, 1988, head of the
    IBM speech group)
  • "Anytime a linguist leaves the group the recognition
    rate goes up"

13
Empirical NLP
  • Since the mid-nineties, empirical NLP has become a
    major trend
  • Shallow parsing versus deep parsing
  • Importance of Corpora
  • E.g. the Wall Street Journal Corpus
  • Etc.