1
Final Review
  • April 22, 2004

2
Logistics
  • Monday, May 3rd, 2:00 p.m. to 3:50 p.m.
  • Roberts 307
  • You may bring one sheet of notes
  • You may bring a calculator

3
13 Uncertainty
  • 0 ≤ P(a) ≤ 1
  • P(true) = 1, P(false) = 0
  • P(a ∨ b) = P(a) + P(b) - P(a ∧ b)
  • P(a | b) = P(a ∧ b) / P(b)
  • P(a | b) = P(b | a) P(a) / P(b)
  • Exercise 13.11 (a)
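A quick way to check the last two identities is to plug in numbers. A minimal Python sketch (the probabilities are made up for illustration):

# Conditional probability and Bayes' rule with illustrative numbers.
p_a = 0.01           # P(a): prior
p_b_given_a = 0.9    # P(b | a)
p_b = 0.05           # P(b)

# Bayes' rule: P(a | b) = P(b | a) P(a) / P(b)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)   # 0.18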

4
14 Probabilistic Reasoning
  • Bayesian Nets
  • construction
  • inference
  • Exercise 14.1 (a)
  • Unigram/Bigram/Trigram Models
  • construction
  • use
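As a reminder of how construction and inference fit together, here is a minimal sketch, assuming a hypothetical two-node net Rain -> WetGrass with made-up CPT values:

# Chain rule for Bayes nets: P(R, W) = P(R) * P(W | R).
p_rain = {True: 0.2, False: 0.8}            # P(R)
p_wet_given_rain = {True: 0.9, False: 0.1}  # P(W = true | R)

def joint(rain, wet):
    pw = p_wet_given_rain[rain]
    return p_rain[rain] * (pw if wet else 1 - pw)

# Inference by enumeration: P(R = true | W = true)
num = joint(True, True)
den = joint(True, True) + joint(False, True)
print(num / den)  # 0.18 / 0.26, about 0.69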

5
15 Probabilistic Reasoning Over Time
  • First Order Hidden Markov Model
  • Bayesian Network Structure
  • Transition Model
  • Sensor Model
  • Filtering: P(Xt | e1:t)
  • Prediction: P(Xt+k | e1:t), k > 0
  • Idea of Smoothing: P(Xk | e1:t), 0 ≤ k < t
  • Idea of Most Likely Explanation
  • argmax over x1:t of P(x1:t | e1:t)
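For reference, one filtering step unrolls into the standard forward recursion, where α is a normalizing constant:

P(Xt+1 | e1:t+1) = α P(et+1 | Xt+1) Σ_xt P(Xt+1 | xt) P(xt | e1:t)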

6
Exercise
  • W: a day is a weekend (hidden)
  • C: the campus is busy (observable)
  • Given P(W0) = <1, 0> and C1 = true
  • Calculate P(W1 | C1) (filtering)
  • Calculate P(W2 | C1) (prediction, see the sketch below)
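A minimal Python sketch of both steps. The transition and sensor probabilities below are assumptions for illustration, since the exercise does not specify them; substitute the values given in class.

# Filtering and prediction for the weekend (W) / busy-campus (C) HMM.
p_w_given_prev = {True: 0.3, False: 0.25}  # ASSUMED P(W_t = true | W_{t-1})
p_c_given_w = {True: 0.1, False: 0.8}      # ASSUMED P(C_t = true | W_t)
prior = {True: 1.0, False: 0.0}            # P(W0) = <1, 0> from the slide

def predict(belief):
    # P(W_{t+1}) from P(W_t) via the transition model
    p_true = sum(belief[w] * p_w_given_prev[w] for w in (True, False))
    return {True: p_true, False: 1 - p_true}

def update(belief, c_obs):
    # Weight by the sensor model, then normalize
    w = {s: belief[s] * (p_c_given_w[s] if c_obs else 1 - p_c_given_w[s])
         for s in (True, False)}
    z = w[True] + w[False]
    return {s: w[s] / z for s in (True, False)}

b1 = update(predict(prior), c_obs=True)  # filtering: P(W1 | c1 = true)
b2 = predict(b1)                         # prediction: P(W2 | c1)
print(b1, b2)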

7
Exercise
  • Construct the unigram and bigram models for the
    word 'mississippi' (see the sketch below)
  • Show how to use the models to make predictions
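One way the construction could go, sketched in Python with maximum-likelihood counts:

from collections import Counter

word = "mississippi"

# Unigram model: P(c) = count(c) / len(word)
n = len(word)
p_unigram = {c: k / n for c, k in Counter(word).items()}

# Bigram model: P(c2 | c1) = count(c1 c2) / count(c1 as a bigram start)
bigrams = Counter(zip(word, word[1:]))
starts = Counter(word[:-1])
p_bigram = {(c1, c2): k / starts[c1] for (c1, c2), k in bigrams.items()}

# Prediction: the most likely character to follow 's'
after_s = {c2: p for (c1, c2), p in p_bigram.items() if c1 == "s"}
print(p_unigram)
print(max(after_s, key=after_s.get), after_s)  # 's' and 'i' tie at 0.5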

8
16 Simple Decisions
  • MEU: Maximum Expected Utility
  • VPI: Value of Perfect Information
  • Exercise 16.1 (a) and (b)
  • Exercise 16.2
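MEU in one line: choose the action whose probability-weighted utility is highest. A sketch with hypothetical actions, outcome probabilities, and utilities:

# EU(a) = sum over outcomes s' of P(s' | a) * U(s'); pick the argmax.
outcomes = {
    "dig_left":  [(0.6, 100), (0.4, -20)],  # (P(s' | a), U(s'))
    "dig_right": [(0.3, 150), (0.7, 0)],
}

def eu(action):
    return sum(p * u for p, u in outcomes[action])

best = max(outcomes, key=eu)
print(best, eu(best))  # dig_left 52.0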

9
17 Complex Decisions
  • Markov Decision Process
  • S0
  • T(s, a, s′)
  • R(s)
  • Value Iteration Algorithm, Figure 17.4
  • Exercise: Be able to trace the value iteration
    algorithm (see the sketch below)
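A compact sketch of the value iteration update U(s) = R(s) + γ max_a Σ T(s, a, s′) U(s′), on a hypothetical 3-state chain (the rewards, transitions, and γ are made up):

# Value iteration on a tiny chain MDP; state 2 is terminal.
gamma = 0.9
R = {0: -0.1, 1: -0.1, 2: 1.0}
T = {  # T[s][a] = list of (prob, next_state)
    0: {"go": [(0.8, 1), (0.2, 0)], "stay": [(1.0, 0)]},
    1: {"go": [(0.8, 2), (0.2, 1)], "stay": [(1.0, 1)]},
}

U = {s: 0.0 for s in R}
for _ in range(100):  # enough sweeps to converge for this example
    U = {s: R[s] + (gamma * max(sum(p * U[s2] for p, s2 in T[s][a])
                                for a in T[s])
                    if s in T else 0.0)
         for s in R}
print(U)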

10
18 Learning From Observation
  • AdaBoost
  • PAC-Learning
  • N ≥ (1/ε)(ln(1/δ) + ln|H|)
  • error ≤ ε
  • probability(error ≤ ε) ≥ 1 - δ
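Plugging illustrative numbers into the bound (ε = 0.1, δ = 0.05, |H| = 2^10 are assumptions, not from the slides):

from math import ceil, log

eps, delta, h_size = 0.1, 0.05, 2 ** 10

# N ≥ (1/ε)(ln(1/δ) + ln|H|)
n = ceil((1 / eps) * (log(1 / delta) + log(h_size)))
print(n)  # 100 examples suffice for these parameters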

11
19 Knowledge in Learning
  • Section 19.3, Explanation-Based Learning

12
20 Statistical Learning
  • Naïve Bayes
  • P(C | x1, …, xn) = α P(x1, …, xn | C) P(C)
    = α P(x1 | C) ⋯ P(xn | C) P(C)
  • K-Nearest Neighbors
  • Backpropagation Neural Networks
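A naive Bayes sketch matching the formula above; the toy spam example and all probabilities are made up:

# P(C | x1..xn) = α P(x1 | C) ... P(xn | C) P(C)
priors = {"spam": 0.4, "ham": 0.6}
likelihood = {  # P(word appears | class)
    "spam": {"free": 0.5, "meeting": 0.1},
    "ham": {"free": 0.1, "meeting": 0.4},
}

def posterior(words):
    score = {c: priors[c] for c in priors}
    for c in priors:
        for w in words:
            score[c] *= likelihood[c][w]
    z = sum(score.values())                      # the α step: normalize
    return {c: s / z for c, s in score.items()}

print(posterior(["free", "meeting"]))  # spam ~ 0.45, ham ~ 0.55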

13
22 Communication
  • Parsing
  • top down
  • bottom up
  • idea of chart parsing (be able to draw the final
    picture)
  • SEQUITUR grammar inducer
  • Exercises 22.3, 22.9
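For bottom-up parsing, here is a CYK-style chart sketch over a toy grammar in Chomsky normal form (the grammar and sentence are hypothetical; chart parsing proper also records the edges, which is the "final picture" to draw):

# chart[i][j] holds the nonterminals that span words[i..j].
grammar = {  # (B, C) -> A, meaning the rule A -> B C
    ("NP", "VP"): "S",
    ("Det", "N"): "NP",
    ("V", "NP"): "VP",
}
lexicon = {"the": "Det", "dog": "N", "saw": "V", "cat": "N"}

def cyk(words):
    n = len(words)
    chart = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        chart[i][i].add(lexicon[w])
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for b in chart[i][k]:
                    for c in chart[k + 1][j]:
                        if (b, c) in grammar:
                            chart[i][j].add(grammar[(b, c)])
    return "S" in chart[0][n - 1]

print(cyk("the dog saw the cat".split()))  # True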

14
23 Probabilistic Language Processing
  • Viterbi-Based Word Segmentation Algorithm, Figure
    23.1
  • Be able to trace
  • Exercise: Given P('a') = .35, P('b') = .10,
    P('bb') = .2, P('aa') = .15, P('ab') = .15,
    P('ba') = .05, how would you interpret 'abbaa'?
    (see the sketch below)
  • Exercise 23.1
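A sketch of the segmentation as a Viterbi-style dynamic program over the string. Note the sixth probability is taken to be P('ba') = .05, on the assumption that the slide's second P('bb') is a typo:

P = {"a": 0.35, "b": 0.10, "bb": 0.20, "aa": 0.15, "ab": 0.15,
     "ba": 0.05}  # 'ba' is an assumption; the original repeats 'bb'

def segment(s):
    # best[i] = (probability, segmentation) for the best split of s[:i]
    best = [(1.0, [])] + [(0.0, None)] * len(s)
    for i in range(1, len(s) + 1):
        for j in range(max(0, i - 2), i):  # candidate words of length 1 or 2
            w = s[j:i]
            if w in P and best[j][0] * P[w] > best[i][0]:
                best[i] = (best[j][0] * P[w], best[j][1] + [w])
    return best[-1]

print(segment("abbaa"))  # about (0.0105, ['a', 'bb', 'aa'])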

15
Chapter 23
  • PCFG (Probabilistic Context-Free Grammar)
  • Constructing
  • Using
  • Bayesian IR Model
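As a closing sketch: in a PCFG the probability of a derivation is the product of the probabilities of the rules it uses. The grammar and derivation below are hypothetical:

rule_prob = {  # P(rule); the rules for each left-hand side sum to 1
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("Det", "N")): 0.6,
    ("NP", ("Name",)): 0.4,
    ("VP", ("V", "NP")): 0.7,
    ("VP", ("V",)): 0.3,
}

derivation = [  # S -> NP VP -> Det N V NP -> Det N V Name
    ("S", ("NP", "VP")),
    ("NP", ("Det", "N")),
    ("VP", ("V", "NP")),
    ("NP", ("Name",)),
]

p = 1.0
for rule in derivation:
    p *= rule_prob[rule]
print(p)  # 1.0 * 0.6 * 0.7 * 0.4 = 0.168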