Review of Bayesian Methods - PowerPoint PPT Presentation
Provided by: astorS

Transcript and Presenter's Notes

Title: Review of Bayesian Methods

1
Review of Bayesian Methods
2
Looking back
  • Course description
  • Illustrates fundamentals and current approaches
    to Bayesian modeling and computation. Describes
    the Bayesian approach to simple models, such as
    the normal and binomial distributions. Introduces
    concepts such as conjugate and noninformative
    prior distributions. Uses real-data examples to
    illustrate tools including hierarchical models
    (random-effect models), hypothesis testing, model
    averaging, linear regression, and generalized
    linear models. Discusses modern Bayesian
    computation: the implementation and monitoring of
    Markov chain Monte Carlo methods (the Gibbs
    sampler and the Metropolis-Hastings algorithm).

3
Looking back
  • Course learning objectives
  • Upon successfully completing this course,
    students will be able to:
  • 1) develop an understanding and appreciation of
    the Bayesian approach
  • 2) specify models and choose priors that
    adequately address a problem
  • 3) make posterior inferences both algebraically
    and computationally.

4
Outline
  • Fundamentals: know what you are doing
  • Single-parameter models; examples: Beta-binomial,
    Normal with known variance, Normal with known mean
  • Multi-parameter models: inference strategy;
    example: Normal model with unknown mean and
    unknown variance

5
Fundamentals
6
Bayesian statistics
  • The quantification of uncertainty (de Finetti)
  • Can be equated with personal belief
  • The exploration of a parameter θ in light of
    data X (Dennis Lindley)

7
  • Goal: draw conclusions/inferences about θ
  • Data are given (fixed in the operational sense)
  • The parameter carries uncertainty (random)

8
Bayesian Machinery
  • prior belief --(data)--> posterior belief
  • prior distribution --(likelihood)--> posterior
    distribution
  • likelihood: L(θ) = P(X|θ)
  • prior: p(θ) --> posterior: p(θ|X) ∝ p(θ) L(θ)
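A minimal sketch of this machinery on a grid, assuming hypothetical data (y = 7 successes in n = 10 Bernoulli trials) and a flat prior:

```python
# Grid approximation: posterior ∝ prior × likelihood, then normalize.
n, y = 10, 7
grid = [i / 200 for i in range(1, 200)]            # theta values in (0, 1)
prior = [1.0] * len(grid)                          # p(theta) ∝ 1
lik = [t**y * (1 - t)**(n - y) for t in grid]      # L(theta) = P(X | theta)
unnorm = [p * l for p, l in zip(prior, lik)]
z = sum(unnorm)
post = [u / z for u in unnorm]                     # p(theta | X)

post_mean = sum(t * p for t, p in zip(grid, post))
```

With a flat prior this grid posterior approximates Beta(y+1, n−y+1), so the posterior mean is close to (y+1)/(n+2).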

9
Bayesian Machinery
10
Marginalization
  • Multi-parameter model: θ₂ is a nuisance parameter
  • Hierarchical model: θ₂ is a hyper-parameter

11
Hypothesis Testing
  • Theories: H1, H2, or more
  • Assign a prior probability to each
  • The Bayes factor measures the evidence for or
    against a theory
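A sketch of a Bayes factor for a hypothetical binomial experiment, comparing a point null H1: θ = 0.5 against H2: θ ~ Uniform(0, 1):

```python
import math

# Marginal likelihoods of y successes in n trials under each theory.
n, y = 20, 14

m1 = math.comb(n, y) * 0.5**n   # under H1: theta fixed at 0.5
m2 = 1 / (n + 1)                # under H2: integral of C(n,y) t^y (1-t)^(n-y)
                                # over a Uniform(0,1) prior equals 1/(n+1)

bf_12 = m1 / m2                 # evidence for H1 relative to H2
```

A Bayes factor below 1 here means the data favor H2; combined with prior probabilities on the theories it yields posterior odds.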

12
Binomial Models
  • Conjugate prior: Beta (Dirichlet in the
    Multinomial case)

13
Beta distribution
  • A convenient distribution for a probability
    parameter
  • Beta(α, β)
  • mean = α/(α+β)
  • var = mean(1−mean)/(1+α+β)
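The moment formulas can be checked by Monte Carlo; the α, β values here are arbitrary:

```python
import random

# Compare the Beta(alpha, beta) moment formulas with simulated draws.
alpha, beta = 2.0, 5.0
mean = alpha / (alpha + beta)                   # α/(α+β)
var = mean * (1 - mean) / (1 + alpha + beta)    # mean(1−mean)/(1+α+β)

random.seed(0)
draws = [random.betavariate(alpha, beta) for _ in range(200_000)]
mc_mean = sum(draws) / len(draws)
mc_var = sum((d - mc_mean) ** 2 for d in draws) / len(draws)
```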

14
Binomial with conjugate prior
  • θ ~ Beta(α, β)
  • data: y₁ successes, n₁−y₁ failures
  • θ|y₁ ~ Beta(α+y₁, β+n₁−y₁)
  • data: y₂ successes, n₂−y₂ failures
  • θ|y₁,y₂ ~ Beta(α+y₁+y₂, β+n₁+n₂−y₁−y₂)
  • Informative conjugate priors have a pseudo-data
    interpretation
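The sequential updates above can be sketched directly; all counts are hypothetical:

```python
# Sequential Beta-binomial updating: successes add to alpha, failures to
# beta (the pseudo-data interpretation of the prior parameters).
def beta_update(alpha, beta, y, n):
    """Posterior Beta parameters after y successes in n trials."""
    return alpha + y, beta + n - y

alpha, beta = 2, 2                             # prior: 2 pseudo-successes, 2 pseudo-failures
a1, b1 = beta_update(alpha, beta, y=3, n=10)   # after the first batch
a2, b2 = beta_update(a1, b1, y=6, n=10)        # after the second batch

# One-shot update with the pooled data gives the same posterior
assert (a2, b2) == beta_update(alpha, beta, y=9, n=20)
```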

15
Multinomial with conjugate prior
  • θ ~ Dirichlet(α₁, α₂, α₃, …)
  • data: y₁₁, y₁₂, y₁₃, …
  • θ|y₁. ~ Dirichlet(α₁+y₁₁, α₂+y₁₂, α₃+y₁₃, …)
  • data: y₂₁, y₂₂, y₂₃, …
  • θ|y₁.,y₂. ~ Dirichlet(α₁+y₁₁+y₂₁, α₂+y₁₂+y₂₂,
    α₃+y₁₃+y₂₃, …)
  • Informative conjugate priors have a pseudo-data
    interpretation
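The Dirichlet update works the same way, category by category; counts here are hypothetical:

```python
# Conjugate Dirichlet-Multinomial updating: add each category's count
# to the corresponding Dirichlet parameter.
def dirichlet_update(alpha, counts):
    """Posterior Dirichlet parameters after observing the counts."""
    return [a + c for a, c in zip(alpha, counts)]

alpha = [1, 1, 1]                                # symmetric prior
post1 = dirichlet_update(alpha, [4, 3, 3])       # after batch y1.
post2 = dirichlet_update(post1, [2, 5, 3])       # after batches y1., y2.
```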

16
Normal Models
17
Normal model with known variance
  • Parameter of interest: the Normal mean θ
  • Normal prior N(μ₀, τ₀²), flat when τ₀² → ∞
  • Normal posterior

18
Informative Prior for θ
19
Noninformative Prior for θ
  • p(θ) ∝ 1 (flat prior)
  • Posterior θ|y ~ N(ȳ, σ²/n): the sample mean and
    its standard error
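A sketch of the precision-weighted posterior for the mean, with hypothetical prior settings and data:

```python
import statistics

# Posterior for a Normal mean with known variance: a precision-weighted
# average of the prior mean and the sample mean.
mu0, tau0_sq = 0.0, 4.0          # prior N(mu0, tau0^2)
sigma_sq = 1.0                   # known sampling variance
y = [1.2, 0.8, 1.5, 1.1, 0.9]
n, ybar = len(y), statistics.mean(y)

post_prec = 1 / tau0_sq + n / sigma_sq            # precisions add
post_var = 1 / post_prec
post_mean = post_var * (mu0 / tau0_sq + n * ybar / sigma_sq)
# As tau0^2 grows, post_mean -> ybar and post_var -> sigma_sq / n
```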
20
Normal variance with known mean
  • Parameter of interest: the Normal variance σ²
  • Inverse-Gamma prior invGamma(α, β)
  • non-informative improper prior when α→0, β→0:
    p(σ²) ∝ (σ²)⁻¹
  • Inverse-Gamma posterior
  • σ²|y ~ invGamma(α + n/2, β + ½Σᵢ(yᵢ−µ)²)
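The conjugate update above in code, with hypothetical prior settings and data:

```python
# Conjugate update for a Normal variance with known mean mu:
# invGamma(alpha, beta) -> invGamma(alpha + n/2, beta + 0.5*sum((y_i - mu)^2))
mu = 0.0
alpha, beta = 2.0, 1.0
y = [0.5, -1.2, 0.3, 2.0, -0.6]
n = len(y)
ss = sum((yi - mu) ** 2 for yi in y)   # sum of squared deviations from mu

alpha_post = alpha + n / 2
beta_post = beta + 0.5 * ss
post_mean_sigma2 = beta_post / (alpha_post - 1)   # invGamma mean, alpha_post > 1
```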

21
Informative Prior for σ²
  • φ ~ Gamma(α, β): p(φ) ∝ φ^(α−1) e^(−βφ)
  • Let σ² = 1/φ; then σ² ~ invGamma(α, β)
  • p(σ²) ∝ (σ²)^(−(α+1)) e^(−β/σ²)
  • pseudo-data interpretation of α, β
22
Noninformative prior for s2
23
Normal with both mean and variance unknown
  • A multi-parameter model

24
Joint, Marginal, Conditional Posterior
  • P(θ₁|y) = ∫ p(θ₁, θ₂|y) dθ₂
  • P(θ₂|y) = ∫ p(θ₁, θ₂|y) dθ₁
  • P(θ₁, θ₂|y) = p(θ₁|θ₂, y) p(θ₂|y)
    = p(θ₂|θ₁, y) p(θ₁|y)

25
Strategy for Multi-Parameter Inference
  • Specify a joint prior p(θ₁, θ₂)
  • Write down the likelihood
  • L(θ₁, θ₂) = P(y|θ₁, θ₂)
  • Derive the joint posterior
  • P(θ₁, θ₂|y) ∝ p(θ₁, θ₂) P(y|θ₁, θ₂)
  • Then make joint or marginal inferences

26
Strategy for Multi-Parameter Inference for Normal Data
  • Joint prior
  • p(μ, σ²) = p(μ|σ²) p(σ²)
  • Normal-InvGamma, or Uniform × 1/σ²
  • Likelihood
  • L(μ, σ²) = P(y|μ, σ²)
  • Joint posterior
  • P(μ, σ²|y) ∝ p(μ, σ²) P(y|μ, σ²)
  • Marginal posteriors
  • P(μ|y) = ∫ p(μ, σ²|y) dσ² is a Student-t
    (trick: change of variable)
  • P(σ²|y) = ∫ p(μ, σ²|y) dμ is an inv-Gamma

27
Ordinary Linear Regression
  • A multivariate case of the Normal model
  • Likelihood: y | β, σ² ~ N(Xβ, σ²I)

28
Non-informative Joint Prior
  • Non-informative prior: p(β, σ²) ∝ 1/σ²

29
Joint Posterior
30
Marginal, Conditional Posterior
  • Marginal for σ²: an inverse-Gamma
  • Conditional for β given σ²: a Normal
  • Marginal for β: a multivariate t
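Since the conditional posterior of β under the noninformative prior is Normal and centered at the least-squares solution, a sketch with hypothetical data can solve the normal equations directly:

```python
# Center of the conditional posterior of beta under p(beta, sigma2) ∝ 1/sigma2:
# the least-squares fit. Hypothetical data roughly following y = 1 + 2x.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.9, 5.2, 6.8, 9.1]
n = len(x)

# Solve the 2x2 normal equations (X'X) beta = X'y for [intercept, slope]
sx, sy = sum(x), sum(y)
sxx = sum(xi * xi for xi in x)
sxy = sum(xi * yi for xi, yi in zip(x, y))
det = n * sxx - sx * sx
intercept = (sxx * sy - sx * sxy) / det
slope = (n * sxy - sx * sy) / det
```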
31
Hierarchical Model
32
Rat tumor
  • Rat tumor experiments, j = 1, 2, …, 71
  • Model layers: hyper-prior → hyper-parameters →
    super-population prior → parameters → likelihood
33
Meta-analysis of beta-blockers
  • Beta-blocker trials, j = 1, 2, …, 22
  • Model layers: hyper-prior → hyper-parameters
    (μ₀, τ) → super-population prior → parameters
    (θ₁, …, θ₂₂) → likelihood (y₁, …, y₂₂)
34
Strategy for Hierarchical Model
  • Same as the multi-parameter model
  • Joint prior (hyper-prior × super-population
    prior) → joint posterior
  • Joint posterior → marginal posterior
35
The school example
  • p(μ, τ) ∝ 1
  • μ: overall treatment effect; τ: heterogeneity
    among schools
  • θⱼ ~ N(μ, τ)
  • θⱼ: effect at school j
  • yⱼ ~ N(θⱼ, σⱼ²), σⱼ² known
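Given values of (μ, τ), writing τ for the between-school standard deviation, each school effect has a Normal conditional posterior that shrinks yⱼ toward μ. A sketch using the classic eight-schools estimates, with hypothetical plug-in values of (μ, τ):

```python
# Conditional posterior of each school effect:
#   theta_j | y, mu, tau ~ N(V_j (mu/tau^2 + y_j/sigma_j^2), V_j),
#   V_j = 1 / (1/tau^2 + 1/sigma_j^2)   -- precision-weighted shrinkage.
y = [28, 8, -3, 7, -1, 1, 18, 12]        # school effect estimates
sigma = [15, 10, 16, 11, 9, 11, 10, 18]  # their known standard errors
mu, tau = 8.0, 5.0                       # hypothetical hyper-parameter values

post_means = []
for yj, sj in zip(y, sigma):
    v = 1 / (1 / tau**2 + 1 / sj**2)
    post_means.append(v * (mu / tau**2 + yj / sj**2))
# Each posterior mean lies between y_j and mu
```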
36
Empirical Bayes is not hierarchical
  • Empirical Bayes plugs in point estimates of the
    hyper-parameters, μ = μ̂ and τ = τ̂, instead of
    averaging over their posterior
  • μ: overall treatment effect; τ: heterogeneity
    among schools
  • θⱼ ~ N(μ̂, τ̂), θⱼ: effect at school j
  • yⱼ ~ N(θⱼ, σⱼ²), σⱼ² known
  • μ̂ = argmax_μ p(μ|y)
37
Hierarchical Linear Model (Wang)
  • Model layers: likelihood, super-population,
    hyper-prior
38
Noninformative Priors
  • A catalogue

39
(No Transcript)
40
Parameter Transformations
  • Jacobian: if φ = h(θ), then
    p(φ) = p(θ(φ)) |dθ/dφ|
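A numerical check of the Jacobian rule, using θ ~ Exponential(1) and the hypothetical transformation φ = 1/θ (the same change of variable used for the inverse-Gamma):

```python
import math, random

# Change of variables: p_phi(phi) = p_theta(h^{-1}(phi)) * |d theta / d phi|.
# With theta ~ Exponential(1) and phi = 1/theta:
#   p_phi(phi) = exp(-1/phi) / phi**2   for phi > 0.
def p_phi(phi):
    return math.exp(-1 / phi) / phi**2

# Midpoint-rule integral of the transformed density over (1, 5) ...
steps = 100_000
width = (5 - 1) / steps
integral = sum(p_phi(1 + (i + 0.5) * width) * width for i in range(steps))

# ... should match the Monte Carlo frequency of 1 < 1/theta < 5
random.seed(0)
draws = [1 / random.expovariate(1.0) for _ in range(200_000)]
freq = sum(1 < d < 5 for d in draws) / len(draws)
```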