Hypothesis Testing in the Linear Regression Model - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
Hypothesis Testing in the Linear Regression Model
  • Based on Greene's Note 8

2
Classical Hypothesis Testing
  • We are interested in using the linear regression model to establish or cast doubt on the validity of a theory about the real-world counterpart to our statistical model. The model is used to test hypotheses about the underlying data generating process.

3
Inference in the Linear Model
  • Hypothesis testing
  • Formulating hypotheses: linear restrictions as a general framework
  • Substantive restrictions: What is a "testable hypothesis"?
  • Nested vs. nonnested models
  • Methodological issues
  • Classical (likelihood-based) approach: Are the data consistent with the hypothesis?
  • Bayesian approach: How do the data affect our prior odds? → The posterior odds ratio.

4
Testing a Hypothesis About a Parameter: Confidence Interval
  • bk = the point estimate
  • Std.Dev[bk] = sqrt(σ²(X′X)⁻¹kk) = vk
  • Assume normality of ε for now
  • bk ~ N[βk, vk²] for the true βk
  • (bk - βk)/vk ~ N[0,1]
  • Consider a range of plausible values of βk given the point estimate bk: bk ± sampling error.
  • Measured in standard error units, |(bk - βk)/vk| < z
  • Larger z → greater probability (confidence)
  • Given normality, e.g., z = 1.96 → 95%, z = 1.645 → 90%
  • Plausible range for βk then is bk ± z·vk (see the sketch below)
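As a numeric illustration of the interval bk ± z·vk above, here is a minimal Python sketch (numpy/scipy). The simulated data, the coefficient index k, and the 95% level are assumptions made only for this example; σ² is treated as known so the normal critical value applies.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, K = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
beta_true = np.array([1.0, 0.5, -0.2])
sigma = 1.0                                   # treat sigma^2 as known for now
y = X @ beta_true + rng.normal(scale=sigma, size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)         # least squares estimate
XtX_inv = np.linalg.inv(X.T @ X)

k = 1                                         # coefficient of interest
v_k = np.sqrt(sigma**2 * XtX_inv[k, k])       # Std.Dev[bk] with sigma^2 known
z = stats.norm.ppf(0.975)                     # 1.96 for a 95% interval
print(f"b_{k} = {b[k]:.3f}, 95% interval: [{b[k] - z * v_k:.3f}, {b[k] + z * v_k:.3f}]")
```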

5
Estimating the Confidence Interval
  • Assume normality of ε for now
  • bk ~ N[βk, vk²] for the true βk
  • (bk - βk)/vk ~ N[0,1]
  • vk = sqrt(σ²(X′X)⁻¹kk) is not known because σ² must be estimated.
  • Using s² instead of σ², (bk - βk)/est.(vk) ~ t[n-K].
  • (Proof: the ratio of a standard normal to sqrt(chi-squared/df) is pursued in your text.)
  • Use critical values from the t distribution instead of the standard normal (see the sketch below).
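The same interval with σ² replaced by s² = e′e/(n-K) and the normal critical value replaced by a t[n-K] critical value; again, the data are simulated purely for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, K = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
y = X @ np.array([1.0, 0.5, -0.2]) + rng.normal(size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
s2 = e @ e / (n - K)                           # s^2 = e'e/(n-K) estimates sigma^2
XtX_inv = np.linalg.inv(X.T @ X)

k = 1
se_k = np.sqrt(s2 * XtX_inv[k, k])             # est.(vk)
t_crit = stats.t.ppf(0.975, df=n - K)          # t critical value, not 1.96
print(f"95% CI for beta_{k}: [{b[k] - t_crit * se_k:.3f}, {b[k] + t_crit * se_k:.3f}]")
```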

6
Testing a Hypothesis using a Confidence Interval
  • Given the range of plausible values:
  • The confidence interval approach.
  • Testing the hypothesis that a coefficient equals zero or some other particular value:
  • Is the hypothesized value in the confidence interval?
  • Is the hypothesized value within the range of plausible values? (See the sketch below.)
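The confidence-interval test reduces to a containment check. A tiny sketch, with purely hypothetical numbers standing in for bk, its standard error, and the t critical value:

```python
def in_confidence_interval(b_k, se_k, beta0, t_crit):
    """Keep H0: beta_k = beta0 iff beta0 lies inside bk +/- t_crit * se_k."""
    lower, upper = b_k - t_crit * se_k, b_k + t_crit * se_k
    return lower <= beta0 <= upper

# Hypothetical numbers, for illustration only:
print(in_confidence_interval(b_k=0.48, se_k=0.10, beta0=0.0, t_crit=1.98))  # False -> reject H0
```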

7
Wald Distance Measure
  • Testing more generally about a single parameter.
  • Sample estimate is bk
  • Hypothesized value is βk
  • How far is βk from bk? If it is too far, the hypothesis is inconsistent with the sample evidence.
  • Measure distance in standard error units: t = (bk - βk)/est.(vk)
  • If t is large (larger than the critical value), reject the hypothesis (see the sketch below).
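Equivalently, the distance measure itself can be compared with the critical value. A sketch of the two-sided t test; the numeric inputs are hypothetical placeholders, not values from the slides.

```python
from scipy import stats

def t_test_one_coef(b_k, se_k, beta0, df, alpha=0.05):
    """Two-sided test of H0: beta_k = beta0 via t = (bk - beta0)/est.(vk)."""
    t = (b_k - beta0) / se_k
    t_crit = stats.t.ppf(1 - alpha / 2, df=df)
    return t, abs(t) > t_crit                  # (statistic, reject H0?)

# Hypothetical values, for illustration only:
t_stat, reject = t_test_one_coef(b_k=0.48, se_k=0.10, beta0=0.0, df=97)
print(f"t = {t_stat:.2f}, reject H0: {reject}")
```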

8
The Wald Statistic
9
Robust Tests
  • The Wald test generally will (when properly constructed) be more robust to failures of the narrow model assumptions than the t or F tests.
  • Reason: it is based on robust variance estimators and asymptotic results that hold in a wide range of circumstances.
  • Analysis: later in the course, after developing asymptotics.

10
The General Linear Hypothesis: H0: Rβ - q = 0
  • A unifying departure point: Regardless of the hypothesis, least squares is unbiased.
  • Two approaches:
  • (1) Is Rb - q close to 0? Basing the test on the discrepancy vector m = Rb - q: using the Wald criterion, m′(Var[m])⁻¹m has a chi-squared distribution with J degrees of freedom.
  • But Var[m] = R σ²(X′X)⁻¹R′.
  • If we use our estimate of σ², we get an F[J, n-K] instead.
  • (Note: this is based on using e′e/(n-K) to estimate σ².)
  • (2) We know that imposing the restrictions leads to a loss of fit. R² must go down. Does it go down a lot? (I.e., significantly?)
  • With R² = the unrestricted model fit and R*² = the restricted model fit,
  • F = [(R² - R*²)/J] / [(1 - R²)/(n-K)] ~ F[J, n-K]
  • These are the same in the linear model (see the sketch below).
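A sketch of both forms of the test on simulated data: the Wald/F statistic built from the discrepancy vector m = Rb - q, and the loss-of-fit statistic built from the two R² values. The data and the particular exclusion restriction H0: β3 = β4 = 0 are assumptions for this example; the two statistics coincide, as the last bullet states.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, K = 200, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
y = X @ np.array([1.0, 0.5, 0.0, 0.0]) + rng.normal(size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
s2 = e @ e / (n - K)
XtX_inv = np.linalg.inv(X.T @ X)

# H0: beta_3 = beta_4 = 0, written as R beta - q = 0 with J = 2 restrictions
R = np.array([[0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
q = np.zeros(2)
J = R.shape[0]

# (1) Wald/F form based on the discrepancy vector m = Rb - q
m = R @ b - q
Vm = s2 * R @ XtX_inv @ R.T                   # estimated Var[m]
F_wald = (m @ np.linalg.solve(Vm, m)) / J

# (2) Loss-of-fit form: compare R^2 of the unrestricted and restricted models
def r_squared(Xmat, yvec):
    bb = np.linalg.lstsq(Xmat, yvec, rcond=None)[0]
    resid = yvec - Xmat @ bb
    return 1 - resid @ resid / np.sum((yvec - yvec.mean()) ** 2)

R2_u = r_squared(X, y)                        # unrestricted fit
R2_r = r_squared(X[:, :2], y)                 # restricted fit (drop the restricted columns)
F_fit = ((R2_u - R2_r) / J) / ((1 - R2_u) / (n - K))

print(F_wald, F_fit)                          # identical in the linear model
print("5% critical value:", stats.f.ppf(0.95, J, n - K))
```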

11
t and F statistics
  • An important relationship between t and F:
  • For a single restriction, m = r′b - q. The variance is r′(Var[b])r.
  • The distance measure is m / (standard error of m).
  • The t-ratio is the square root of the F ratio (see the sketch below).
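A quick numerical check of the last bullet on simulated data; the single exclusion restriction H0: β2 = 0 is an assumption made only for this example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 120, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
y = X @ np.array([1.0, 0.5, -0.2]) + rng.normal(size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
s2 = e @ e / (n - K)
V = s2 * np.linalg.inv(X.T @ X)               # estimated Var[b]

# Single restriction H0: beta_2 = 0, i.e. m = r'b - q with r = (0,1,0), q = 0
r = np.array([0.0, 1.0, 0.0])
q = 0.0
m = r @ b - q
t_stat = m / np.sqrt(r @ V @ r)               # distance in standard error units
F_stat = m**2 / (r @ V @ r)                   # Wald/F form with J = 1

print(t_stat**2, F_stat)                      # t^2 equals F for one restriction
```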

12
Lagrange Multiplier Statistics
  • Specific to the classical model: Recall the Lagrange multipliers, λ = [R(X′X)⁻¹R′]⁻¹m.
  • Suppose we just test H0: λ = 0, using the Wald criterion. The resulting test statistic is just JF, where F is the F statistic above. This is to be taken as a chi-squared statistic. (Note, again, this uses e′e/(n-K) to estimate σ². If e′e/n is used instead, the more formal, likelihood-based statistic results; see the sketch below.)
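A sketch of this equivalence on simulated data with a hypothetical two-restriction hypothesis: the Lagrange multipliers λ = [R(X′X)⁻¹R′]⁻¹m are computed, a Wald test of H0: λ = 0 is formed with σ² estimated by e′e/(n-K), and it reproduces J·F.

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 200, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
y = X @ np.array([1.0, 0.5, 0.0, 0.0]) + rng.normal(size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
s2 = e @ e / (n - K)
XtX_inv = np.linalg.inv(X.T @ X)

R = np.array([[0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
q = np.zeros(2)
J = R.shape[0]
m = R @ b - q

A = R @ XtX_inv @ R.T
lam = np.linalg.solve(A, m)                   # Lagrange multipliers
var_lam = s2 * np.linalg.inv(A)               # Var[lambda] under H0, sigma^2 estimated by s^2

wald_lambda = lam @ np.linalg.solve(var_lam, lam)   # Wald test of H0: lambda = 0
F_stat = (m @ np.linalg.solve(s2 * A, m)) / J
print(wald_lambda, J * F_stat)                # equal: the lambda-based Wald statistic is J*F
```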

13
Example
  • Time series regression:
  • logG = β1 + β2 logY + β3 logPG + β4 logPNC + β5 logPUC + β6 Year + ε
  • Period = 1953 - 2004. A significant event occurs in October 1973. We will be interested to know whether the model for 1953 to 1973 is the same as for 1974 to 2004. Note that all coefficients in the model are elasticities. (A setup sketch follows below.)
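The slides refer to a gasoline-demand time series from Greene's text that is not reproduced in this transcript. The sketch below only shows how such a log-log specification could be set up with statsmodels, using randomly generated placeholder columns that borrow the slide's variable names (G, Y, PG, PNC, PUC, Year); the numbers are not the actual data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder data with the slide's variable names; NOT Greene's gasoline data.
rng = np.random.default_rng(0)
years = np.arange(1953, 2005)
df = pd.DataFrame({
    "Year": years,
    "G":   np.exp(rng.normal(size=years.size)),   # gasoline consumption (placeholder)
    "Y":   np.exp(rng.normal(size=years.size)),   # income (placeholder)
    "PG":  np.exp(rng.normal(size=years.size)),   # gasoline price (placeholder)
    "PNC": np.exp(rng.normal(size=years.size)),   # new car price (placeholder)
    "PUC": np.exp(rng.normal(size=years.size)),   # used car price (placeholder)
})
for c in ["G", "Y", "PG", "PNC", "PUC"]:
    df["l" + c] = np.log(df[c])

# logG = b1 + b2 logY + b3 logPG + b4 logPNC + b5 logPUC + b6 Year + e
model = smf.ols("lG ~ lY + lPG + lPNC + lPUC + Year", data=df).fit()
print(model.summary())
```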

14
Example
  • Simple Hypothesis
  • Test about one parameter: H0: β5 = 0
  • Joint Hypotheses
  • H0: β4 = β5 = 0, using the Wald statistic (see the sketch below)
  • Chow Test: Test for a Structural Break
  • Structural break at the end of 1973
  • J Test
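Continuing the placeholder fit from the sketch above (it reuses the DataFrame df and the same formula), the simple and joint hypotheses on this slide could be run as t and F tests. The variable names are the placeholder ones, not Greene's.

```python
import statsmodels.formula.api as smf

# Reuses the placeholder DataFrame `df` built in the previous sketch.
model = smf.ols("lG ~ lY + lPG + lPNC + lPUC + Year", data=df).fit()

print(model.t_test("lPUC = 0"))               # simple hypothesis  H0: beta_5 = 0
print(model.f_test("lPNC = 0, lPUC = 0"))     # joint hypothesis   H0: beta_4 = beta_5 = 0
```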

15
Algebra for the Chow Test
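The algebra on this slide is not reproduced in the transcript. What follows is a sketch of the usual sum-of-squared-residuals form of the Chow test, F = [(e′e_pooled - e′e_1 - e′e_2)/K] / [(e′e_1 + e′e_2)/(n1 + n2 - 2K)], on simulated data; the subsample sizes (21 and 31 observations, echoing 1953-1973 and 1974-2004) and the data itself are assumptions made only for illustration.

```python
import numpy as np
from scipy import stats

def sse(X, y):
    """Sum of squared least-squares residuals."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b
    return e @ e

rng = np.random.default_rng(0)
n1, n2, K = 21, 31, 3                         # e.g. 1953-1973 and 1974-2004 subsamples
X1 = np.column_stack([np.ones(n1), rng.normal(size=(n1, K - 1))])
X2 = np.column_stack([np.ones(n2), rng.normal(size=(n2, K - 1))])
y1 = X1 @ np.array([1.0, 0.5, -0.2]) + rng.normal(size=n1)
y2 = X2 @ np.array([1.5, 0.1, -0.2]) + rng.normal(size=n2)   # coefficients shift after the break

ee_pooled = sse(np.vstack([X1, X2]), np.concatenate([y1, y2]))  # restricted: one coefficient vector
ee_1, ee_2 = sse(X1, y1), sse(X2, y2)                           # unrestricted: separate regressions

F = ((ee_pooled - ee_1 - ee_2) / K) / ((ee_1 + ee_2) / (n1 + n2 - 2 * K))
print("Chow F =", F, " 5% critical value:", stats.f.ppf(0.95, K, n1 + n2 - 2 * K))
```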
16
J Test
  • Two Competitors:
  • Model X: logG = β1 + β2 logY + β3 logPG + β4 logPNC + β5 logPUC + β6 Year + ε
  • Model Z: logG = γ1 + γ2 logY + γ3 logPG + γ4 logPN + γ5 logPD + γ6 PS + γ7 Year + ε
  • Note, each model has its own set of regressors; we can't obtain either one as a restriction on the other.
  • Strategy: See if the fitted values from the alternative model have any explanatory power in the null model. If so, reject the null (see the sketch below).
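A minimal sketch of that strategy on simulated placeholder data: fit the alternative model Z, append its fitted values to the null model X, and t-test the added coefficient. The regressor matrices here are generic placeholders rather than the gasoline-demand variables.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 52
# Two non-nested candidate specifications with their own regressors (placeholder data)
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # regressors of the null model
Z = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])   # regressors of the alternative
y = X @ np.array([1.0, 0.5, -0.2]) + rng.normal(size=n)

def ols(A, yvec):
    b = np.linalg.lstsq(A, yvec, rcond=None)[0]
    return b, A @ b

# Step 1: fitted values from the alternative model Z
_, y_hat_Z = ols(Z, y)

# Step 2: add them to the null model X and t-test their coefficient
Xa = np.column_stack([X, y_hat_Z])
b, fitted = ols(Xa, y)
e = y - fitted
K = Xa.shape[1]
s2 = e @ e / (n - K)
V = s2 * np.linalg.inv(Xa.T @ Xa)
t_stat = b[-1] / np.sqrt(V[-1, -1])           # significant -> reject the null model
print("J-test t statistic:", t_stat, " 5% critical value:", stats.t.ppf(0.975, n - K))
```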