Economics 105: Statistics
Provided by: markc7. Learn more at: http://www.davidson.edu
1
Economics 105: Statistics
  • Any questions?

2
Multicollinearity
  • Multicollinearity typically refers to severe, but imperfect, multicollinearity
  • It is a matter of degree, not existence
  • Consequences
    • Estimates of the coefficients are still unbiased
    • Standard errors of these estimates are increased
    • t-statistics are smaller
    • Estimates are sensitive to changes in specification (i.e., which variables are included in the model)
    • R² is largely unaffected

3
Multicollinearity
  • Detection
    • Calculate all the pairwise correlation coefficients; .7 or .8 is some cause for concern
    • VIFs (variance inflation factors) can also be calculated
    • The hallmark is a high R² but insignificant t-statistics
  • Remedies
    • Do nothing
    • Drop a variable
    • Transform the multicollinear variables (they need to have the same sign and similar magnitudes)
    • Get more data (i.e., increase the sample size)
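As a sketch of the detection step, VIFs can be computed by regressing each explanatory variable on the others; a minimal illustration in numpy, using simulated (hypothetical) data in which two regressors are nearly collinear:

```python
import numpy as np

def vif(X):
    """Variance inflation factors for the columns of X (no intercept column).

    VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing column j
    on an intercept plus the remaining columns.
    """
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        tss = (y - y.mean()) @ (y - y.mean())
        r2 = 1.0 - (resid @ resid) / tss
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.1, size=200)  # nearly collinear with x1
x3 = rng.normal(size=200)                  # unrelated regressor
v = vif(np.column_stack([x1, x2, x3]))
print(v)  # first two VIFs are large; the third is near 1
```

A common rule of thumb is that a VIF above 10 signals severe multicollinearity, which here flags x1 and x2 but not x3.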

4
Causes of Heteroskedasticity
  • Violation of (3)
  • Causes of heteroskedasticity
    • Learning
    • Discretion
    • Outliers
    • Model misspecification (e.g., should include a quadratic term; incorrect functional form: ln-ln vs. ln-linear)
    • Skewed explanatory variables (e.g., income, wage, education)

8
Consequences of Heteroskedasticity
  • OLS still produces unbiased estimators when the errors are heteroskedastic
  • However, the standard errors are incorrect and OLS is no longer BLUE
  • Any hypothesis tests conducted could therefore yield erroneous results
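These consequences can be illustrated with a small Monte Carlo sketch (hypothetical simulated data; the conventional OLS variance formula is computed by hand in numpy):

```python
import numpy as np

# Monte Carlo sketch: with heteroskedastic errors, the OLS slope stays
# unbiased, but the conventional standard-error formula misstates the
# true sampling variability of the estimator.
rng = np.random.default_rng(3)
n, reps = 200, 2000
x = rng.uniform(1, 10, n)
Z = np.column_stack([np.ones(n), x])
ZtZinv = np.linalg.inv(Z.T @ Z)
slopes, reported_se = [], []
for _ in range(reps):
    y = 1.0 + 2.0 * x + rng.normal(scale=x)   # error sd proportional to x
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ b
    s2 = (e @ e) / (n - 2)                    # conventional OLS formula
    slopes.append(b[1])
    reported_se.append(np.sqrt(s2 * ZtZinv[1, 1]))
slopes = np.array(slopes)
print(slopes.mean())                        # close to the true slope of 2
print(slopes.std(), np.mean(reported_se))   # actual spread vs. reported SE
```

The average slope estimate stays near the true value of 2 (unbiasedness), while the average reported standard error need not match the actual spread of the estimates across replications.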

9
Weighted Least Squares
  • If OLS estimators are not BLUE in the presence of
    heteroskedasticity, what are the best estimators?
  • Can weight the observations so that more weight
    is put on observations associated with levels of
    X having a smaller error variance
  • Transform the model so that the errors no longer
    exhibit heteroskedasticity
  • The basic model with heteroskedasticity is
    Yᵢ = β₀ + β₁Xᵢ + εᵢ,  Var(εᵢ) = σᵢ²

10
Weighted Least Squares
  • Dividing each observation by the associated standard deviation of the error transforms the model:
    Yᵢ/σᵢ = β₀(1/σᵢ) + β₁(Xᵢ/σᵢ) + εᵢ/σᵢ
  • OLS estimators for this transformed model are BLUE in the presence of heteroskedasticity
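A minimal numerical sketch of this transformation, assuming (hypothetically) that the error standard deviation is proportional to Xᵢ, so that Xᵢ can stand in for σᵢ:

```python
import numpy as np

# Hypothetical data: error sd proportional to x, i.e. Var(eps_i) = (0.5*x_i)^2,
# so dividing through by x_i makes the transformed errors homoskedastic.
rng = np.random.default_rng(1)
n = 500
x = rng.uniform(1, 10, n)
y = 2.0 + 3.0 * x + rng.normal(scale=0.5 * x)

# Transformed model: y/x = beta0 * (1/x) + beta1 + u, with homoskedastic u
Zw = np.column_stack([1.0 / x, np.ones(n)])
bw, *_ = np.linalg.lstsq(Zw, y / x, rcond=None)
b0_wls, b1_wls = bw
print(b0_wls, b1_wls)  # near the true values (2, 3)
```

Note that after the transformation the intercept of the original model becomes the coefficient on 1/x, and the original slope becomes the intercept.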

11
Weighted Least Squares
  • The error term in the transformed model is no longer heteroskedastic
  • The transformed model is called a weighted least squares model
  • Each observation is now weighted by the inverse of the standard deviation of the error
  • Major difficulty in estimating weighted least squares: we don't observe σᵢ
  • So we assume a form for σᵢ (for example, that it is proportional to Xᵢ) and transform the model

12
Detection of Heteroskedasticity
  • Most tests to detect heteroskedasticity share a
    similar approach
  • Look for some association between some function
    of the errors and some function of the
    explanatory variable(s)
  • OLS estimates are constructed such that there will be no correlation between the residuals and the explanatory variables, suggesting we work with some function of the errors
  • One approach is to regress ei2 on the
    explanatory variables and all combinations of the
    explanatory variables
  • Looking to see if larger values of the error are
    associated with some function of the explanatory
    variables

13
Detection of Heteroskedasticity
  • Informal test: graph the residuals
    • eᵢ versus predicted values of Y (Ŷ)
    • eᵢ versus each of the Xs (one at a time)
    • look for patterns
  • Formal tests
    • White
    • Breusch-Pagan
    • Goldfeld-Quandt
    • Park

14
Breusch-Pagan test
  1. Estimate the model by OLS
  2. Obtain the squared residuals, êᵢ²
  3. Regress êᵢ² on the explanatory variables, keeping the R² from this regression
  4. Do the whole-model F-test; rejection indicates heteroskedasticity. Assumes the errors are normally distributed.
  • Breusch, T.S. and A.R. Pagan (1979), "A Simple Test for Heteroscedasticity and Random Coefficient Variation," Econometrica 47, pp. 1287-1294.

15
Breusch-Pagan test (not needing normality)
  1. Estimate the model by OLS
  2. Obtain the squared residuals, êᵢ²
  3. Regress êᵢ² on the explanatory variables, keeping the R² from this auxiliary regression
  4. Test statistic: LM = n · R², asymptotically χ² with degrees of freedom equal to the number of explanatory variables
  5. Rejection indicates heteroskedasticity.
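A sketch of this nR² version of the test in plain numpy, using simulated (hypothetical) one-regressor data; in practice a packaged routine such as statsmodels' `het_breuschpagan` would be used:

```python
import numpy as np

def breusch_pagan_lm(y, X):
    """LM form of the Breusch-Pagan test: n * R^2 from regressing the
    squared OLS residuals on the regressors; asymptotically chi^2 with
    k degrees of freedom (k = number of slope regressors)."""
    n = len(y)
    Z = np.column_stack([np.ones(n), X])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e2 = (y - Z @ b) ** 2
    g, *_ = np.linalg.lstsq(Z, e2, rcond=None)
    u = e2 - Z @ g
    r2 = 1.0 - (u @ u) / ((e2 - e2.mean()) @ (e2 - e2.mean()))
    return n * r2

rng = np.random.default_rng(2)
n = 400
x = rng.uniform(1, 10, n)
y_het = 1 + 2 * x + rng.normal(scale=x)           # variance grows with x
y_hom = 1 + 2 * x + rng.normal(scale=3, size=n)   # constant variance
print(breusch_pagan_lm(y_het, x[:, None]))  # well above 3.84, the chi^2(1) 5% value
print(breusch_pagan_lm(y_hom, x[:, None]))  # small by comparison
```

With one regressor the statistic is compared to the χ²(1) 5% critical value of 3.84: the heteroskedastic sample rejects, the homoskedastic one does not.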

16
White test
  1. Estimate the model by OLS
  2. Obtain the squared residuals, êᵢ²
  3. Regress êᵢ² on the explanatory variables, their squares, and their cross products, keeping the R² from this regression
  4. Do the whole-model F-test; rejection indicates heteroskedasticity
  • White, H. (1980), "A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity," Econometrica 48, pp. 817-838.

17
White test
  • Adds squares and cross products of all the Xs
  • Advantages
    • no assumptions about the nature of the heteroskedasticity
  • Disadvantages
    • Rejection (a statistically significant White test statistic) may be caused by heteroskedasticity or by specification error; it's an inconclusive test
    • the number of covariates rises quickly
    • so one could instead regress êᵢ² on the predicted values Ŷᵢ and their squares Ŷᵢ², since the predicted values are functions of the Xs (and the estimated parameters), and do the F-test
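The steps above can be sketched in numpy with a hypothetical two-regressor example; the auxiliary regression then has 5 regressors (x₁, x₂, x₁², x₂², x₁x₂), so the statistic is compared to a χ²(5) critical value:

```python
import numpy as np

def white_test_stat(y, X):
    """White test statistic: n * R^2 from regressing the squared OLS
    residuals on the Xs, their squares, and their cross products."""
    n, k = X.shape
    Z = np.column_stack([np.ones(n), X])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e2 = (y - Z @ b) ** 2
    cols = [np.ones(n)]
    for i in range(k):
        cols.append(X[:, i])
        cols.append(X[:, i] ** 2)
        for j in range(i + 1, k):
            cols.append(X[:, i] * X[:, j])
    W = np.column_stack(cols)
    g, *_ = np.linalg.lstsq(W, e2, rcond=None)
    u = e2 - W @ g
    r2 = 1.0 - (u @ u) / ((e2 - e2.mean()) @ (e2 - e2.mean()))
    return n * r2  # compare to chi^2 with (number of auxiliary regressors) df

rng = np.random.default_rng(4)
n = 400
x1 = rng.uniform(1, 10, n)
x2 = rng.normal(size=n)
X = np.column_stack([x1, x2])
y_het = 1 + 2 * x1 + x2 + rng.normal(scale=x1)        # heteroskedastic
y_hom = 1 + 2 * x1 + x2 + rng.normal(scale=3, size=n)  # homoskedastic
print(white_test_stat(y_het, X))  # large; chi^2(5) 5% critical value is 11.07
print(white_test_stat(y_hom, X))  # small by comparison
```

This illustrates the disadvantage noted above: even with k = 2 the auxiliary regression already needs 5 covariates, and the count grows roughly quadratically in k.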