Perfect multicollinearity - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Perfect multicollinearity


1
Violations of the classical regression assumptions
  • (instructor name and affiliation)

2
Outline
  • Multicollinearity
  • Heteroskedasticity
  • Autocorrelation

3
The problem of multicollinearity
  • 1. What is multicollinearity?
  • 2. How to detect multicollinearity
  • 3. Consequences of multicollinearity
  • 4. Remedial measures

4
1. What is multicollinearity?

5
1. What is multicollinearity?
  • Perfect multicollinearity: an exact linear relationship among the explanatory variables

6
Imperfect multicollinearity
  • The explanatory variables are highly, but not exactly, linearly related
  • The OLS estimates can still be computed
  • but they become very imprecise

7
Imperfect multicollinearity
  • Under high collinearity the OLS estimators remain unbiased
  • but their variances become very large
  • see the derivation of equation (11.2) on p. 147

8
The variance-inflating factor (VIF)
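
For reference, the standard definition of the variance-inflating factor of the j-th regressor is

  VIF_j = 1 / (1 - R_j^2)

where R_j^2 is the R2 from the auxiliary regression of x_j on all the other explanatory variables. VIF_j equals 1 when x_j is uncorrelated with the other regressors and grows without bound as R_j^2 approaches 1.
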
9
How multicollinearity inflates the variances of the OLS estimators
10
2. How to detect multicollinearity (p. 149)
  • (1) High R2 but few significant t ratios.
  • (2) High pair-wise correlations among regressors.
  • (3) Examination of partial correlations.
  • (4) Auxiliary regressions.
  • (5) Eigenvalue and condition index.
  • (6) Tolerance and variance inflation factor.

11
(1) High R2 but few significant t ratios
  • The regression has a high R2, yet few of the individual t ratios are statistically significant
  • This is the classic symptom of multicollinearity, even though the overall F test will usually still reject the hypothesis that all slope coefficients are zero

12
(2) High pair-wise correlations among regressors
  • If the pair-wise correlation coefficient between two regressors is high, say in excess of 0.8, then multicollinearity is a serious problem
  • (Gujarati 1995, 335)
  • However, a high pair-wise correlation is a sufficient but not a necessary condition
  • Multicollinearity can exist even when the pair-wise correlations are comparatively low
  • This is especially likely when the model has more than two explanatory variables
  • so pair-wise correlations alone are not a reliable diagnostic

13
(3) Examination of partial correlations
  • Examine the partial correlation coefficients among the regressors
  • A high overall R2 combined with low partial correlations suggests multicollinearity

14
(4) Auxiliary regressions
  • Regress each explanatory variable on all of the remaining explanatory variables
  • These are called the auxiliary regressions
  • Obtain the R2 of each auxiliary regression
  • Compare each auxiliary R2 with the overall R2, or test it with an F test (see the statistic below)
  • Or
  • use Klein's rule of thumb: collinearity is troublesome only if an auxiliary R2 exceeds the overall R2
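
For reference, one standard form of this F test: if the original model has k explanatory variables and the auxiliary regression of x_j on the other k - 1 variables (plus an intercept) yields R_j^2, then under the hypothesis that x_j is unrelated to the remaining regressors

  F_j = [ R_j^2 / (k - 1) ] / [ (1 - R_j^2) / (n - k) ]

follows an F distribution with (k - 1, n - k) degrees of freedom, where n is the sample size; a significant F_j flags x_j as collinear with the other regressors.
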

15
(5) Eigenvalues and condition index
  • A condition index below 10 indicates that multicollinearity is not serious (see the definition below)
  • Between 10 and 30: moderate to strong multicollinearity
  • Above 30: severe multicollinearity
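
The condition index quoted above is conventionally computed from the eigenvalues of the X'X matrix:

  condition number κ = λ_max / λ_min,   condition index CI = sqrt(λ_max / λ_min)

so the thresholds of 10 and 30 for the CI correspond to condition numbers of 100 and 1,000. (Applied to a suitably scaled design matrix X, numpy's np.linalg.cond returns this same CI, since the 2-norm condition number of X is the square root of that of X'X.)
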

16
(6) Tolerance and variance inflation factor
  • A VIF above 10 (i.e., an auxiliary R2 above 0.9) is usually taken to signal serious collinearity (see the sketch below)
  • TOL = 1 / VIF
  • A tolerance close to 1 means the variable is nearly uncorrelated with the other regressors
  • A tolerance close to 0 means it is nearly a perfect linear combination of them
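
A minimal computational sketch of these two diagnostics using statsmodels; the simulated data and the variable names X1, X2, X3 are illustrative assumptions, not the slides' example:

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Simulated regressors; X2 is built to be nearly collinear with X1
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=100)
x3 = rng.normal(size=100)
exog = sm.add_constant(pd.DataFrame({"X1": x1, "X2": x2, "X3": x3}))

for i, name in enumerate(exog.columns):
    if name == "const":
        continue  # the intercept's VIF is not of interest
    vif = variance_inflation_factor(exog.values, i)
    print(f"{name}: VIF = {vif:.1f}, TOL = {1.0 / vif:.3f}")

Here X1 and X2 should show very large VIFs (tolerances near 0), while X3 stays near VIF = 1.
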

17
3. Consequences of multicollinearity
  • (1) Large variances of the OLS estimators (see the formula below)
  • (2) Wider confidence intervals
  • (3) Insignificant t ratios
  • (4) A high R2 but few significant t ratios
  • (5) Sensitivity of the OLS estimators and their standard errors to small changes in the data
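
For the two-regressor case, consequence (1) can be written out explicitly (a standard result, stated with the regressors in deviation-from-mean form):

  var(b_2) = σ^2 / [ Σ x_2i^2 · (1 - r_23^2) ]

where r_23 is the correlation between the two regressors. The factor 1 / (1 - r_23^2) is exactly the VIF, so as r_23 approaches ±1 the variance of b_2 grows without limit, which is what widens the confidence intervals and depresses the t ratios in (2)-(4).
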

18
4. Remedial measures
  • (1) A priori information
  • (2) Combining cross-sectional and time-series data
  • (3) Dropping a variable(s) and specification bias
  • (4) Transformation of variables
  • (5) Additional or new data
  • (6) Reducing collinearity in polynomial regressions
  • (7) Other methods: factor analysis or principal components

19
(1) A priori information
  • Use prior information from economic theory or from earlier empirical studies
  • to impose a restriction linking the coefficients of the collinear variables
  • Substitute the restriction into the model
  • and estimate the restricted model, which is free of the collinearity
  • (Gujarati 1995, 340)

20
(2) Combining cross-sectional and time-series data
  • Estimate some of the coefficients from cross-sectional data
  • Then impose those estimates when fitting the time-series regression
  • (pooling the data in this way assumes the cross-sectional estimates also hold in the time-series context)

21
(3) Dropping a variable and specification bias
  • Dropping one of the collinear variables relieves the multicollinearity
  • However, omitting a relevant variable creates specification bias
  • (model under-specification)

22
(4) Transformation of variables
  • e.g., the first-difference form
  • or the ratio transformation (dividing through by one of the variables)
  • These transformations can create problems of their own, for example serially correlated or heteroskedastic errors
  • so apply them with care

23
(5) Additional or new data
  • A larger sample increases the variation in the regressors (Σx2), so the variances of the OLS estimators fall

24
(6) Reducing collinearity in polynomial regressions
  • Express the explanatory variable as deviations from its mean
  • or use orthogonal polynomials

25
(7) Other methods
  • Factor analysis or principal components can replace the collinear regressors with a smaller set of uncorrelated components (see the sketch below)
  • Techniques such as ridge regression are also used, although they raise issues of their own
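
A minimal sketch of the principal-components idea with numpy; the simulated data, the helper name pcr_fit, and the choice of a single component are illustrative assumptions, not the slides' own example:

import numpy as np

def pcr_fit(X, y, n_components):
    """Regress y on the first n_components principal components of X."""
    Xc = X - X.mean(axis=0)                      # center the regressors
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt.T[:, :n_components]              # component scores
    Z1 = np.column_stack([np.ones(len(y)), Z])   # add an intercept
    gamma, *_ = np.linalg.lstsq(Z1, y, rcond=None)
    return gamma                                 # intercept + component coefficients

# Two nearly collinear regressors collapse to one informative component
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
X = np.column_stack([x1, 0.95 * x1 + 0.05 * rng.normal(size=200)])
y = 1.0 + 2.0 * x1 + rng.normal(size=200)
print(pcr_fit(X, y, n_components=1))

The usual caveat is that the extracted components can be hard to interpret economically.
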

26
Heteroskedasticity
  • 1. Definition
  • 2. Causes
  • 3. Consequences for OLS
  • 4. Detection
  • 5. Remedies

27
1. Definition
  • The variance of the error term is not constant across observations

28
1. Equal error variances (homoskedasticity)
29
1. Unequal error variances (heteroskedasticity)
30
2. Causes of heteroskedasticity
  • As people learn, their errors of behavior become smaller over time
  • The variances of the error terms are positively correlated with the independent variables
  • As data-collecting techniques improve, the error variance is likely to decrease
  • Heteroskedasticity can also arise as a result of outliers in the data
  • Another source of heteroskedasticity is mis-specification of the regression model

31
(1) As people learn, their errors become smaller
32
(2) Error variances correlated with the independent variable
  • As income rises, people have more discretion in how to spend it
  • so the dispersion of the dependent variable increases with income

33
(3) Improvements in data-collecting techniques reduce the error variance
34
(4) Outliers in the data
35
(5) Mis-specification of the model
  • e.g., omitted variables

36
3. Consequences for OLS
  • 1) Heteroskedasticity, Unbiasedness, and Consistency
  • 2) Heteroskedasticity and Standard Errors of OLS Coefficients
  • The consequences of heteroskedasticity are that b is still unbiased and consistent, but its variance will be incorrect. In other words, OLS is inefficient.
  • Additionally, our conventional test statistics are invalid.

37
4. Detection
  • Commonly used formal tests:
  • 1) The White Test
  • 2) The Breusch-Pagan-Godfrey Test

38
Plotting the squared residuals against the fitted values (figure)
39
Typical patterns of the squared residuals (figure)
40
The White Test
  • Run the OLS regression and obtain the residuals
  • Regress the squared residuals on the original regressors, their squares, and their cross products
  • Obtain the R2 of this auxiliary regression
  • Compute n·R2
  • Under the null of homoskedasticity, n·R2 is asymptotically chi-squared
  • with degrees of freedom equal to the number of regressors in the auxiliary regression
  • (Gujarati 1995, 379-80)
  • Test statistic: n·R2 ~ χ2(df) (see the sketch below)
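
A minimal sketch of the White and Breusch-Pagan-Godfrey tests as implemented in statsmodels; the simulated data are illustrative, not the slides' example:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white, het_breuschpagan

# Simulated data whose error standard deviation grows with x
rng = np.random.default_rng(2)
x = rng.uniform(1, 10, size=200)
y = 2.0 + 0.5 * x + rng.normal(scale=x, size=200)

res = sm.OLS(y, sm.add_constant(x)).fit()

lm_w, lm_w_p, _, _ = het_white(res.resid, res.model.exog)
lm_bp, lm_bp_p, _, _ = het_breuschpagan(res.resid, res.model.exog)
print(f"White:         n*R2 = {lm_w:.2f}, p = {lm_w_p:.4f}")
print(f"Breusch-Pagan: LM   = {lm_bp:.2f}, p = {lm_bp_p:.4f}")

Small p-values lead to rejection of the null of homoskedasticity.
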

41
5. Remedies
  • If the error variances are known, use weighted least squares; if they are not known, either rely on White's heteroskedasticity-consistent standard errors, or make a plausible assumption about the pattern of the heteroskedasticity, transform the model accordingly (for example, deflate by the offending variable or take logs), and re-estimate (see the sketch below)
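
A minimal sketch of the two branches of that remedy in statsmodels: White-type heteroskedasticity-consistent standard errors, and weighted least squares under the assumed, purely illustrative pattern Var(u_i) proportional to x_i^2:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.uniform(1, 10, size=200)
y = 2.0 + 0.5 * x + rng.normal(scale=x, size=200)
X = sm.add_constant(x)

robust = sm.OLS(y, X).fit(cov_type="HC1")     # keep OLS, correct the standard errors
wls = sm.WLS(y, X, weights=1.0 / x**2).fit()  # re-weight if Var(u_i) ~ x_i^2 is assumed
print(robust.bse)     # heteroskedasticity-consistent standard errors
print(wls.params)     # WLS coefficient estimates
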

42
Autocorrelation
  • 1. Definition
  • 2. Causes
  • 3. Consequences for OLS
  • 4. Detection
  • 5. Remedies

43
Patterns of autocorrelation (figure)
44
1. Definition
  • The term autocorrelation (or serial correlation) may be defined as correlation between members of a series of observations ordered in time (as in time-series data) or in space (as in cross-sectional data) (Gujarati 1995, 400-1).

45
2. Causes
  • However, serial correlation could indeed be error autocorrelation, but it could also be the result of dynamic misspecification, parameter nonconstancy, incorrect functional form, and so on (Granato 1991, 124).

46
2. Causes
  • According to Kennedy (1992, 119), there are several reasons why serial correlation arises:
  • Spatial autocorrelation
  • Prolonged influence of shocks
  • Inertia
  • Data manipulation
  • Misspecification

47
3. Consequences for OLS
  • (1) No Lagged Endogenous Variable
  • (2) Lagged Endogenous Variable

48
(1) No Lagged Endogenous Variable
  • The OLS estimators remain unbiased and consistent
  • but they are no longer efficient
  • and the conventional standard errors and test statistics are invalid

49
(2) Lagged Endogenous Variable
  • OLS estimation becomes inconsistent

50
4. Detection
  • Inspect the residuals graphically and apply formal tests (see the sketch below)
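
A minimal sketch of two common diagnostics in statsmodels, the Durbin-Watson statistic and the Breusch-Godfrey LM test; the AR(1) data below are illustrative, not the slides' example:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

# Simulated regression with AR(1) errors: u_t = 0.7*u_{t-1} + e_t
rng = np.random.default_rng(4)
n = 200
x = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.7 * u[t - 1] + rng.normal()
y = 1.0 + 0.5 * x + u

res = sm.OLS(y, sm.add_constant(x)).fit()
print("Durbin-Watson d:", durbin_watson(res.resid))   # values near 2 suggest no AR(1)
lm, lm_p, _, _ = acorr_breusch_godfrey(res, nlags=1)
print("Breusch-Godfrey LM p-value:", lm_p)
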

51
5. Remedies
  • Generalized differencing (estimated GLS); see the sketch below
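
One standard way to carry out generalized differencing is feasible GLS with an estimated AR(1) coefficient (Cochrane-Orcutt-type iteration); a minimal sketch with statsmodels' GLSAR, again on illustrative data:

import numpy as np
import statsmodels.api as sm

# Illustrative data with AR(1) errors (not the slides' example)
rng = np.random.default_rng(5)
n = 200
x = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.7 * u[t - 1] + rng.normal()
y = 1.0 + 0.5 * x + u

# GLSAR estimates rho from the residuals and re-fits by generalized differencing
model = sm.GLSAR(y, sm.add_constant(x), rho=1)
results = model.iterative_fit(maxiter=10)
print("estimated rho:", model.rho)
print(results.params)
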

52
Chapter 7 exercises
  • (assigned problems)