1
STAT 497 LECTURE NOTE 11
  • VAR MODELS AND GRANGER CAUSALITY

2
VECTOR TIME SERIES
  • A vector series consists of multiple single
    series.
  • Why do we need multiple series?
  • To be able to understand the relationship between
    several components
  • To be able to get better forecasts

3
VECTOR TIME SERIES
  • Price movements in one market can spread easily and instantly to another market. For this reason, financial markets are more dependent on each other than ever before, so we have to consider them jointly to better understand the dynamic structure of the global market. Knowing how markets are interrelated is of great importance in finance.
  • For an investor or a financial institution holding multiple assets, these interrelationships play an important role in decision making.

4
VECTOR TIME SERIES

5
VECTOR TIME SERIES

6
VECTOR TIME SERIES

7

8

9
VECTOR TIME SERIES
  • Consider an m-dimensional time series Yt = (Y1t, Y2t, ..., Ymt)'. The series Yt is weakly stationary if its first two moments are time invariant and the cross-covariance between Yit and Yjs, for all i and j, is a function of the time difference (s - t) only.

10
VECTOR TIME SERIES
  • The mean vector
  • The covariance matrix function
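The formulas on this slide are not reproduced in the transcript. As a sketch, the standard definitions (under the convention that lag k pairs Yt with Yt+k) are:

    \mu = E(\mathbf{Y}_t) = (\mu_1, \ldots, \mu_m)', \qquad
    \Gamma(k) = E\big[(\mathbf{Y}_t - \mu)(\mathbf{Y}_{t+k} - \mu)'\big],
    \quad \text{with } (i,j)\text{-th element } \gamma_{ij}(k) = \mathrm{Cov}(Y_{i,t}, Y_{j,t+k}).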

11
VECTOR TIME SERIES
  • The correlation matrix function
  • where D is a diagonal matrix in which the i-th
    diagonal element is the variance of the i-th
    process, i.e.
  • The covariance and correlation matrix functions
    are positive semi-definite.
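The correlation matrix formula itself is not shown in the transcript; a sketch of the standard definition, consistent with the matrix D described above, is:

    \rho(k) = D^{-1/2}\,\Gamma(k)\,D^{-1/2}, \qquad
    D = \mathrm{diag}\big(\gamma_{11}(0), \ldots, \gamma_{mm}(0)\big),
    \quad \text{so that } \rho_{ij}(k) = \frac{\gamma_{ij}(k)}{\sqrt{\gamma_{ii}(0)\,\gamma_{jj}(0)}}.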

12
VECTOR WHITE NOISE PROCESS
  • at ~ WN(0, Σ) iff at is stationary with mean vector 0 and covariance matrix function
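The condition itself is not reproduced in the transcript; presumably it is the standard white noise covariance structure:

    \Gamma_a(k) = E\big[\mathbf{a}_t \mathbf{a}_{t+k}'\big] =
    \begin{cases} \Sigma, & k = 0, \\ 0, & k \neq 0. \end{cases}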

13
VECTOR TIME SERIES
  • Yt is a linear process if it can be expressed as
  • where Ψj is a sequence of m×n matrices whose entries are absolutely summable, i.e.

14
VECTOR TIME SERIES
  • For a linear process, E(Yt) = 0 and

15
MA (WOLD) REPRESENTATION
  • For the process to be stationary, the Ψ weights should be square summable in the sense that each of the m×m sequences of coefficients {ψij,s} is square summable.

16
AR REPRESENTATION
  • For the process to be invertible, the Π weights (the coefficients of the AR representation) should be absolutely summable.

17
THE VECTOR AUTOREGRESSIVE MOVING AVERAGE (VARMA)
PROCESSES
  • VARMA(p,q) process
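The model equation is not reproduced in the transcript. A sketch of the standard VARMA(p,q) form, matching the Φp(B) and Θq(B) notation used on the next slide, is:

    \Phi_p(B)\,\mathbf{Y}_t = \Theta_q(B)\,\mathbf{a}_t, \qquad
    \Phi_p(B) = I - \Phi_1 B - \cdots - \Phi_p B^p, \quad
    \Theta_q(B) = I - \Theta_1 B - \cdots - \Theta_q B^q,

where at is vector white noise (a mean term can be included by replacing Yt with Yt - μ).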

18
VARMA PROCESS
  • A VARMA process is stationary if the zeros of |Φp(B)| are outside the unit circle.
  • A VARMA process is invertible if the zeros of |Θq(B)| are outside the unit circle.

19
IDENTIFIABILITY PROBLEM
  • Multiplying both sides of the model by an arbitrary matrix polynomial may give an identical covariance matrix function. So, the VARMA(p,q) model is not identifiable: we cannot uniquely determine p and q.

20
IDENTIFIABILITY PROBLEM
  • Example VARMA(1,1) process

MA(∞) representation = VMA(1)
21

22
IDENTIFIABILITY
  • To eliminate this problem, there are three methods suggested by Hannan (1969, 1970, 1976, 1979):
  • From each of the equivalent models, choose the minimum MA order q and AR order p. The resulting representation will be unique if Rank(Φp(B)) = m.
  • Represent Φp(B) in lower triangular form with suitable restrictions on the orders of the φij(B), i, j = 1, 2, ..., m; then the model is identifiable.
  • Represent Φp(B) in the form Φp(B) = φp(B)I, where φp(B) is a univariate AR(p) operator. The model is identifiable if φp ≠ 0.

23
VAR(1) PROCESS
  • Yi,t depends not only on the lagged values of Yit but also on the lagged values of the other variables.
  • Always invertible.
  • Stationary if the zeros of |I - ΦB| are outside the unit circle. Let λ = B⁻¹.

The zeros of |I - ΦB| are related to the eigenvalues of Φ.
24
VAR(1) PROCESS
  • Hence, the VAR(1) process is stationary if the eigenvalues of Φ, λi, i = 1, 2, ..., m, are all inside the unit circle.
  • The autocovariance matrix
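The autocovariance equations on this and the following two slides are not reproduced in the transcript. As a sketch, with E(Yt) = 0, Yt = Φ Yt-1 + at, and Γ(k) = E[Yt Yt+k'], the standard relations are:

    \Gamma(k) = \Gamma(k-1)\,\Phi' \quad (k \ge 1), \qquad
    \Gamma(0) = \Phi\,\Gamma(0)\,\Phi' + \Sigma,

so that Γ(k) = Γ(0)(Φ')^k for k ≥ 1, and Γ(0) can be solved from the second equation (e.g. by vectorization).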

25
VAR(1) PROCESS
  • For k ≥ 1,

26
VAR(1) PROCESS
  • Then,

27
VAR(1) PROCESS
  • Example

The process is stationary.
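The example's coefficient matrix is not reproduced in the transcript. As an illustration of the eigenvalue condition above, here is a minimal sketch in Python; the matrix Phi below is a made-up stand-in, not the one from the slide.

```python
import numpy as np

# Hypothetical VAR(1) coefficient matrix; replace Phi with the matrix
# from the actual example.
Phi = np.array([[0.5, 0.3],
                [0.2, 0.4]])

# Y_t = Phi Y_{t-1} + a_t is stationary when all eigenvalues of Phi
# lie strictly inside the unit circle.
eigvals = np.linalg.eigvals(Phi)
print("eigenvalues:", eigvals)                  # 0.7 and 0.2 for this Phi
print("stationary :", bool(np.all(np.abs(eigvals) < 1)))
```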
28
VMA(1) PROCESS
  • Always stationary.
  • The autocovariance function
  • The autocovariance matrix function cuts off after lag 1.

29
VMA(1) PROCESS
  • Hence, the VMA(1) process is invertible if the eigenvalues of Θ, λi, i = 1, 2, ..., m, are all inside the unit circle.

30
IDENTIFICATION OF VARMA PROCESSES
  • Same as univariate case.
  • SAMPLE CORRELATION MATRIX FUNCTION: Given a vector series of n observations, the sample correlation matrix function is
  • where the ρ̂ij(k)'s are the sample cross-correlations for the i-th and j-th component series.
  • It is very useful to identify VMA(q).

31
SAMPLE CORRELATION MATRIX FUNCTION
  • Tiao and Box (1981) proposed using +, - and . signs to summarize the significance of the sample cross-correlations:
  • + sign: the value is greater than 2 times the estimated standard error
  • - sign: the value is less than -2 times the estimated standard error
  • . sign: the value is within 2 times the estimated standard error
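A minimal sketch of how the sample cross-correlation matrices and this sign pattern might be computed; the function names are made up for illustration, and the rough two-standard-error limit 2/sqrt(n) is assumed for the cross-correlations of a white noise series.

```python
import numpy as np

def sample_cross_corr(Y, k):
    """Sample correlation matrix at lag k for an (n x m) series Y;
    element (i, j) estimates corr(Y_{i,t}, Y_{j,t+k})."""
    n, _ = Y.shape
    Yc = Y - Y.mean(axis=0)
    gamma_k = Yc[:n - k].T @ Yc[k:] / n      # sample cross-covariances at lag k
    gamma_0 = Yc.T @ Yc / n                  # lag-0 covariances (variances on diagonal)
    d = np.sqrt(np.diag(gamma_0))
    return gamma_k / np.outer(d, d)

def tiao_box_signs(Y, k):
    """Mark each sample cross-correlation with +, - or . using the rough
    two-standard-error limit 2/sqrt(n)."""
    n = Y.shape[0]
    rho = sample_cross_corr(Y, k)
    limit = 2.0 / np.sqrt(n)
    return np.where(rho > limit, "+", np.where(rho < -limit, "-", "."))

# Example with simulated white noise, so mostly "." entries are expected:
rng = np.random.default_rng(0)
Y = rng.standard_normal((200, 3))
for k in range(1, 4):
    print(f"lag {k}:\n{tiao_box_signs(Y, k)}")
```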

32
PARTIAL AUTOREGRESSION OR PARTIAL LAG CORRELATION
MATRIX FUNCTION
  • They are useful for identifying the VAR order. The partial autoregression matrix function was proposed by Tiao and Box (1981), but it is not a proper correlation coefficient. Heyse and Wei (1985) then proposed the partial lag correlation matrix function, which is a proper correlation coefficient. Both can be used to identify VARMA(p,q) models.

33
GRANGER CAUSALITY
  • In time series analysis, we sometimes want to know whether changes in one variable have an impact on changes in other variables.
  • To examine this question more precisely, we use the Granger causality test.

34
GRANGER CAUSALITY
  • In principle, the concept is as follows:
  • If X causes Y, then changes in X happen first and are then followed by changes in Y.

35
GRANGER CAUSALITY
  • If X causes Y, there are two conditions to be satisfied:
  • 1. X helps in predicting Y: adding past values of X to a regression of Y on its own past noticeably increases the R².
  • 2. Y cannot help in predicting X.

36
GRANGER CAUSALITY
  • In most regressions, it is very hard to discuss causality. For instance, the significance of the coefficient β in the regression
  • only tells us about the co-occurrence of x and y, not that x causes y.
  • In other words, the regression usually only tells us that there is some relationship between x and y; it does not tell us the nature of the relationship, such as whether x causes y or y causes x.

37
GRANGER CAUSALITY
  • One good feature of time series vector autoregressions is that they allow us to test causality in some sense. This test was first proposed by Granger (1969), and we therefore refer to it as Granger causality.
  • We will restrict our discussion to a system of two variables, x and y. y is said to Granger-cause x if current or lagged values of y help to predict future values of x. On the other hand, y fails to Granger-cause x if, for all s > 0, the mean squared error of a forecast of xt+s based on (xt, xt-1, . . .) is the same as that of a forecast based on (yt, yt-1, . . .) and (xt, xt-1, . . .).

38
GRANGER CAUSALITY
  • If we restrict ourselves to linear functions, y fails to Granger-cause x if the mean-squared-error condition sketched below holds.
  • Equivalently, we can say that x is exogenous in the time series sense with respect to y, or that y is not linearly informative about future x.
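Written out from the verbal definition on the previous slide, the condition is (with Ê denoting the linear forecast):

    \mathrm{MSE}\big[\hat{E}(x_{t+s}\mid x_t, x_{t-1}, \ldots)\big]
    = \mathrm{MSE}\big[\hat{E}(x_{t+s}\mid x_t, x_{t-1}, \ldots, y_t, y_{t-1}, \ldots)\big]
    \quad \text{for all } s > 0.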

39
GRANGER CAUSALITY
  • A variable X is said to Granger-cause another variable Y if Y can be better predicted from the past of X and Y together than from the past of Y alone, any other relevant information being used in the prediction (Pierce, 1977).

40
GRANGER CAUSALITY
  • In the VAR equation, the example we proposed
    above implies a lower triangular coefficient
    matrix
  • Or if we use MA representations,

41
GRANGER CAUSALITY
  • Consider a linear projection of yt on past, present and future x's, as sketched below,
  • where E(et xτ) = 0 for all t and τ. Then y fails to Granger-cause x iff dj = 0 for j = 1, 2, . . ..
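The projection itself is not reproduced in the transcript; a sketch of the form it presumably takes (as in Sims' formulation), with the dj the coefficients on future x's, is:

    y_t = c + \sum_{j=0}^{\infty} b_j\, x_{t-j} + \sum_{j=1}^{\infty} d_j\, x_{t+j} + e_t,
    \qquad E(e_t\, x_\tau) = 0 \ \text{for all } t, \tau.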

42
TESTING GRANGER CAUSALITY
  • Procedure:
  • 1) Check that both series are stationary in mean, variance and covariance (if necessary, transform the data via logs or differences to ensure this).
  • 2) Estimate AR(p) models for each series, where p is large enough to ensure white noise residuals. F tests and other criteria (e.g. Schwarz or Akaike) can be used to establish the maximum lag p that is needed.
  • 3) Re-estimate both models, now including all the lags of the other variable.
  • 4) Use F tests to determine whether, after controlling for past Y, past values of X can improve forecasts of Y (and vice versa). A code sketch of this procedure follows the list.
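A minimal sketch of this procedure in Python using statsmodels; the simulated series and parameter values are made up for illustration, and with real data one would first verify stationarity as in step 1.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Simulated example: x drives y with a one-period lag, so we expect to detect
# "x Granger-causes y" but not the reverse. Both series are stationary AR
# processes, so step 1 is satisfied by construction.
rng = np.random.default_rng(1)
n = 300
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.3 * y[t - 1] + 0.6 * x[t - 1] + rng.standard_normal()

data = pd.DataFrame({"y": y, "x": x})

# grangercausalitytests tests H0: the series in the SECOND column does not
# Granger-cause the series in the first column, reporting an F test per lag.
res_x_to_y = grangercausalitytests(data[["y", "x"]], maxlag=4)
res_y_to_x = grangercausalitytests(data[["x", "y"]], maxlag=4)

# e.g. the F test with 2 lags for "x does not Granger-cause y":
f_stat, p_value, _, _ = res_x_to_y[2][0]["ssr_ftest"]
print(f"x -> y, 2 lags: F = {f_stat:.2f}, p = {p_value:.4f}")
```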

43
TEST OUTCOMES
  • 1. X Granger causes Y but Y does not Granger
    cause X
  • 2. Y Granger causes X but X does not Granger
    cause Y
  • 3. X Granger causes Y and Y Granger causes X
    (i.e., there is a feedback system)
  • 4. X does not Granger cause Y and Y does not
    Granger cause X

44
TESTING GRANGER CAUSALITY
  • The simplest test is to estimate, by OLS, a regression of the form sketched below,
  • and then conduct an F-test of the null hypothesis
  • H0: β1 = β2 = . . . = βp = 0.
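The regression is not reproduced in the transcript; one standard way to write it, for testing whether X Granger-causes Y using p lags, is:

    Y_t = c + \alpha_1 Y_{t-1} + \cdots + \alpha_p Y_{t-p}
            + \beta_1 X_{t-1} + \cdots + \beta_p X_{t-p} + u_t,
    \qquad H_0: \beta_1 = \beta_2 = \cdots = \beta_p = 0.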

45
TESTING GRANGER CAUSALITY
  • 2. Run the following regression and calculate its RSS (full model).
  • 3. Run the following restricted regression and calculate its RSS (restricted model).

46
TESTING GRANGER CAUSALITY
  • 4. Do the following F-test using the RSS obtained in steps 2 and 3:
  • F = [(RSSrestricted - RSSfull) / q] / [RSSfull / (n - k)]
  • n = number of observations
  • k = number of parameters in the full model
  • q = number of restrictions (the number of parameters dropped from the full model to obtain the restricted model)
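A minimal sketch of steps 2-4 in Python; the helper granger_f_test is hypothetical, written directly from the RSS formula above, and note that the effective sample size is n - p after losing the first p observations.

```python
import numpy as np
from scipy import stats

def granger_f_test(y, x, p):
    """F test of H0: the p lags of x add nothing to an AR(p) model for y,
    built from the restricted and full RSS as in the formula above."""
    n = len(y)
    Y = y[p:]                                            # observations t = p+1, ..., n
    lags_y = np.column_stack([y[p - j:n - j] for j in range(1, p + 1)])
    lags_x = np.column_stack([x[p - j:n - j] for j in range(1, p + 1)])
    const = np.ones(len(Y))

    X_full = np.column_stack([const, lags_y, lags_x])    # k = 1 + 2p parameters
    X_restr = np.column_stack([const, lags_y])           # q = p restrictions

    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_full, rss_restr = rss(X_full), rss(X_restr)

    n_eff, k, q = len(Y), X_full.shape[1], p             # n_eff = n - p usable observations
    F = ((rss_restr - rss_full) / q) / (rss_full / (n_eff - k))
    p_value = stats.f.sf(F, q, n_eff - k)
    return F, p_value
```

Rejecting H0 (a small p-value) is evidence that x Granger-causes y; swapping the roles of x and y tests the opposite direction.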

47
TESTING GRANGER CAUSALITY
  • 5. If H0 is rejected, then X causes Y.
  • The same technique can be used to investigate whether Y causes X.

48
Example of the Usage of Granger Test
  • World Oil Price and Growth of the US Economy
  • Does an increase in the world oil price influence the growth of the US economy, or does the growth of the US economy affect the world oil price?
  • James Hamilton studied this question using the following model:
  • Zt = a0 + a1 Zt-1 + ... + am Zt-m + b1 Xt-1 + ... + bm Xt-m + et
  • Zt = ΔPt = change in the world price of oil
  • Xt = log(GNPt / GNPt-1)

49
World Oil Price and Growth of US Economy
  • There are two causal hypotheses that need to be examined:
  • (i) H0: Growth of the US economy does not influence the world oil price
  • Full:
  • Zt = a0 + a1 Zt-1 + ... + am Zt-m + b1 Xt-1 + ... + bm Xt-m + et
  • Restricted:
  • Zt = a0 + a1 Zt-1 + ... + am Zt-m + et

50
World Oil Price and Growth of US Economy
  • (ii) H0: World oil price does not influence the growth of the US economy
  • Full:
  • Xt = a0 + a1 Xt-1 + ... + am Xt-m + b1 Zt-1 + ... + bm Zt-m + et
  • Restricted:
  • Xt = a0 + a1 Xt-1 + ... + am Xt-m + et

51
World Oil Price and Growth of US Economy
  • F-Test Results
  • 1. The hypothesis that the world oil price does not influence the US economy is rejected. This means that the world oil price does influence the US economy.
  • 2. The hypothesis that the US economy does not affect the world oil price is not rejected. This means there is no evidence that the US economy affects the world oil price.

52
World Oil Price and Growth of US Economy
  • Summary of James Hamilton's Results

Null hypothesis (H0)                                   (I) F(4,86)   (II) F(8,74)
I.  Economic growth does not cause world oil price        0.58           0.71
II. World oil price does not cause economic growth        5.55           3.28
53
World Oil Price and Growth of US Economy
  • Remark: The first experiment used data for 1949-1972 (95 observations) and m = 4, while the second experiment used data for 1950-1972 (91 observations) and m = 8.

54
Chicken vs. Egg
  • This causality test can also be used to address which comes first: the chicken or the egg. More specifically, the test can be used to examine whether the existence of eggs causes the existence of chickens, or vice versa.
  • Thurman and Fisher did this study using yearly data on chicken populations and egg production in the US from 1930 to 1983.
  • The results:
  • 1. Eggs cause chickens.
  • 2. There is no evidence that chickens cause eggs.

55
Chicken vs. Egg
  • Remark: The hypothesis that eggs have no effect on the chicken population is rejected, while the hypothesis that chickens have no effect on eggs is not rejected. Why?

56
GRANGER CAUSALITY
  • We have to be aware that Granger causality is not the same as what we usually mean by causality. For instance, even if x1 does not cause x2, it may still help to predict x2, and thus Granger-cause x2, if changes in x1 precede changes in x2 for some reason.
  • A simple example: we observe that dragonflies fly much lower before a rainstorm, due to the lower air pressure. We know that dragonflies do not cause a rainstorm, but their behavior does help to predict one, and thus Granger-causes the rainstorm.