1
Univariate Time Series
2
Concerned with the time series properties of a single
series
  • Denote y_t to be the observed value in period t
  • Observations run from 1 to T
  • Very likely that observations at different points
    in time are correlated, as economic time series
    change only slowly

3
Stationarity
  • Properties of estimators depend on whether series
    is stationary or not
  • A time series y_t is stationary if its probability
    density function does not depend on time, i.e. the
    joint pdf of (y_s, y_{s+1}, y_{s+2}, ..., y_{s+t})
    does not depend on s
  • Implies
  • E(y_t) does not depend on t
  • Var(y_t) does not depend on t
  • Cov(y_t, y_{t-s}) depends on s and not on t

4
Weak Stationarity
  • A series is weakly stationary if its first and
    second moments do not depend on t
  • Stationarity implies weak stationarity, but the
    converse is not necessarily true
  • The converse will hold in most of the cases you
    will see
  • I will focus on weak stationarity when
    demonstrating stationarity

5
Simplest Stationary Process
  • The simplest stationary process is y_t = a_0 + e_t,
    where e_t is white noise: iid with mean 0 and
    variance σ²
  • Simple to check that
  • E(y_t) = a_0
  • Var(y_t) = σ²
  • Cov(y_t, y_{t-s}) = 0 for s ≠ 0
  • Implies y_t is white noise around a constant,
    which is unlikely for most economic time series

6
First-Order Autoregressive Process AR(1)
  • y_t = a_0 + a_1 y_{t-1} + e_t, where e_t is white
    noise with mean 0 and variance σ²
7
When is AR(1) stationary?
  • Take expectations: E(y_t) = a_0 + a_1 E(y_{t-1})
  • If stationary then E(y_t) = E(y_{t-1}) = μ, so we
    can write this as μ = a_0 / (1 - a_1)
  • Only makes sense if a_1 < 1

8
Look at Variance
  • If stationary then Var(y_t) = Var(y_{t-1}), and
    Var(y_t) = a_1² Var(y_{t-1}) + σ², so
    Var(y_t) = σ² / (1 - a_1²)
  • Only makes sense if |a_1| < 1; this is the
    condition for stationarity of the AR(1) process
  • If a_1 = 1 then this is a random walk (without
    drift if a_0 = 0, with drift otherwise) and the
    variance grows over time
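A minimal simulation sketch in Python (not part of the
original slides; numpy and the parameter values are my own
choices) illustrating the condition: with |a_1| < 1 the
variance of the simulated series settles down, while with
a_1 = 1 (a random walk) it keeps growing.

    import numpy as np

    def simulate_ar1(a0, a1, T, sigma=1.0, seed=0):
        """Simulate y_t = a0 + a1*y_{t-1} + e_t with e_t ~ N(0, sigma^2)."""
        rng = np.random.default_rng(seed)
        e = rng.normal(0.0, sigma, T)
        y = np.zeros(T)
        for t in range(1, T):
            y[t] = a0 + a1 * y[t - 1] + e[t]
        return y

    # Stationary case (|a1| < 1): the variance of later stretches stays roughly constant.
    y_stat = simulate_ar1(a0=1.0, a1=0.5, T=10_000)
    # Unit-root case (a1 = 1, a0 = 0): a random walk whose variance grows with t.
    y_rw = simulate_ar1(a0=0.0, a1=1.0, T=10_000)

    print("stationary: var of first / second half:",
          y_stat[:5000].var(), y_stat[5000:].var())
    print("random walk: var of first / second half:",
          y_rw[:5000].var(), y_rw[5000:].var())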

9
What about covariances?
  • Define γ_s = Cov(y_t, y_{t-s}) and μ = E(y_t)
  • Then can write the model in deviations from the
    mean as (y_t - μ) = a_1 (y_{t-1} - μ) + e_t, which
    gives γ_1 = a_1 γ_0

10
  • Or can write this in terms of the correlation
    coefficient: ρ_1 = γ_1 / γ_0 = a_1

11
Higher-Order Covariances
  • Repeating the argument gives γ_s = a_1 γ_{s-1}, so
    γ_s = a_1^s γ_0
  • Or, in terms of the correlation coefficient,
    ρ_s = a_1^s
  • The general rule is that the autocorrelations of a
    stationary AR(1) decline geometrically with the lag
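A quick numerical check of the rule ρ_s = a_1^s, again a
hedged sketch using numpy with an arbitrary a_1 = 0.7:

    import numpy as np

    rng = np.random.default_rng(1)
    a1, T = 0.7, 200_000          # arbitrary illustrative values
    e = rng.normal(size=T)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = a1 * y[t - 1] + e[t]

    for s in range(1, 5):
        sample_rho = np.corrcoef(y[s:], y[:-s])[0, 1]
        print(f"lag {s}: sample rho = {sample_rho:.3f}, theory a1^s = {a1**s:.3f}")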

12
More General Auto-Regressive Processes
  • The AR(p) can be written as
    y_t = a_0 + a_1 y_{t-1} + ... + a_p y_{t-p} + e_t
  • This is stationary if the roots of the associated
    p-th order polynomial are all inside the unit
    circle
  • A necessary condition is a_1 + a_2 + ... + a_p < 1

13
Why is this condition necessary?
  • Think of taking expectations:
    E(y_t) = a_0 + a_1 E(y_{t-1}) + ... + a_p E(y_{t-p})
  • If stationary, the expectations must all be equal,
    so μ = a_0 / (1 - a_1 - ... - a_p)
  • Only makes sense if the coefficients sum to less
    than 1

14
Moving-Average Processes
  • The most common alternative to an AR process is the
    MA(1), which can be written as
    y_t = μ + e_t + θ_1 e_{t-1}
  • An MA process will always be stationary; can see
    weak stationarity from the moments on the next
    slide

15
Stationarity of MA Process
  • E(y_t) = μ, Var(y_t) = (1 + θ_1²) σ², and
    Cov(y_t, y_{t-1}) = θ_1 σ²
  • And all further covariances are zero:
    Cov(y_t, y_{t-s}) = 0 for s > 1

16
MA(q) Process
  • The MA(q),
    y_t = μ + e_t + θ_1 e_{t-1} + ... + θ_q e_{t-q},
    will always be stationary
  • Covariances between two observations are zero if
    they are more than q periods apart
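A sketch of this cut-off in the autocovariances (an
illustrative MA(2) with coefficients 0.6 and 0.3, chosen
arbitrarily): lags beyond q = 2 are approximately zero.

    import numpy as np

    rng = np.random.default_rng(2)
    theta = [0.6, 0.3]            # illustrative MA(2) coefficients
    T = 200_000
    e = rng.normal(size=T + 2)
    # y_t = e_t + theta_1 e_{t-1} + theta_2 e_{t-2}
    y = e[2:] + theta[0] * e[1:-1] + theta[1] * e[:-2]

    for s in range(1, 6):
        cov = np.cov(y[s:], y[:-s])[0, 1]
        print(f"lag {s}: sample autocovariance = {cov:.3f}")
    # Lags 1 and 2 are non-zero; lags 3 and beyond are close to zero.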

17
Relationship between AR and MA Processes
  • Might seem unrelated, but there is a connection
    between the two
  • Consider the AR(1) with a_0 = 0:
    y_t = a_1 y_{t-1} + e_t
  • Substitute for y_{t-1} to give
    y_t = a_1² y_{t-2} + a_1 e_{t-1} + e_t

18
Do Repeated Substitution
  • The AR(1) can be written as an MA(∞) with
    geometrically declining weights:
    y_t = e_t + a_1 e_{t-1} + a_1² e_{t-2} + ...
  • Need stationarity (|a_1| < 1) for the final term,
    a_1^k y_{t-k}, to disappear as k grows

19
A quicker way to get this: the lag operator
  • Define the lag operator L by L y_t = y_{t-1}
  • Can write the AR(1) as (1 - a_1 L) y_t = e_t, so
    y_t = e_t / (1 - a_1 L)

20
  • Should recognize 1 / (1 - a_1 L) as the sum of a
    geometric series, so
    y_t = (1 + a_1 L + a_1² L² + ...) e_t
        = e_t + a_1 e_{t-1} + a_1² e_{t-2} + ...
  • Which is the same as we had by repeated
    substitution

21
For a general AR(p) process
  • Write the AR(p) as a(L) y_t = e_t, where
    a(L) = 1 - a_1 L - ... - a_p L^p
  • If a(L) is invertible we have y_t = a(L)⁻¹ e_t
  • So an invertible AR(p) can be written as a
    particular MA(∞)

22
From an MA to an AR Process
  • Can use the lag operator to write the MA(q) as
    y_t = θ(L) e_t, where
    θ(L) = 1 + θ_1 L + ... + θ_q L^q
  • If θ(L) is invertible, then θ(L)⁻¹ y_t = e_t
  • So the MA(q) can be written as a particular AR(∞)

23
ARMA Processes
  • A time series might have both AR and MA components
  • The ARMA(p,q) can be written as
    a(L) y_t = θ(L) e_t

24
Estimation of AR Models
  • A simple-minded approach would be to run a
    regression of y on p lags of y and use the OLS
    estimates
  • Let's consider the properties of this estimator;
    simplest to start with the AR(1)
  • Assume y_0 is available
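A sketch of this simple-minded OLS approach for an AR(1),
using numpy's least-squares routine; the true values
a_0 = 0.5 and a_1 = 0.8 are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(3)
    a0, a1, T = 0.5, 0.8, 500     # illustrative true values
    e = rng.normal(size=T)
    y = np.zeros(T)
    y[0] = a0 / (1 - a1)          # start at the unconditional mean
    for t in range(1, T):
        y[t] = a0 + a1 * y[t - 1] + e[t]

    # Regress y_t on a constant and y_{t-1} by OLS.
    X = np.column_stack([np.ones(T - 1), y[:-1]])
    coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    print("OLS estimates (a0, a1):", coef)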

25
The OLS estimator of the AR Coefficient
  • Ignoring the constant, the OLS estimator can be
    written as
    â_1 = a_1 + (Σ_t y_{t-1} e_t) / (Σ_t y_{t-1}²)
  • Want to answer questions about bias, consistency,
    variance, etc.
  • Have to regard the regressor as stochastic, as it
    is a lagged value of the dependent variable

26
Bias in OLS Estimate
  • Can't derive an explicit expression for the bias
  • But the OLS estimate is biased, and the bias is
    negative
  • Different from the case where x is independent of
    every e
  • The easiest way to see the problem is to consider
    the expectation of the numerator in the second term

27
  • The first part has expectation zero, but the second
    part can be written in terms of covariances between
    y and the e's
  • These terms are not zero, as y_t can be written as
    a function of lagged e_t
  • All these correlations are positive, so this term
    will be positive and the bias will be negative
  • This bias is often called Hurwicz bias and can be
    sizeable in small samples
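A small Monte Carlo sketch of the Hurwicz bias (not from
the slides; the sample size T = 25 and true a_1 = 0.8 are
chosen only to make the effect visible): with a short
sample the average OLS estimate falls noticeably below
the true value.

    import numpy as np

    rng = np.random.default_rng(4)
    a1_true, T, reps = 0.8, 25, 5000   # small T makes the bias visible

    estimates = np.empty(reps)
    for r in range(reps):
        e = rng.normal(size=T)
        y = np.zeros(T)
        for t in range(1, T):
            y[t] = a1_true * y[t - 1] + e[t]
        X = np.column_stack([np.ones(T - 1), y[:-1]])
        coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
        estimates[r] = coef[1]

    print("true a1:", a1_true)
    print("mean OLS estimate:", estimates.mean())   # noticeably below 0.8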

28
But...
  • Hurwicz bias goes to zero as T → ∞
  • Can show that the OLS estimate is consistent
  • What about the asymptotic distribution of the OLS
    estimator?
  • Depends on whether y_t is stationary or not

29
If the time series is stationary
  • The OLS estimator is asymptotically normal with the
    usual formulae for the asymptotic variance, e.g.
    for the AR(1), √T (â_1 - a_1) →d N(0, 1 - a_1²)
  • Can write this as â_1 being approximately
    N(a_1, (1 - a_1²) / T) in large samples

30
The Initial Conditions Problem
  • Estimating an AR(p) model by OLS does not use the
    information contained in the first p observations
  • There is a loss of efficiency from this
  • There are a number of methods for using this
    information; will describe the ML method for the
    AR(1)

31
ML Estimation of AR(1) Process
  • Need to write down the likelihood function: the
    probability of the outcome given the parameters
  • Can always factorize the joint density as
    f(y_0, y_1, ..., y_T)
      = f(y_0) f(y_1 | y_0) ... f(y_T | y_{T-1}, ..., y_0)
  • With the AR(1), only the first lag is any use:
    f(y_t | y_{t-1}, ..., y_0) = f(y_t | y_{t-1})

32
Assume e_t has a normal distribution
  • Then y_t, for t ≥ 1 and conditional on y_{t-1}, is
    normally distributed with mean a_0 + a_1 y_{t-1}
    and variance σ²
  • y_0 is normally distributed with mean
    a_0 / (1 - a_1) and variance σ² / (1 - a_1²)
  • Hence the likelihood function can be written as the
    product of these normal densities
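A sketch of exact ML for the AR(1) along these lines,
assuming |a_1| < 1 so the stationary density of y_0
exists; it uses scipy's general-purpose minimizer rather
than any particular package routine, and the simulated
data and starting values are only illustrative.

    import numpy as np
    from scipy.optimize import minimize

    def neg_loglik(params, y):
        """Negative exact Gaussian log-likelihood of an AR(1), including the y_0 term."""
        a0, a1, log_sigma = params
        sigma2 = np.exp(log_sigma) ** 2
        if abs(a1) >= 1:
            return np.inf                  # outside the stationary region
        # Stationary distribution of y_0: mean a0/(1 - a1), variance sigma^2/(1 - a1^2).
        m0 = a0 / (1 - a1)
        v0 = sigma2 / (1 - a1 ** 2)
        ll = -0.5 * (np.log(2 * np.pi * v0) + (y[0] - m0) ** 2 / v0)
        # Conditional densities of y_1, ..., y_T given the previous observation.
        resid = y[1:] - a0 - a1 * y[:-1]
        ll += -0.5 * np.sum(np.log(2 * np.pi * sigma2) + resid ** 2 / sigma2)
        return -ll

    # Illustrative simulated data (true values a0 = 0.5, a1 = 0.8, sigma = 1).
    rng = np.random.default_rng(5)
    y = np.zeros(300)
    for t in range(1, 300):
        y[t] = 0.5 + 0.8 * y[t - 1] + rng.normal()

    res = minimize(neg_loglik, x0=[0.0, 0.1, 0.0], args=(y,), method="Nelder-Mead")
    print("ML estimates (a0, a1, sigma):",
          res.x[0], res.x[1], np.exp(res.x[2]))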

33
Comparison of ML and OLS Estimators
  • Maximization of the conditional part of the
    likelihood leads to the OLS estimate (you should
    know this)
  • The initial condition will cause some deviation
    from the OLS estimate
  • The effect is likely to be small if T is reasonably
    large

34
Estimation of MA Processes
  • The MA(1) process looks simple, but estimation is
    surprisingly complicated
  • To do it properly requires the Kalman filter
  • A 'dirty' method assumes e_0 = 0
  • Then repeated iteration leads to
    e_t = y_t - μ - θ_1 e_{t-1} for t = 1, ..., T

35
Can then end up with...
  • A likelihood (or sum of squares) in terms of the
    computed residuals, and maximize (or minimize) with
    respect to the parameters
  • Packages like STATA and EVIEWS have modules for
    estimating MA processes
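A sketch of the 'dirty' conditional approach for an MA(1):
set the pre-sample error to zero, back out the e_t
recursively, and minimize their sum of squares. The
function name css and the parameter values are my own;
the exact-likelihood treatment would use the Kalman
filter, as noted above.

    import numpy as np
    from scipy.optimize import minimize

    def css(params, y):
        """Conditional sum of squares for y_t = mu + e_t + theta*e_{t-1}, pre-sample e = 0."""
        mu, theta = params
        e = np.zeros(len(y))
        for t in range(len(y)):
            e_prev = e[t - 1] if t > 0 else 0.0   # assume the pre-sample error is zero
            e[t] = y[t] - mu - theta * e_prev
        return np.sum(e ** 2)

    # Simulated MA(1) data (illustrative values mu = 1, theta = 0.5).
    rng = np.random.default_rng(6)
    eps = rng.normal(size=1001)
    y = 1.0 + eps[1:] + 0.5 * eps[:-1]

    res = minimize(css, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
    print("estimates (mu, theta):", res.x)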

36
Deterministic Trends
  • The restriction to stationary processes is very
    limiting, as many economic time series have clear
    trends in them
  • But the results can be modified if the trend is
    deterministic, as the series is then stationary
    about the trend

37
Non-Stationary Series
  • Will focus attention on the random walk
    y_t = a_0 + y_{t-1} + e_t
  • Note that conditional on some initial value y_0 we
    have y_t = y_0 + a_0 t + e_1 + e_2 + ... + e_t, so
    E(y_t | y_0) = y_0 + a_0 t and Var(y_t | y_0) = t σ²

38
Terminologies
  • These formulae should make clear the
    non-stationarity of the random walk
  • Different terminologies are used to describe
    non-stationary series:
  • y_t is a random walk
  • y_t has a stochastic trend
  • y_t is I(1), integrated of order 1: Δy_t is
    stationary
  • y_t has a unit root
  • All mean the same thing

39
Problems Caused by Unit Roots - Bias
  • Autoregressive coefficient estimates are biased
    towards zero
  • Same problem as for the stationary AR process, but
    the problem is bigger
  • But the bias goes to zero as T → ∞, so the
    estimator is consistent

40
Problems Caused by Unit Roots: Non-Normal
Asymptotic Distribution
  • The asymptotic distribution of the OLS estimator is
  • Non-normal
  • Shifted to the left of the true value (1)
  • Long left tail
  • Convergence is at rate T rather than √T
  • Cannot assume the t-statistic has a normal
    distribution in large samples
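A Monte Carlo sketch of this point (not from the slides;
T = 200 and 5,000 replications are arbitrary choices):
generate pure random walks, estimate the AR(1) coefficient
by OLS, and look at the distribution of the estimates.

    import numpy as np

    rng = np.random.default_rng(7)
    T, reps = 200, 5000
    a1_hat = np.empty(reps)
    for r in range(reps):
        y = np.cumsum(rng.normal(size=T))          # pure random walk (a1 = 1)
        X = np.column_stack([np.ones(T - 1), y[:-1]])
        coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
        a1_hat[r] = coef[1]

    print("mean estimate:", a1_hat.mean())         # typically a little below 1
    print("5th / 50th / 95th percentiles:",
          np.percentile(a1_hat, [5, 50, 95]))
    # The distribution is asymmetric, with a long left tail and a median below 1.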

41
Testing for a Unit Root: the Basics
  • Interested in H0: a_1 = 1 against H1: a_1 < 1
  • Use the t-statistic, but do not use the normal
    tables for confidence intervals etc.
  • Have to use the Dickey-Fuller tables; hence this is
    called the Dickey-Fuller test

42
Implementing the Dickey-Fuller Test
  • Rewrite the model as
    Δy_t = β_0 + β_1 y_{t-1} + e_t, where β_1 = a_1 - 1
  • Want to test H0: β_1 = 0 against H1: β_1 < 0
  • Estimate by OLS and form the t-statistic in the
    usual way
  • But use the t-statistic in a different way
  • Interested in a one-tail test
  • The distribution is not normal
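A sketch of the mechanics: run the OLS regression of Δy_t
on a constant and y_{t-1} and form the t-statistic on
y_{t-1}. The helper name df_tstat is my own, and the
resulting statistic must be compared with Dickey-Fuller,
not normal, critical values.

    import numpy as np

    def df_tstat(y):
        """t-statistic on y_{t-1} in the regression dy_t = b0 + b1*y_{t-1} + e_t."""
        dy = np.diff(y)
        X = np.column_stack([np.ones(len(dy)), y[:-1]])
        beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
        resid = dy - X @ beta
        s2 = resid @ resid / (len(dy) - X.shape[1])
        var_beta = s2 * np.linalg.inv(X.T @ X)
        return beta[1] / np.sqrt(var_beta[1, 1])

    rng = np.random.default_rng(8)
    y_rw = np.cumsum(rng.normal(size=500))   # random walk: typically do not reject
    print("DF t-statistic (random walk):", df_tstat(y_rw))
    # Compare with the Dickey-Fuller critical values, not the normal tables.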

43
Critical Values for the Dickey-Fuller Test
  • Typically larger in absolute value than for the
    normal distribution (long left tail)
  • Critical values differ according to
  • the sample size (typically small-sample critical
    values are based on the assumption of normally
    distributed errors)
  • whether a constant is included in the regression or
    not
  • whether a trend is included in the regression or
    not
  • the order of the AR process that is being estimated
  • Reflects the fact that the distribution of the
    t-statistic varies with all these things
  • The most common cases have been worked out and
    tabulated, and are often embedded in packages like
    STATA and EVIEWS
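For example, the statsmodels package in Python provides an
ADF test with the appropriate critical values; a minimal
sketch, assuming statsmodels is installed and using an
illustrative simulated random-walk series.

    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(9)
    y = np.cumsum(rng.normal(size=500))     # illustrative random-walk series

    # regression="c" includes a constant; "ct" would add a trend as well.
    stat, pvalue, usedlag, nobs, crit, _ = adfuller(y, regression="c", autolag="AIC")
    print("ADF statistic:", stat)
    print("p-value:", pvalue)
    print("critical values:", crit)         # keyed by '1%', '5%', '10%'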

44
Some Examples of Large-Sample Critical Values for
DF Test
45
The Augmented Dickey-Fuller Test
  • Critical values for the unit root test depend on
    the order of the AR process
  • The DF test for an AR(p) is often called the ADF
    test
  • The model is the AR(p),
    y_t = a_0 + a_1 y_{t-1} + ... + a_p y_{t-p} + e_t,
    and the hypothesis is
    H0: a_1 + ... + a_p = 1 against H1: a_1 + ... + a_p < 1

46
The easiest way to implement
  • Can always re-write the AR(p) process as
    Δy_t = β_0 + β_1 y_{t-1}
            + c_1 Δy_{t-1} + ... + c_{p-1} Δy_{t-p+1} + e_t,
    where β_1 = a_1 + ... + a_p - 1
  • i.e. a regression of Δy_t on (p-1) lags of Δy_t and
    y_{t-1}
  • Test the hypothesis that the coefficient on y_{t-1}
    is zero against the alternative that it is less
    than zero

47
See this for AR(2)
  • y_t = a_0 + a_1 y_{t-1} + a_2 y_{t-2} + e_t can be
    rearranged as
    Δy_t = a_0 + (a_1 + a_2 - 1) y_{t-1} - a_2 Δy_{t-1} + e_t
  • The coefficient on y_{t-1} is zero exactly when
    a_1 + a_2 = 1, i.e. when there is a unit root
48
ARIMA Processes
  • One implication of the above is that an AR(p)
    process with a unit root can always be written as
    an AR(p-1) process in differences
  • Such processes are often called auto-regressive
    integrated moving average processes, ARIMA(p,d,q),
    where d refers to how many times the data must be
    differenced before they are an ARMA(p,q)

49
Some Caution about Unit Root Tests
  • The unit root test has the null hypothesis that the
    series is non-stationary
  • This is because a null of stationarity is not
    well-defined
  • Accepting the hypothesis of a unit root implies the
    data are consistent with non-stationarity, but they
    may also be consistent with stationarity
  • Economic theory may often provide guidance

50
Structural Breaks
  • Suppose the series looks like the one in the
    figure: flat, with a single shift in its level in
    the middle of the sample

51
  • This series is stationary, with a mean shift in the
    middle of the sample
  • If we did not model the structural break we would
    easily conclude that there was a unit root
  • The OLS estimate of the AR(1) coefficient is 0.95
  • It passes the Dickey-Fuller test (the unit root is
    not rejected)
  • The lagged value can explain the current value well
    all of the time except for the period of the shift
  • Structural breaks and unit roots may be hard to
    distinguish
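A sketch of this kind of example: a stationary AR(1)
around a mean that jumps half-way through the sample. The
parameter values (shift of 5, AR coefficient 0.7, T = 150)
are my own choices and the exact numbers depend on the
random draw, but the estimated AR(1) coefficient is
typically close to one and the ADF test often fails to
reject a unit root.

    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    # A stationary AR(1) around a mean that shifts half-way through the sample.
    rng = np.random.default_rng(10)
    T = 150
    mu = np.where(np.arange(T) < T // 2, 0.0, 5.0)
    u = np.zeros(T)
    for t in range(1, T):
        u[t] = 0.7 * u[t - 1] + rng.normal()
    y = mu + u

    # OLS AR(1) coefficient: typically close to one despite stationarity.
    X = np.column_stack([np.ones(T - 1), y[:-1]])
    coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    print("AR(1) coefficient estimate:", coef[1])

    # The ADF test often fails to reject the unit root for such a series.
    stat, pvalue, *_ = adfuller(y, regression="c")
    print("ADF statistic:", stat, "p-value:", pvalue)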

52
Other Regressors
  • Have only discussed univariate time series
  • Usually we are more interested in correlations
    between variables
  • This is the next part of the course