Lecture 4a: ARMA Model

Big Picture

• Most often our goal is to find a statistical model that describes a real time series (estimation), and then to predict its future (forecasting).
• One particularly popular model is the ARMA model.
• Using an ARMA model to describe a real time series is called the Box-Jenkins methodology.
• However, the ARMA model cannot be applied to just any time series. The ideal series should be stationary and ergodic!

(Weak or Covariance) Stationarity

A time series {y_t} is (weakly or covariance) stationary if it satisfies the following properties for all t:

  E(y_t) = μ                                   (1)
  var(y_t) = σ_y²                              (2)
  cov(y_t, y_{t−j}) depends only on j          (3)

In words, a stationary series has a constant (time-invariant) mean, a constant variance, and an (auto)covariance that depends only on the lag. A series is nonstationary if any of these properties is violated.

Why Stationarity?

• Stationarity is a nice property.
• Suppose we have two samples of data for the same time series. One sample is from 1981 to 1990; the other is from 1991 to 2000. If the series is stationary, the two samples should give us approximately the same estimates of the mean, variance, and covariance, since under stationarity those moments do not depend on t.
• If, on the other hand, the two samples produce significantly different estimates, then the series most likely has structural breaks and is nonstationary.
• Using history to forecast the future cannot be justified unless stationarity holds.

Example of a Nonstationary Series

Consider a trending series given by

  y_t = a·t + e_t,

where e_t is white noise. This trending series is nonstationary because its expectation,

  E(y_t) = a·t,

changes with t and is not constant.

Ergodicity

• A time series is ergodic if its autocovariance decays to zero fast enough as the lag increases:
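The two-sample intuition for stationarity can be checked numerically. The sketch below is an illustration, not part of the lecture: it simulates a stationary white-noise series and a trending series y_t = a·t + e_t (with assumed values a = 0.05 and σ_e = 1), then compares the sample mean from two halves of each sample. The halves agree for the stationary series but differ sharply for the trending one.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 2000
e = rng.normal(0.0, 1.0, size=T)        # white noise with sigma_e = 1

stationary = e                           # y_t = e_t: constant mean 0
trending = 0.05 * np.arange(T) + e       # y_t = a*t + e_t: mean a*t grows with t

for name, y in [("stationary", stationary), ("trending", trending)]:
    first, second = y[: T // 2], y[T // 2 :]
    print(f"{name:10s} mean(1st half) = {first.mean():8.3f},"
          f" mean(2nd half) = {second.mean():8.3f}")
```

For the stationary series both half-sample means are close to zero; for the trending series the second-half mean is much larger than the first-half mean, the numerical signature of a time-varying E(y_t).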
  cov(y_t, y_{t−j}) → 0 fast enough as j → ∞

• For a series that is both stationary and ergodic, the law of large numbers holds:

  (1/T) Σ_{t=1}^{T} y_t → E(y_t), as T → ∞

• Later we will learn that a unit root process is not ergodic, so the law of large numbers cannot be applied to a unit root process.

Why Ergodicity?

Ergodicity is the second nice property. In reality we only have a sample of data; we do not observe the population. We may then wonder whether we can obtain more and more accurate estimates of the unknown moments as the sample gets bigger and bigger. The answer is yes if the series is ergodic.

Box-Jenkins Methodology

• We first derive the dynamic properties of a hypothetical ARMA process.
• We next find the dynamic pattern of a real time series.
• Finally, we match the real time series with the closest hypothetical ARMA process (model).

Review: Expectation and Variance

Let X and Y be two random variables, and let a and b be constants. We have

  E(aX + bY) = aE(X) + bE(Y)                               (4)
  var(X) ≡ E[(X − EX)²] = E(X²) − (EX)²                    (5)
  var(aX) = a² var(X)                                      (6)
  cov(X, Y) ≡ E[(X − EX)(Y − EY)]                          (7)
  cov(X, X) = var(X)                                       (8)
  cov(X, Y) = 0 ⇔ X, Y are uncorrelated                    (9)
  var(aX + bY) = a² var(X) + b² var(Y) + 2ab·cov(X, Y)     (10)

MA(1) Process

• Consider a first-order moving-average MA(1) process:

  y_t = e_t + θ·e_{t−1},

  where e_t is white noise: E(e_t) = 0; var(e_t) = σ_e²; and cov(e_t, e_{t−j}) = 0 for j ≠ 0.
• Find E(y_t) =
• Find var(y_t) =
• Find cov(y_t, y_{t−1}) =
• Find cov(y_t, y_{t−2}) =
• Is the MA(1) process stationary or ergodic?
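Applying rules (4)–(10) to the MA(1) process gives E(y_t) = 0, var(y_t) = (1 + θ²)σ_e², cov(y_t, y_{t−1}) = θσ_e², and zero covariance at lags of two or more, so the MA(1) is both stationary and ergodic. The simulation sketch below (illustrative only; the values θ = 0.5 and σ_e = 1 are assumptions, not from the lecture) checks the theoretical moments against sample estimates from a long simulated path.

```python
import numpy as np

theta, sigma_e = 0.5, 1.0                  # assumed illustrative parameters
rng = np.random.default_rng(1)
T = 200_000

e = rng.normal(0.0, sigma_e, size=T + 1)   # white noise e_0, ..., e_T
y = e[1:] + theta * e[:-1]                 # y_t = e_t + theta * e_{t-1}

# Theoretical moments implied by rules (4)-(10)
var_theory = (1 + theta**2) * sigma_e**2   # = 1.25
cov1_theory = theta * sigma_e**2           # = 0.50

cov1 = np.cov(y[1:], y[:-1])[0, 1]         # sample lag-1 autocovariance
cov2 = np.cov(y[2:], y[:-2])[0, 1]         # sample lag-2 autocovariance

print(f"E(y_t):    theory 0.000, sample {y.mean():.3f}")
print(f"var(y_t):  theory {var_theory:.3f}, sample {y.var():.3f}")
print(f"lag-1 cov: theory {cov1_theory:.3f}, sample {cov1:.3f}")
print(f"lag-2 cov: theory 0.000, sample {cov2:.3f}")
```

The sample moments are essentially flat across the sample, which is exactly the stationarity property, and the autocovariance cuts off after lag 1, the hallmark of an MA(1).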