Models for Time Series Flashcards
white noise, moving average models, autoregressive models, mixed autoregressive moving average (ARMA) models, integrated models
White Noise
Definition
-a time series {Xt} is called white noise if the Xt are i.i.d. with E(Xt)=0 for all t and a finite, constant variance:
Var(Xt) = σ² < ∞
-to check this in practice we would compare the residuals of our model with the residuals we expect for white noise
White Noise
Mean
μ(t) = E(Xt) = 0
White Noise
Autocovariance
γk = cov(Xt, Xt+k) = σ² for k=0, and 0 otherwise
-since for white noise we model the Xt as independent
White Noise
Autocorrelation
ρk = 1 for k=0, and 0 otherwise
-since Xt are independent
White Noise
Remarks
1) often we assume Xt~N(0,σ²)
2) white noise is often used to model the residuals of more complicated time series
3) we usually denote a white noise process as {εt} with variance σε²
4) we can use a correlogram to distinguish white noise from processes with dependence
Bartlett’s Theorem
-if {Xt} is white noise, then for large n the sample autocorrelation ρ̂k, k≠0, is approximately distributed N(0, 1/n)
Identifying White Noise on a Correlogram
- if a time series is white noise, 95% of the lines on the correlogram should lie between -1.96/√n and +1.96/√n
- i.e. values |ρ̂k| > 1.96/√n are significant at the 5% level
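As a quick illustration (a numpy sketch; the seed, series length and lag count are arbitrary choices, not from the cards), we can simulate white noise and check that roughly 95% of the sample autocorrelations fall inside ±1.96/√n:

```python
import numpy as np

# Simulate Gaussian white noise and count how many sample autocorrelations
# fall inside the approximate 95% Bartlett bounds +/- 1.96/sqrt(n).
rng = np.random.default_rng(0)
n = 1000
x = rng.normal(0.0, 1.0, n)          # white noise with sigma^2 = 1

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat_k for k = 1..max_lag."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)       # sample variance (gamma_hat_0)
    return np.array([np.dot(x[:-k], x[k:]) / len(x) / c0
                     for k in range(1, max_lag + 1)])

rho = sample_acf(x, 40)
bound = 1.96 / np.sqrt(n)
inside = np.mean(np.abs(rho) <= bound)
print(f"fraction of lags inside the bounds: {inside:.2f}")
```

A fraction well below 0.95 would suggest the series is not white noise.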
Moving Average Model
Definition
-a stochastic process {Xt} is called a moving average process of order q (or an MA(q) process) if:
Xt = Σ βk * εt-k
-sum from k=0 to k=q
-where β0,…,βq ∈ R and {εt} is white noise
-i.e. Xt can be written as the weighted average of near past white noise
-"near" because we expect q to be small, since more distant events are unlikely to affect the present
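A small simulation sketch of the definition (the weights β = (1, 0.5, 0.25) and the series length are made up for illustration):

```python
import numpy as np

# Simulate an MA(2) process X_t = eps_t + 0.5*eps_{t-1} + 0.25*eps_{t-2}.
rng = np.random.default_rng(1)
beta = np.array([1.0, 0.5, 0.25])    # beta_0 = 1 by convention
q = len(beta) - 1
n = 5000
eps = rng.normal(0.0, 1.0, n + q)    # white noise with sigma_eps^2 = 1, padded
x = np.convolve(eps, beta, mode="valid")   # X_t = sum_k beta_k * eps_{t-k}

gamma0_theory = np.sum(beta**2)      # (beta_0^2 + ... + beta_q^2) * sigma_eps^2
print(np.var(x), gamma0_theory)      # sample variance should be close to theory
```

The convolution with the β weights is exactly the "weighted average of near past white noise" from the definition.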
Moving Average Model
Remarks
1) without loss of generality we can assume β0=1 (since we can rescale the noise by choosing σε²=Var(εt) appropriately)
2) since {εt} is stationary, {Xt} is stationary whenever q is finite
MA Process
Mean
μ(t) = E(Xt) = 0
MA Process
γ0
γ0 = (β0² + … + βq²) σε²
MA Process
γk
γk = σε² Σ βi βi+k for 0≤k≤q, and 0 otherwise
-sum from i=0 to i=q-k
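The autocovariance formula is easy to evaluate directly; a sketch (the example weights are illustrative, not from the cards):

```python
import numpy as np

# Theoretical autocovariances of an MA(q) process from
# gamma_k = sigma_eps^2 * sum_{i=0}^{q-k} beta_i * beta_{i+k}, and 0 for k > q.
def ma_autocov(beta, sigma2=1.0):
    beta = np.asarray(beta, dtype=float)
    q = len(beta) - 1
    return np.array([sigma2 * np.dot(beta[: q - k + 1], beta[k:])
                     for k in range(q + 1)])

# Example with beta = (1, 0.5, 0.25):
gam = ma_autocov([1.0, 0.5, 0.25])
# gamma_0 = 1 + 0.25 + 0.0625 = 1.3125
# gamma_1 = 1*0.5 + 0.5*0.25  = 0.625
# gamma_2 = 1*0.25            = 0.25
```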
MA(0)
-an MA(0) process is white noise
Moving Average Model
Finding β
- in practice, when applying a moving average model to data we don’t know what the β values should be
- to estimate the βs we can write β=β(ρ)
- since we can estimate ρ from the data, this allows us to estimate β as well
- if the square root gives two values of β, remember that since we don't expect the data to depend too much on the past, we choose the root that gives the smallest magnitude coefficient
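A sketch of this for the MA(1) case with β0 = 1: there ρ1 = β1/(1 + β1²), which gives the quadratic ρ1·β1² − β1 + ρ1 = 0 with two roots, and we keep the one of smallest magnitude as argued above. (The value ρ1 = 0.4 is an illustrative input.)

```python
import numpy as np

def ma1_beta_from_rho(rho1):
    # Solve rho1*beta^2 - beta + rho1 = 0 and keep the smaller-magnitude root.
    disc = 1.0 - 4.0 * rho1**2
    if disc < 0:
        raise ValueError("|rho_1| > 1/2 is impossible for an MA(1) process")
    roots = np.array([(1 + np.sqrt(disc)) / (2 * rho1),
                      (1 - np.sqrt(disc)) / (2 * rho1)])
    return roots[np.argmin(np.abs(roots))]

beta1 = ma1_beta_from_rho(0.4)
print(beta1)                          # the roots are 2.0 and 0.5; we keep 0.5
print(beta1 / (1 + beta1**2))         # recovers rho_1 = 0.4
```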
Autoregressive Model
Definition
-a stochastic process {Xt} is an autoregressive process of order p, AR(p), if:
Xt = Σ αk*Xt-k + εt, for all t
-sum from k=1 to k=p
-where α1,…,αp∈R, {εt} is white noise and εt is independent of Xs for all s
Autoregressive Model
Remarks
- when constructing process {Xt} the first p values need to be specified as an initial condition
- we shall see that whether or not {Xt} is stationary depends on α1,…,αp and the initial conditions
Random Walk
-the AR(1) process with α = 1:
Xt = Xt-1 + εt
Expectation of an AR(1) Process
Xt = α*Xt-1 + εt
- since E(εt)=0 :
E(Xt) = α E(Xt-1) = α^t E(X0)
Variance of an AR(1) Process
Xt = α*Xt-1 + εt
Var(Xt) = α²Var(Xt-1) + σε²
When is a general AR(1) process weakly stationary?
Xt = α*Xt-1 + εt
- a general AR(1) process is weakly stationary when:
1) |α|<1
2) E(Xt) = 0 for all t
3) Var(Xt) = σε² / [1-α²] for all t including t=0
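A sketch of why condition 1) matters: iterating the variance recursion Var(Xt) = α²Var(Xt-1) + σε² with |α| < 1 converges to the stationary value σε²/(1−α²) from any starting variance (the values α = 0.7 and the initial variance below are arbitrary illustrations):

```python
# Iterate Var(X_t) = alpha^2 * Var(X_{t-1}) + sigma_eps^2 and watch it
# converge to sigma_eps^2 / (1 - alpha^2) when |alpha| < 1.
alpha, sigma2 = 0.7, 1.0
v = 5.0                          # arbitrary initial Var(X_0)
for _ in range(200):
    v = alpha**2 * v + sigma2
stationary = sigma2 / (1 - alpha**2)
print(v, stationary)             # both approximately 1.9608
```

Starting exactly at Var(X0) = σε²/(1−α²), as in condition 3), keeps the variance constant for all t.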
Stationarity of AR(p) Proposition
-an AR(p) process is stationary if and only if all the roots y1,…,yp of the equation:
α(y) = 1 - α1y - … - αpy^p = 0
-satisfy |yi| > 1
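This root condition can be checked numerically; a sketch (the example coefficients are illustrative):

```python
import numpy as np

# Stationarity check for an AR(p) process: find the roots of
# alpha(y) = 1 - alpha_1*y - ... - alpha_p*y^p and test |y_i| > 1.
def is_stationary(alphas):
    # np.roots expects coefficients from the highest degree down to the
    # constant term: [-alpha_p, ..., -alpha_1, 1]
    coeffs = np.concatenate((-np.asarray(alphas, dtype=float)[::-1], [1.0]))
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary([0.5]))        # AR(1) with |alpha| < 1 -> True
print(is_stationary([1.0]))        # random walk: root y = 1 -> False
print(is_stationary([0.5, 0.6]))   # an AR(2) example
```

For AR(1) this reduces to the familiar condition |α| < 1, since the single root is y = 1/α.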
The Backshift Operator
Definition
-the backshift operator, B, is defined by:
B Xt = Xt-1, for all t
Step 1 - AR(p) Process in Terms of the Backshift Operator
Xt = α1Xt-1 + … + αpXt-p + εt
= α1*B(Xt) + … + αp*B^p(Xt) + εt
=> (1 - α1*B - … - αp*B^p)Xt = εt
=> Xt = [1 - Σ αk*B^k]^(-1) εt
-sum from k=1 to k=p
-apply a binomial expansion to the bracket
Step 2 - Define and Find ck
-we have Xt = [1 - Σ αk*B^k]^(-1) εt
-let: α(B) = 1 - Σ αk*B^k, so Xt = α(B)^(-1) εt
-to get rid of the inverse, we shall find numbers ck such that 1/α(y) = Σ ck*y^k
-then Xt = Σ ck*εt-k, sum from k=0
-for an AR(p) process with distinct roots y1,…,yp of α(y) = 0, ck = A1/y1^k + … + Ap/yp^k
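A sketch of this step for the AR(1) case (the value α = 0.6 is illustrative): here α(y) = 1 − αy, the expansion of the bracket gives 1/α(y) = Σ α^k y^k, so ck = α^k and Xt = Σ α^k εt-k. We can verify the ck numerically by multiplying the truncated series back by α(y):

```python
import numpy as np

# For AR(1), the claimed MA(infinity) coefficients are c_k = alpha^k.
alpha = 0.6
c = alpha ** np.arange(20)           # truncated sequence c_0, ..., c_19

# Multiplying sum_k c_k y^k by alpha(y) = 1 - alpha*y should give ~1
# (up to the truncation term of order y^20). Coefficients are in
# ascending order of powers, as np.polynomial expects.
prod = np.polynomial.polynomial.polymul(c, [1.0, -alpha])
print(prod[:3])                      # approximately [1, 0, 0]
```

The same check works for general AR(p) with the partial-fraction coefficients ck = A1/y1^k + … + Ap/yp^k.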