Models for Time Series Flashcards

white noise, moving average models, autoregressive models, mixed autoregressive moving average (ARMA) models, integrated models

1
Q

White Noise

Definition

A

-a time series {Xt} is called white noise if the Xt are i.i.d. with E(Xt)=0 for all t and a finite, constant variance:
Var(Xt) = σ² < ∞
-to check this in practice we compare the residuals of our model with the residuals we would expect from white noise
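A minimal Python sketch (numpy only; the seed and the value of σ are illustrative choices) generating Gaussian white noise and checking the defining moments:

```
import numpy as np

# generate Gaussian white noise and check its sample moments
rng = np.random.default_rng(0)
sigma = 1.5                      # illustrative value
eps = rng.normal(0.0, sigma, size=10_000)

print(np.mean(eps))              # approx 0
print(np.var(eps))               # approx sigma^2 = 2.25
```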

2
Q

White Noise

Mean

A

μ(t) = E(Xt) = 0

3
Q

White Noise

Autocovariance

A

γk = cov(Xt, Xt+k) = {σ², k=0; 0, otherwise

-since for white noise we model the Xt as independent

4
Q

White Noise

Autocorrelation

A

ρk = {1, k=0; 0, otherwise

-since Xt are independent

5
Q

White Noise

Remarks

A

1) often we assume Xt~N(0,σ²)
2) white noise is often used to model the residuals of more complicated time series
3) we usually denote a white noise process as {εt} with variance σε²
4) we can use a correlogram to distinguish white noise from processes with dependence

6
Q

Bartlett’s Theorem

A

-if {Xt} is white noise, then for large n the sample autocorrelations ρk^, k≠0, are approximately N(0, 1/n) distributed

7
Q

Identifying White Noise on a Correlogram

A
  • if a time series is white noise, 95% of the lines on the correlogram should lie between -1.96/√n and +1.96/√n
  • i.e. values with |ρk^| > 1.96/√n are significant at the 5% level
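A minimal Python sketch of this check; sample_acf is a hypothetical hand-rolled helper (numpy only) standing in for a plotting library's correlogram:

```
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat_1 .. rho_hat_max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:-k], x[k:]) / len(x) / c0
                     for k in range(1, max_lag + 1)])

rng = np.random.default_rng(1)
x = rng.normal(size=500)          # white noise, so no lag should stand out
band = 1.96 / np.sqrt(len(x))     # Bartlett band at the 5% level
rho_hat = sample_acf(x, 20)
print(np.sum(np.abs(rho_hat) > band))   # expect roughly 1 of the 20 lags outside
```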
8
Q

Moving Average Model

Definition

A

-a stochastic process {Xt} is called a moving average process of order q (or an MA(q) process) if:
Xt = Σ βk * εt-k
-sum from k=0 to k=q
-where β0,…,βq ∈ R and {εt} is white noise
-i.e. Xt can be written as a weighted sum of recent past white noise terms
-recent because we expect q to be small, since more distant shocks are unlikely to affect current values
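A short simulation sketch (the β values and seed are illustrative) showing that an MA(q) path is just white noise passed through a finite window:

```
import numpy as np

# build an MA(2) path as a moving window over white noise
rng = np.random.default_rng(2)
beta = np.array([1.0, 0.6, -0.3])     # beta_0, beta_1, beta_2 (illustrative)
eps = rng.normal(size=1000 + len(beta) - 1)

# X_t = sum_k beta_k * eps_{t-k}; 'valid'-mode convolution does exactly this
x = np.convolve(eps, beta, mode="valid")
print(x.shape)                        # 1000 observations
```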

9
Q

Moving Average Model

Remarks

A

1) without loss of generality, we can assume β0=1 (since we can rescale σε² = Var(εt) accordingly)
2) since {εt} is stationary, {Xt} is stationary whenever q is finite (q < ∞)

10
Q

MA Process

Mean

A

μ(t) = E(Xt) = 0

11
Q

MA Process

γ0

A

γ0 = (β0²+…+βq²) σε²

12
Q

MA Process

γk

A

γk = {Σ βi βi+k σε² , 0≤k≤q; 0, otherwise

-sum from i=0 to i=q-k
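A quick numerical check of these formulas for an MA(1); the values of β and σε are illustrative:

```
import numpy as np

rng = np.random.default_rng(3)
beta0, beta1, sigma = 1.0, 0.5, 2.0
eps = rng.normal(0.0, sigma, size=200_000)
x = beta0 * eps[1:] + beta1 * eps[:-1]        # MA(1) path

gamma0_hat = np.var(x)
gamma1_hat = np.mean((x[:-1] - x.mean()) * (x[1:] - x.mean()))
print(gamma0_hat, (beta0**2 + beta1**2) * sigma**2)   # both near 5.0
print(gamma1_hat, beta0 * beta1 * sigma**2)           # both near 2.0
```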

13
Q

MA(0)

A

-an MA(0) process is white noise

14
Q

Moving Average Model

Finding β

A
  • in practice, when fitting a moving average model to data we don’t know what the β values should be
  • to estimate the βs we can write β=β(ρ), i.e. express the coefficients in terms of the autocorrelations
  • since we can estimate ρ from the data, this allows us to estimate β as well
  • if the square root gives two values of β, remember that since we don’t expect the data to depend too strongly on the past, we choose the root with the smallest magnitude (see the sketch below)
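A sketch of the root choice for the MA(1) case, where β0=1 gives ρ1 = β1/(1+β1²); the helper name and the example value of ρ1 are illustrative:

```
import numpy as np

def ma1_beta_from_rho1(rho1):
    """Solve rho1 = beta / (1 + beta^2) for beta; needs 0 < |rho1| <= 0.5."""
    # beta^2 - beta/rho1 + 1 = 0, so the two roots multiply to 1
    roots = np.roots([1.0, -1.0 / rho1, 1.0])
    return roots[np.argmin(np.abs(roots))]   # smaller-magnitude (invertible) root

print(ma1_beta_from_rho1(0.4))   # 0.5, since 0.5 / (1 + 0.25) = 0.4
```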
15
Q

Autoregressive Model

Definition

A

-a stochastic process {Xt} is an autoregressive process of order p, AR(p), if:
Xt = Σ αk*Xt-k + εt, for all t
-sum from k=1 to k=p
-where α1,…,αp∈R, {εt} is white noise and εt is independent of Xs for all s < t

16
Q

Autoregressive Model

Remarks

A
  • when constructing the process {Xt}, the first p values need to be specified as initial conditions
  • we shall see that whether or not {Xt} is stationary depends on α1,…,αp and the initial conditions
17
Q

Random Walk

A

-an AR(1) process with α=1:

Xt = Xt-1 + εt

18
Q

Expectation of an AR(1) Process

A

Xt = α*Xt-1 + εt
-since E(εt)=0:
E(Xt) = α E(Xt-1) = … = α^t E(X0)

19
Q

Variance of an AR(1) Process

A
Xt = α*Xt-1 + εt
-since εt is independent of Xt-1:
Var(Xt) = α²Var(Xt-1) + σε²
20
Q

When is a general AR(1) process weakly stationary?

A

Xt = α*Xt-1 + εt

  • a general AR(1) process is weakly stationary when:
    1) |α|<1
    2) E(Xt) = 0 for all t
    3) Var(Xt) = σε² / [1-α²] for all t including t=0
  • the variance is the fixed point of the recursion above: σ² = α²σ² + σε² gives σ² = σε²/(1-α²), which is finite and positive only when |α|<1
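A simulation sketch of condition 3; the values of α and σε, the seed, and the burn-in handling are illustrative choices:

```
import numpy as np

# the long-run variance of a stationary AR(1) matches sigma_eps^2 / (1 - alpha^2)
rng = np.random.default_rng(4)
alpha, sigma_eps, n = 0.7, 1.0, 200_000
eps = rng.normal(0.0, sigma_eps, size=n)

x = np.zeros(n)                # X0 = 0 rather than the stationary start;
for t in range(1, n):          # the burn-in washes out for large n
    x[t] = alpha * x[t - 1] + eps[t]

print(np.var(x))                         # approx 1.96
print(sigma_eps**2 / (1 - alpha**2))     # = 1.0 / 0.51 ≈ 1.96
```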
21
Q

Stationarity of AR(p) Proposition

A

-an AR(p) process is stationary if and only if all the roots y1,…,yp of the equation:
α(y) = 1 - α1y - … - αpy^p = 0
-satisfy |yi|>1

22
Q

The Backshift Operator

Definition

A

-the backshift operator, B, is defined by:

B Xt = Xt-1, for all t

23
Q

Step 1 - AR(p) Process in Terms of the Backshift Operator

A

Xt = α1Xt-1 + … + αpXt-p + εt

= α1*B(Xt) + ... + αp*B^p(Xt) + εt
=>
(1 - α1*B - ... - αp*B^p)Xt = εt
=>
Xt = [1 - Σ αk*B^k]^(-1) εt
-sum from k=1 to k=p
-apply a binomial expansion to the bracket (the sketch below works this through for AR(1))
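A sketch of the expansion for the AR(1) case, where [1 - αB]^(-1) = Σ α^k B^k so that Xt = Σ α^k εt-k; the truncation level K is an illustrative choice:

```
import numpy as np

rng = np.random.default_rng(5)
alpha, n, K = 0.6, 500, 60
eps = rng.normal(size=n)

x_rec = np.zeros(n)                   # recursive definition X_t = alpha*X_{t-1} + eps_t
for t in range(1, n):
    x_rec[t] = alpha * x_rec[t - 1] + eps[t]

# truncated MA(infinity) representation from the binomial expansion
t = n - 1
x_ma = sum(alpha**k * eps[t - k] for k in range(K + 1))
print(x_rec[t], x_ma)                 # agree up to the tiny truncated tail alpha^K
```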
24
Q

Step 2 - Define and Find ck

A
-we have:
Xt = [1 - Σ αk*B^k]^(-1) εt
-let:
α(B) = 1 - Σ αk*B^k
-then:
Xt = α(B)^(-1) εt
-to get rid of the inverse, we find numbers ck such that:
1/α(y) = Σ ck*y^k
-so that Xt = Σ ck*εt-k
-for an AR(p) process with roots y1,…,yp of α(y)=0, partial fractions give:
ck = A1/y1^k + ... + Ap/yp^k
-for some constants A1,…,Ap
25
Q

Step 3 - Assume Xt is Weakly Stationary

A

Var(Xt) = (Σ ck²) σε²
-for this to exist, i.e. be a finite constant, we need:
Σ ck² = Σ (A1/y1^k + ... + Ap/yp^k)² < ∞
-this holds if and only if |yi|>1 for all i=1,...,p
-hence an AR(p) process is stationary <=> |yi|>1 for all i=1,...,p
26
Q

Stationarity of AR(2)

A

-for an AR(2) process, we have:
α(y) = 1 - α1*y - α2*y²
-roots are given by the quadratic formula
-if α1²+4α2>0 we have two real roots
-if α1²+4α2<0 we have two complex roots
-if α2>-1 AND α2<1-α1 AND α2<1+α1 then the process is stationary
27
Q

Stationarity of AR(p), p>2

A

-for AR(p) with p>2, use a computer to find the roots, as in the sketch below
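A sketch of this check with numpy.roots; ar_is_stationary is a hypothetical helper name and the test coefficients are illustrative:

```
import numpy as np

def ar_is_stationary(alphas):
    """True iff all roots of alpha(y) = 1 - a1*y - ... - ap*y^p lie outside the unit circle."""
    # np.roots wants coefficients from the highest power down to the constant term
    coeffs = np.append(-np.asarray(alphas, dtype=float)[::-1], 1.0)
    return bool(np.all(np.abs(np.roots(coeffs)) > 1.0))

print(ar_is_stationary([0.5, 0.3]))   # True: AR(2) inside the stationarity triangle
print(ar_is_stationary([1.2]))        # False: |alpha| > 1 for AR(1)
```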
28
Q

Autocovariance of AR(p)

A

γk = Σ αj * γk-j , for all k≥1
-sum from j=1 to j=p
29
Q

Autocorrelation of AR(p)

A

ρk = Σ αj * ρk-j , for all k≥1
-sum from j=1 to j=p
30
Q

Yule-Walker Equations

A

-used to determine α1,α2,...,αp from ρ1,ρ2,...,ρp for AR(p) models:
ρ1 = 1*α1 + ρ1*α2 + ... + ρp-1*αp
ρ2 = ρ1*α1 + 1*α2 + ... + ρp-2*αp
...
ρp = ρp-1*α1 + ρp-2*α2 + ... + 1*αp
-from the data we can estimate the ρs and then use these equations to calculate the αs (see the sketch below)
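A sketch of solving this linear system with numpy; yule_walker is a hypothetical helper, and the input ρ values are illustrative (they come from an AR(1) with α=0.6, where ρk = α^k):

```
import numpy as np

def yule_walker(rho):
    """Solve the Yule-Walker system for alpha_1..alpha_p given rho_1..rho_p."""
    rho = np.asarray(rho, dtype=float)
    p = len(rho)
    # matrix entry (i, j) is rho_{|i-j|}, with rho_0 = 1 on the diagonal
    full = np.concatenate(([1.0], rho))
    R = full[np.abs(np.subtract.outer(np.arange(p), np.arange(p)))]
    return np.linalg.solve(R, rho)

print(yule_walker([0.6, 0.36]))   # approx [0.6, 0.0]: recovers alpha = 0.6
```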
31
Q

AR(p) Model Fitting Steps

A

-to fit an AR(p) model to data {Xt}=(X1,...,Xn):
1) subtract the trend and the seasonal effects from {Xt} to obtain residuals {Yt}
2) estimate the acf of Y to obtain ρ1^,ρ2^,...,ρp^
3) solve the Yule-Walker equations for α1^,...,αp^
4) consider the residuals Zt = Yt - α1^*Yt-1 - ... - αp^*Yt-p and use Bartlett bands to check whether {Zt} is (approximately) white noise; if not, the model is not a good fit for the data
5) use the sample variance of {Zt} to estimate σε²
6) add the trend and seasonal effects back onto conclusions about {Yt} to get conclusions about {Xt}
-the sketch below walks through steps 1-5 on simulated data
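A sketch of steps 1-5 on simulated AR(2) data; the helpers repeat the ones sketched in earlier cards so the example runs on its own, and all parameter values are illustrative:

```
import numpy as np

def sample_acf(x, max_lag):
    x = np.asarray(x, dtype=float) - np.mean(x)
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:-k], x[k:]) / len(x) / c0
                     for k in range(1, max_lag + 1)])

def yule_walker(rho):
    rho = np.asarray(rho, dtype=float)
    p = len(rho)
    full = np.concatenate(([1.0], rho))
    R = full[np.abs(np.subtract.outer(np.arange(p), np.arange(p)))]
    return np.linalg.solve(R, rho)

rng = np.random.default_rng(6)
n, p = 2000, 2
t = np.arange(n)
y = np.zeros(n)
for s in range(2, n):                      # simulate a stationary AR(2)
    y[s] = 0.5 * y[s - 1] + 0.3 * y[s - 2] + rng.normal()
x = y + 0.01 * t                           # add a linear trend to play the raw data

y_res = x - np.polyval(np.polyfit(t, x, 1), t)       # step 1: remove the trend
alpha_hat = yule_walker(sample_acf(y_res, p))        # steps 2-3
z = (y_res[p:] - alpha_hat[0] * y_res[p - 1:-1]      # step 4: residuals Z_t
     - alpha_hat[1] * y_res[:-p])
band = 1.96 / np.sqrt(len(z))
print(alpha_hat)                                     # approx [0.5, 0.3]
print(np.mean(np.abs(sample_acf(z, 20)) > band))     # approx 0.05 if white noise
print(np.var(z))                                     # step 5: estimate of sigma_eps^2
```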
32
Q

ARMA Model

Definition

A

-an ARMA(p,q) process satisfies:
Xt = Σ αi*Xt-i + εt + Σ βj*εt-j
-sums from i=1 to i=p and from j=1 to j=q
-with εt independent of Xt-1,Xt-2,...
33
Q

ARMA Model

In Terms of the Backshift Operator

A

α(B) Xt = β(B) εt
-where:
α(y) = 1 - Σ αi*y^i
-sum from i=1 to i=p
-and:
β(y) = 1 + Σ βj*y^j
-sum from j=1 to j=q
34
Q

ARMA Model

Weak Stationarity

A

-as for AR(p), we can write:
Xt = α(B)^(-1) β(B) εt
-expanding 1/α(y) = Σ λk*y^k gives:
Xt = Σ λk*β(B)*εt-k = Σ λk~*εt-k
-sums from k=0 to k=∞
-{Xt} is weakly stationary if:
Σ λk~² < ∞
-equivalent to the roots of α(y)=0 lying outside the complex unit circle
35
Q

ARMA Model

Stationarity and Expectation

A

-if stationary, we have:
E(Xt) = Σ λk~*E(εt-k) = 0
-sum from k=0 to k=∞
36
Q

ARMA Model

Reconstruct Noise as a Function of the Data and Invertibility

A

εt = β(B)^(-1) α(B) Xt = Σ δk*Xt-k
-sum from k=0 to k=∞
-if Σ δk² < ∞, then the process is invertible
-equivalently, {Xt} is invertible if the roots of β(y)=0 lie outside the complex unit circle (see the sketch below)
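A sketch of both root checks for an illustrative ARMA(1,1); roots_outside_unit_circle is a hypothetical helper:

```
import numpy as np

# stationarity looks at the roots of alpha(y), invertibility at the roots of beta(y)
def roots_outside_unit_circle(coeffs_ascending):
    """coeffs_ascending are the polynomial coefficients from y^0 upward."""
    return bool(np.all(np.abs(np.roots(coeffs_ascending[::-1])) > 1.0))

alphas = [0.5]                 # illustrative ARMA(1,1): alpha(y) = 1 - 0.5y
betas = [0.4]                  # beta(y) = 1 + 0.4y
stationary = roots_outside_unit_circle([1.0] + [-a for a in alphas])
invertible = roots_outside_unit_circle([1.0] + list(betas))
print(stationary, invertible)  # True True: root 2.0 for alpha, -2.5 for beta
```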
37
Q

ARMA Model

Autocovariance

A

-the autocovariance of an ARMA(p,q) process satisfies:
γk = Σ αi * γk-i , k>q
-sum from i=1 to i=p
-recall that an AR(p) process satisfies the same relation but for all k>0, so AR and ARMA models show the same behaviour for k>q
38
Q

Difference Operator

Definition

A

-the difference operator, ∇, is defined by:
∇(Xt) = Xt - Xt-1 , for all t
39
Q

Difference Operator

Backshift Operator and Stationarity

A

∇ = 1 - B
-if {Xt} has stationary increments then ∇X is stationary
40
Q

Difference Operator

Constant Mean and Linear Trend

A

-applying the difference operator to a series with a constant mean removes the mean:
∇(Xt + μ) = Xt + μ - (Xt-1 + μ) = Xt - Xt-1 = ∇Xt
-applying the difference operator to a linear trend converts it to a constant:
∇(Xt + a + bt) = Xt + a + bt - (Xt-1 + a + b(t-1)) = Xt - Xt-1 + b = ∇Xt + b
-a quick numerical check follows below
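A quick numerical check that differencing turns a linear trend into the constant b; the values of a and b and the white-noise stand-in for Xt are illustrative:

```
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(200)
a, b = 3.0, 0.5
x = rng.normal(size=t.size)       # stand-in for a stationary series X_t
series = x + a + b * t

d = np.diff(series)               # nabla applied once
print(np.mean(d))                 # approx b = 0.5: the trend became a constant
```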
41
Q

ARIMA Model

Definition

A

-autoregressive integrated moving average process
-{Xt} is an ARIMA(p,d,q) process if ∇^d Xt is a stationary ARMA(p,q) process
42
Q

ARIMA Model

In Terms of the ARMA Model

A

-an ARIMA(p,d,q) process {Xt} can be written as an ARMA(p+d,q) process whose AR polynomial has a unit root, hence {Xt} is non-stationary for d>0
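A sketch of the differencing idea on a simulated unit-root series; the drift of 0.1 is illustrative, and the commented-out statsmodels call is one way to fit the whole ARIMA model in practice (assuming statsmodels is installed):

```
import numpy as np

# an ARIMA(p,d,q) fit is "difference d times, then fit ARMA(p,q)";
# here d = 1 turns a random walk with drift into a stationary series
rng = np.random.default_rng(8)
x = np.cumsum(0.1 + rng.normal(size=1000))    # unit-root series, non-stationary
dx = np.diff(x, n=1)                          # nabla^1 X_t

print(np.mean(x[:500]), np.mean(x[500:]))     # very different levels: mean not constant
print(np.mean(dx[:500]), np.mean(dx[500:]))   # both near the drift 0.1: stationary increments

# with statsmodels, the same idea is a single call:
# from statsmodels.tsa.arima.model import ARIMA
# ARIMA(x, order=(p, 1, q)).fit()
```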