L6 TS models Flashcards

1
Q

What is a strictly stationary process?

A

A stochastic process whose unconditional probability distribution does not change when shifted in time (see notes)

2
Q

What is a weakly stationary (ie. covariance stationary) process? What 3 conditions must it satisfy?

A

A process whose mean and variance are constant over time and whose autocovariances depend only on the lag between observations, not on time itself (a weaker requirement than strict stationarity)
It must satisfy the following 3 equations:
1) E(yt) = μ for all t
2) E[(yt-μ)(yt-μ)] = σ^2 (ie. the variance is constant and finite)
3) E[(yt1-μ)(yt2-μ)] = γ(t2-t1) for all t1 and t2 (ie. the autocovariance depends only on the lag t2-t1)
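A minimal numpy sketch of the sample analogues of these three quantities (the placeholder series y and the lag choices are assumptions; this only illustrates the definitions, it is not a formal stationarity test):

import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=500)                 # placeholder series; replace with real data

mu_hat = y.mean()                        # sample analogue of E(yt) = mu
sigma2_hat = ((y - mu_hat) ** 2).mean()  # sample analogue of E[(yt - mu)^2] = sigma^2

def autocov(y, s):
    # sample autocovariance at lag s; for a weakly stationary series this depends only on s
    mu = y.mean()
    if s == 0:
        return ((y - mu) ** 2).mean()
    return ((y[s:] - mu) * (y[:-s] - mu)).mean()

print(mu_hat, sigma2_hat, autocov(y, 1), autocov(y, 2))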

3
Q

What is γ(s)?

A

The autocovariance at lag s: the covariance between yt and the observation s periods earlier, yt-s

4
Q

What is the autocorrelation function (/correlogram)?

A

A plot of the autocorrelation coefficients τs against the lag s = 0, 1, 2, …

ie. it shows the autocorrelation between the current observation and the observation s periods in the past, as s increases
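A hedged statsmodels sketch of a correlogram (the placeholder series y and the 20-lag choice are assumptions; acf and plot_acf are standard statsmodels functions):

import numpy as np
import matplotlib.pyplot as plt
from statsmodels.tsa.stattools import acf
from statsmodels.graphics.tsaplots import plot_acf

y = np.random.default_rng(1).normal(size=300)   # placeholder series

tau = acf(y, nlags=20)      # tau_s for s = 0, 1, ..., 20
print(tau[:5])

plot_acf(y, lags=20)        # the correlogram: tau_s plotted against the lag s
plt.show()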

5
Q

What is a white noise process?

A

A process with virtually no discernible structure

6
Q

What equations define a WNP?

A

1) E(yt) = μ
2) var(yt) = σ^2
3) γ(t-r) = 0 for all t ≠ r (and = σ^2 when t = r)
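A small numpy sketch (assumed parameters) simulating a white noise process and checking that its sample autocovariances are near zero at every non-zero lag, consistent with conditions 1)-3):

import numpy as np

rng = np.random.default_rng(42)
sigma = 1.0
y = rng.normal(loc=0.0, scale=sigma, size=2000)       # white noise: constant mean and variance

mu = y.mean()
for s in range(1, 6):
    gamma_s = ((y[s:] - mu) * (y[:-s] - mu)).mean()   # should be close to 0 for s != 0
    print(s, round(gamma_s, 4))
print("lag 0 (variance):", round(y.var(), 4))         # should be close to sigma^2 = 1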

7
Q

What will a WNP ACF look like?

A

It will be zero at all lags apart from a single peak of 1 at lag 0 (since any series is perfectly correlated with itself)

8
Q

What does the Box-Pierce test test?

A

It tests the joint hypothesis that all m of the τk autocorrelation coefficients are simultaneously equal to zero
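A minimal sketch of the Box-Pierce statistic Q = T * sum over k = 1..m of τk^2, compared with its asymptotic chi-squared(m) distribution (the placeholder series y and m = 10 are assumptions). The Ljung-Box statistic is a small-sample refinement of the same idea.

import numpy as np
from scipy.stats import chi2

y = np.random.default_rng(3).normal(size=500)     # placeholder series
T, m = len(y), 10

mu = y.mean()
gamma0 = ((y - mu) ** 2).mean()
tau = np.array([((y[k:] - mu) * (y[:-k] - mu)).mean() / gamma0 for k in range(1, m + 1)])

Q = T * np.sum(tau ** 2)                          # Box-Pierce statistic
p_value = 1 - chi2.cdf(Q, df=m)                   # asymptotically chi-squared(m) under H0
print(Q, p_value)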

9
Q

How is the Q-statistic distributed?

A

Asymptotically as a chi-squared(m)

10
Q

What is the condition for stationarity for an AR(p) model?

A

The condition for stationarity of an AR(p) model is that the roots of 1 - φ1z - φ2z^2 - … - φpz^p = 0 all lie outside the unit circle (see notes P1S2)
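A hedged numpy sketch of this root check for an AR(2) with assumed coefficients φ1 = 0.5 and φ2 = 0.3:

import numpy as np

phi1, phi2 = 0.5, 0.3                     # assumed AR(2) coefficients
# characteristic polynomial 1 - phi1*z - phi2*z^2, coefficients in descending powers of z
roots = np.roots([-phi2, -phi1, 1.0])

print(roots, np.abs(roots))
print("stationary:", bool(np.all(np.abs(roots) > 1)))   # all roots outside the unit circle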

11
Q

See

A

Notes P1S2 'Testing for stationarity of an AR(p) model'

12
Q

What is Wold’s decomposition theorem?

A

Any covariance-stationary series (in particular any stationary AR(p)) can be decomposed into the sum of two uncorrelated processes: a purely deterministic part and a purely stochastic part, which will be an MA(infinity)

13
Q

If an AR model is stationary, what will its ACF (autocorrelation function) do?

A

Decay exponentially to zero
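For example, for a stationary AR(1), yt = φ yt-1 + ut with |φ| < 1, the autocorrelation at lag s is τs = φ^s, which dies away exponentially (and alternates in sign if φ is negative).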

14
Q

See and learn

A

Examples 3i, ii and iii in my notes

15
Q

See and learn

A

recursive structure of an AR(1) process (in notes P2S1)
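A worked sketch of that recursion in standard textbook notation (which may differ slightly from the notes): for yt = φ yt-1 + ut, substituting backwards gives

yt = φ(φ yt-2 + ut-1) + ut = φ^2 yt-2 + φ ut-1 + ut = … = φ^t y0 + Σ(i=0 to t-1) φ^i ut-i

so for |φ| < 1 the influence of the starting value and of old shocks dies away geometrically, and the process can be written as an MA(infinity) in the shocks, consistent with Wold's decomposition.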

16
Q

What does the PACF measure? How is it denoted?

A

Denoted τkk, it measures the correlation between an observation k periods ago, and the current observation, after controlling for observations at intermediate lags (ie. all lags less than k)

17
Q

When will the PACF=ACF?

A

At lag 1

18
Q

What is the PACF useful for?

A

It helps distinguish between an AR(p) process and an ARMA process
In the case of an AR(p) there are direct connections between yt and y(t-s) only for s ≤ p; therefore the theoretical PACF will be zero after lag p

19
Q

How can an MA(q) be written and why?

A

As an AR(infinity), provided it is invertible, because there are then direct connections between yt and all of its previous values; therefore for an MA(q) the theoretical PACF will be geometrically declining rather than cutting off (see notes)
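A worked example of the inversion for an MA(1) (a standard textbook case, not taken from the notes): if yt = ut + θ ut-1 with |θ| < 1, then ut = yt - θ ut-1, and substituting repeatedly gives

ut = yt - θ yt-1 + θ^2 yt-2 - θ^3 yt-3 + …

so yt = θ yt-1 - θ^2 yt-2 + θ^3 yt-3 - … + ut, an AR(infinity) whose coefficients decline geometrically (which is why the PACF of an MA(q) declines geometrically rather than cutting off).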

20
Q

What is an ARMA model?

A

ARMA (p,q) is made by combining the AR(p) and MA(q) models

21
Q

What 3 conditions does the error term ut of an ARMA model satisfy?

A

1) E(ut) = 0
2) E(ut^2) = σ^2
3) E(ut us) = 0 for all t ≠ s

22
Q

What is the invertibility condition?

A

The invertibility condition requires the roots of the MA(q) lag polynomial θ(z) = 0 to be greater than one in absolute value (ie. to lie outside the unit circle again)

23
Q

How will the ACF look for an ARMA model?

A

It will display a combination of behaviours derived from both the AR and MA parts, but for lags beyond q the ACF will simply be identical to that of the pure AR(p) part

24
Q

How does the ACF look for an AR(p) process? How do you tell the AR order?

A

Geometrically decaying

Number of spikes in PACF=AR order (p)

25
Q

How does the PACF look for an MA(q) process? How do you tell the MA order?

A

Geometrically decaying PACF
Number of spikes in the ACF = MA order (q) (see slides 39-45 for examples)

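A hedged statsmodels sketch (simulated series with assumed coefficients) illustrating the identification rules on this card and the previous one: an AR(2) shows roughly two significant PACF spikes, an MA(2) roughly two significant ACF spikes:

import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

np.random.seed(7)

# ArmaProcess takes lag-polynomial coefficients: ar = [1, -phi1, -phi2], ma = [1, theta1, theta2]
ar2 = ArmaProcess(ar=[1, -0.5, -0.3], ma=[1]).generate_sample(nsample=2000)
ma2 = ArmaProcess(ar=[1], ma=[1, 0.6, 0.4]).generate_sample(nsample=2000)

print("AR(2) PACF:", np.round(pacf(ar2, nlags=5), 2))   # clear spikes at lags 1-2, then roughly zero
print("MA(2) ACF: ", np.round(acf(ma2, nlags=5), 2))    # clear spikes at lags 1-2, then roughly zero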
26
Q

What are the three steps to building ARMA models via the Box-Jenkins approach?

A

1) Identification
2) Estimation
3) Model diagnostic checking

27
Q

What is involved in the identification step of the Box-Jenkins method?

A

Determine the order of the model using graphical procedures, ie. the ACF and PACF (note: there are now better methods of doing this)

28
Q

What is involved in the estimation step of the Box-Jenkins method?

A

Here we estimate the parameters of the model using either least squares or MLE (depending on the model)

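A hedged sketch of the estimation step with statsmodels (the simulated data and the (1, 0, 1) order are assumptions; statsmodels fits this model by maximum likelihood):

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

np.random.seed(0)
y = np.random.normal(size=500)          # placeholder series; replace with real data

res = ARIMA(y, order=(1, 0, 1)).fit()   # ARMA(1,1) = ARIMA with d = 0, estimated by MLE
print(res.summary())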
29
Q

What is involved in the model diagnostic checking step of the Box-Jenkins method?

A

2 methods of doing this:
1) deliberate overfitting
2) residual diagnostics
(learn what this actually means!)

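A hedged sketch of the residual-diagnostics idea (same placeholder data and order as above): if the fitted model is adequate, its residuals should behave like white noise, which a Ljung-Box type test can check:

import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

np.random.seed(0)
y = np.random.normal(size=500)                  # placeholder series
res = ARIMA(y, order=(1, 0, 1)).fit()

lb = acorr_ljungbox(res.resid, lags=[10])       # test for remaining autocorrelation in the residuals
print(lb)                                       # a large p-value suggests no structure is left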
30
Q

What is an updated way of doing the identification step of Box-Jenkins?

A

Identification would not typically be done using ACFs any more, since we want to form a parsimonious model. Because the variance of the estimators is inversely proportional to the number of degrees of freedom, excessively large models tend to fit sample-specific features of the data -> motivation for information criteria!

31
Q

What are the two key features of an information criterion?

A

1) It must have a term that is a function of the RSS
2) There must be a penalty for adding extra parameters

32
Q

How should we use an information criterion?

A

We should choose the model order that minimises the value of the information criterion

33
Q

What are the three most popular information criteria?

A

1) Akaike's IC (AIC)
2) Schwarz's Bayesian IC (SBIC)
3) Hannan-Quinn IC (HQIC)

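A hedged sketch of an information-criterion search over small (p, q) orders (placeholder data, maximum orders chosen arbitrarily; fitted statsmodels results expose .aic, .bic and .hqic):

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

np.random.seed(1)
y = np.random.normal(size=500)                      # placeholder series

best = None
for p in range(3):
    for q in range(3):
        res = ARIMA(y, order=(p, 0, q)).fit()
        # res.aic, res.bic (SBIC) and res.hqic are the three criteria on this card
        if best is None or res.bic < best[0]:
            best = (res.bic, p, q)

print("order (p, q) minimising SBIC:", best[1:], "BIC =", round(best[0], 2))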
34
Q

Comparison between SBIC and AIC?

A

SBIC embodies a stiffer penalty for adding extra parameters

35
Q

2 characteristics of the SBIC? (pro and con)

A

Pro: strongly consistent
Con: inefficient

36
Q

2 cons of the AIC?

A

Not consistent, and will typically choose 'larger' models than the SBIC

37
Q

What is an ARIMA model?

A

The 'I' stands for integrated; an integrated AR process is one with a characteristic root on the unit circle, which must be differenced to make it stationary

38
Q

How will researchers typically deal with ARIMA models? What is the relationship between ARMA and ARIMA models?

A

They will difference the variable as many times as necessary and then build an ARMA model on the differenced variable - an ARMA(p,q) model in the variable differenced 'd' times is equivalent to an ARIMA(p,d,q) model on the original data

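A minimal sketch of this workflow (simulated random walk, assumed orders). Passing d directly to statsmodels' ARIMA does the differencing internally, which should give (near) the same fit as an ARMA on the manually differenced series:

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

np.random.seed(2)
y = np.cumsum(np.random.normal(size=500))      # a random walk: integrated of order 1

dy = np.diff(y, n=1)                           # difference once (d = 1)
res_arma = ARIMA(dy, order=(1, 0, 1)).fit()    # ARMA(1,1) on the differenced series
res_arima = ARIMA(y, order=(1, 1, 1)).fit()    # ARIMA(1,1,1) on the original series
print(res_arma.params)
print(res_arima.params)                        # the AR/MA estimates should be close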
39
Q

What is exponential smoothing?

A

Exponential smoothing determines the weights we attach to previous observations when forecasting future ones - we expect recent observations to carry the most weight in helping forecast future values of a series

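A minimal numpy sketch of simple exponential smoothing, St = α yt + (1 - α) St-1, which gives geometrically declining weights on past observations (α = 0.3 and the placeholder data are assumptions):

import numpy as np

def simple_exp_smooth(y, alpha=0.3):
    # St = alpha * yt + (1 - alpha) * St-1: recent observations get the most weight
    s = np.empty(len(y), dtype=float)
    s[0] = y[0]                                # one common initialisation choice
    for t in range(1, len(y)):
        s[t] = alpha * y[t] + (1 - alpha) * s[t - 1]
    return s

y = np.random.default_rng(4).normal(size=100).cumsum()   # placeholder series
s = simple_exp_smooth(y, alpha=0.3)
print(s[-1])                                   # the forecast for all future periods is the last smoothed value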
40
Q

See

A

Exponential smoothing in notes

41
Q

3 reasons single/simple exponential smoothing does not work well with financial data?

A

1) There is little structure to smooth
2) It cannot allow for seasonality
3) Forecasts do not converge on the long-term mean as the horizon s tends to infinity

42
Q

How would we modify single exponential smoothing to allow for: a) trends? or b) seasonality?

A

Trend - Holt's method
Seasonality - Winter's method

43
Q

2 advantages of exponential smoothing?

A

Simple to use
Easy to update the model if a new realisation becomes available (finish)