17. Time series analysis Flashcards

1
Q

Define stationarity

A
  • A process is stationary if its statistical properties do not change over time
  • Stationarity determines the extent to which past data can be used to model the future
2
Q

List the 4 types of stationarity

A
  • Strict stationarity
  • Weak stationarity
  • Covariance stationarity
  • Trend stationarity
3
Q

Outline the 4 types of stationarity

A
  • Strict stationarity - Statistical characteristics do not change over time
  • Weak stationarity - Weakly stationary of order n if the joint moments of subsets of the process are equal and finite up to the nth moment
  • Covariance stationarity - Mean and variance are constant and the covariance depends only on the lag (weak stationarity of order 2)
  • Trend stationarity - Observations oscillate randomly around a trend line a + bt that is a function of time only
4
Q

What is the Dickey-Fuller test

A
  • Tests the null hypothesis that a unit root is present
  • Used to distinguish between trend stationarity and difference stationarity
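
The idea behind the test can be sketched with plain numpy. This is a minimal illustration only - the no-drift regression, the simulated series, and the seed are all assumptions, and a real analysis would compare the statistic against proper Dickey-Fuller critical values (e.g. via `statsmodels.tsa.stattools.adfuller`):

```python
import numpy as np

def df_stat(y):
    """Toy Dickey-Fuller statistic: regress the difference on the lagged level
    (no drift, no trend) and return the t-statistic on the unit-root coefficient."""
    dy, ylag = np.diff(y), y[:-1]
    gamma = (ylag @ dy) / (ylag @ ylag)           # OLS slope of dy on y_{t-1}
    resid = dy - gamma * ylag
    se = np.sqrt((resid @ resid) / (len(dy) - 1) / (ylag @ ylag))
    return gamma / se

rng = np.random.default_rng(0)
eps = rng.standard_normal(500)

random_walk = np.cumsum(eps)                      # difference-stationary: has a unit root
ar = np.zeros(500)                                # stationary AR(1) with phi = 0.5
for t in range(1, 500):
    ar[t] = 0.5 * ar[t - 1] + eps[t]

print(df_stat(random_walk))   # near zero: cannot reject a unit root
print(df_stat(ar))            # strongly negative: reject a unit root
```

A strongly negative statistic points away from a unit root (so the series need not be differenced); a value near zero is consistent with difference stationarity.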
5
Q

Describe an AR(p) process

A
  • AR(p) is a process where each observation is a linear combination of the p previous values plus a random error term
  • In the ACF plot, you would see a gradual decrease in autocorrelation as the lag increases, indicating a long memory in the time series. (tails off)
  • In the PACF plot, you would see significant values only at lags up to the order of the autoregressive model (p), beyond which it drops to zero or becomes insignificant. (cuts off)
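
Both behaviours can be seen in a short numpy simulation. The AR(1) coefficient, sample size, and seed below are illustrative assumptions, not part of the card:

```python
import numpy as np

def sample_acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return (x[:-lag] @ x[lag:]) / (x @ x)

rng = np.random.default_rng(1)
n, phi = 5000, 0.8
x = np.zeros(n)
for t in range(1, n):                 # AR(1): x_t = 0.8 x_{t-1} + e_t
    x[t] = phi * x[t - 1] + rng.standard_normal()

# ACF decays geometrically (roughly phi^k), i.e. it tails off
print([round(sample_acf(x, k), 2) for k in (1, 2, 5)])

# PACF at lag 2: the coefficient on x_{t-2} when regressing x_t
# on both x_{t-1} and x_{t-2}; for an AR(1) it should be ~0 (cuts off after p = 1)
X = np.column_stack([x[1:-1], x[:-2]])
beta = np.linalg.lstsq(X, x[2:], rcond=None)[0]
print(round(beta[1], 2))
```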
6
Q

Describe a MA(q) process

A

  • MA(q) is a process where each observation is a linear combination of the q previous error terms plus a current random error term
  • In the ACF plot, you would see significant values only at lags up to the order of the moving average model (q), beyond which it drops to zero or becomes insignificant. (cuts off)
  • In the PACF plot, you would see a gradual decrease in partial autocorrelation as the lag increases, indicating a long memory in the residuals. (tails off)
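The ACF cut-off is easy to see by simulation. The MA(1) coefficient and seed below are illustrative assumptions; for an MA(1), the theoretical ACF is theta/(1 + theta^2) at lag 1 and zero beyond:

```python
import numpy as np

def sample_acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return (x[:-lag] @ x[lag:]) / (x @ x)

rng = np.random.default_rng(2)
e = rng.standard_normal(5001)
theta = 0.7
x = e[1:] + theta * e[:-1]            # MA(1): x_t = e_t + 0.7 e_{t-1}

# ACF: nonzero at lag 1 (about 0.7 / 1.49 = 0.47), then cuts off to ~0
print(round(sample_acf(x, 1), 2), round(sample_acf(x, 2), 2))
```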

7
Q

Describe an ARMA(p,q) process

A
  • In ARMA(p,q), each observation is the sum of an AR(p) part and an MA(q) part
  • In both the ACF and PACF plots, you would see a gradual decrease in autocorrelation and partial autocorrelation as the lag increases, indicating a long memory in the time series and residuals. (tails off)
  • There may be kinks or changes in behaviour at lags corresponding to the order of the autoregressive (p) and moving average (q) components of the model.
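
The contrast with a pure MA process can be checked by simulation. The ARMA(1,1) parameters and seed below are illustrative assumptions:

```python
import numpy as np

def sample_acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return (x[:-lag] @ x[lag:]) / (x @ x)

rng = np.random.default_rng(3)
n, phi, theta = 5000, 0.7, 0.4
x, e = np.zeros(n), rng.standard_normal(n)
for t in range(1, n):                 # ARMA(1,1): AR part plus MA part
    x[t] = phi * x[t - 1] + e[t] + theta * e[t - 1]

# Unlike an MA(1), the ACF does not cut off after lag 1 - it keeps tailing off
print([round(sample_acf(x, k), 2) for k in (1, 2, 3)])
```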
8
Q

Describe an ARIMA(p,d,q) process

A
  • ARIMA(p,d,q) is a process whose dth difference is an ARMA(p,q) process
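The simplest case makes the definition concrete: an ARIMA(0,1,0) process is a random walk, whose first difference is white noise. A minimal numpy sketch (seed and sample size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
# ARIMA(0,1,0): the level is a random walk (non-stationary),
# but its first difference (d = 1) is white noise, i.e. ARMA(0,0)
y = np.cumsum(rng.standard_normal(2000))
dy = np.diff(y)

# The level wanders widely; the difference has a stable spread
print(round(y.std(), 1), round(dy.std(), 2))

# Lag-1 autocorrelation of the differenced series is ~0: no structure left
d = dy - dy.mean()
print(round((d[:-1] @ d[1:]) / (d @ d), 2))
```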
9
Q

Explain how ARIMA models can be fitted

A

  • Correlogram - plot of the ACF; can also plot the PACF
  • Observe the behaviours (cutting off or decaying) of these plots for various degrees of integration, d = 0, 1, 2, …, giving an indication of the type of ARIMA(p,d,q) model to fit
  • Compare fit using AIC, BIC, or likelihood ratio tests
  • Test for white noise features: use turning point / portmanteau tests on the calculated residuals ε̂_t to check whether they exhibit the features of white noise
  • Serial correlation of the residuals can be tested using the Durbin-Watson statistic
  • Once fitted, the model can be used for prediction
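
The fitting-and-checking loop can be sketched with numpy alone. This is a simplified illustration - OLS fitting of an AR(p), a crude AIC, and a Durbin-Watson check; the simulated model, parameters, and seed are assumptions, and in practice one would use a library such as statsmodels:

```python
import numpy as np

rng = np.random.default_rng(5)
n, phi = 2000, 0.6
x = np.zeros(n)
for t in range(1, n):                 # simulate the "true" model: AR(1), phi = 0.6
    x[t] = phi * x[t - 1] + rng.standard_normal()

def fit_ar(x, p):
    """OLS fit of an AR(p); returns coefficients and AIC = n log(RSS/n) + 2p."""
    y = x[p:]
    X = np.column_stack([x[p - k : len(x) - k] for k in range(1, p + 1)])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    rss = np.sum((y - X @ beta) ** 2)
    return beta, len(y) * np.log(rss / len(y)) + 2 * p

b1, aic1 = fit_ar(x, 1)               # candidate models: AR(1) vs AR(2)
b2, aic2 = fit_ar(x, 2)
print(round(b1[0], 2), round(aic1, 1), round(aic2, 1))

# Residual diagnostics: Durbin-Watson statistic should be near 2
# if the residuals show no serial correlation (white-noise-like)
r = x[1:] - b1[0] * x[:-1]
dw = np.sum(np.diff(r) ** 2) / np.sum(r ** 2)
print(round(dw, 2))
```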

10
Q

Describe an ARCH process

A

(Autoregressive conditional heteroscedastic):
  • Constructed so that the variance changes over time
  • Volatility clustering: a large change in previous values of the process is often followed by a period of high volatility
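
An ARCH(1) process can be simulated in a few lines of numpy. The parameters and seed below are illustrative assumptions; the point is that returns themselves are uncorrelated while their squares are not:

```python
import numpy as np

rng = np.random.default_rng(6)
n, omega, alpha = 5000, 0.5, 0.5
r = np.zeros(n)
for t in range(1, n):
    sigma2 = omega + alpha * r[t - 1] ** 2       # variance driven by the last shock
    r[t] = np.sqrt(sigma2) * rng.standard_normal()

def acf1(x):
    """Sample lag-1 autocorrelation."""
    x = x - x.mean()
    return (x[:-1] @ x[1:]) / (x @ x)

# Returns: ~0 autocorrelation. Squared returns: clearly positive -
# a big shock raises the next period's variance (volatility clustering).
print(round(acf1(r), 2), round(acf1(r ** 2), 2))
```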

11
Q

Describe a GARCH process

A

(Generalised autoregressive conditional heteroscedastic):
  • Constructed so that the volatility depends on previous volatility as well as previous values of the process
  • Periods of high volatility usually last a long time
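
A GARCH(1,1) simulation makes the persistence visible. The parameters (omega = 0.1, alpha = 0.1, beta = 0.8) and seed are illustrative assumptions; the beta term is what carries volatility forward from period to period:

```python
import numpy as np

rng = np.random.default_rng(7)
n, omega, alpha, beta = 5000, 0.1, 0.1, 0.8
r = np.zeros(n)
sigma2 = np.full(n, omega / (1 - alpha - beta))  # start at the unconditional variance
for t in range(1, n):
    # variance feeds on both the previous variance (beta) and the previous
    # shock (alpha), so high-volatility periods persist
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return (x[:-lag] @ x[lag:]) / (x @ x)

# Returns are uncorrelated, but squared returns remain positively
# correlated - the signature of persistent volatility
print(round(acf(r, 1), 2), round(acf(r ** 2, 1), 2))
```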
