Term 2 H1 Time Series Flashcards

1
Q

What is the contemporaneous effect in a time series model?

A

This is the derivative dyt/dxt, the effect of xt on yt in the same period; it is also known as the period-0 (impact) effect.

2
Q

When dealing with a time series data set covering the years 2010-2020, how many observations does it contain?

A

2020 − 2010 = 10; add 1 because the range is inclusive, so there are 11 observations.

3
Q

How could you calculate the implied total effect of the independent variable when the lagged dependent variable is on the RHS?

A

It is made up of a direct and an indirect effect.

The direct effect is the coefficient on the independent variable in that period.
The indirect effect works through the coefficient on the lagged dependent variable y.

4
Q

How do you find the indirect effect?

A

For yt+1 it is typically the effect from the previous period multiplied by the coefficient attached to the lagged y in this period.
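A worked sketch of where the direct and indirect effects come from, assuming a hypothetical dynamic model with one lag of x and a lagged dependent variable (not a model given on the cards):

```latex
% Hypothetical model: y_t = \alpha + \beta_0 x_t + \beta_1 x_{t-1} + \rho y_{t-1} + \varepsilon_t
\begin{align*}
\frac{\partial y_t}{\partial x_t}     &= \beta_0
  && \text{(contemporaneous / period-0 effect)} \\
\frac{\partial y_{t+1}}{\partial x_t} &= \beta_1 + \rho\,\beta_0
  && \text{(direct effect } \beta_1 \text{ plus indirect effect } \rho\beta_0\text{)} \\
\frac{\partial y_{t+2}}{\partial x_t} &= \rho\,(\beta_1 + \rho\,\beta_0)
  && \text{(the previous effect scaled by } \rho\text{)}
\end{align*}
```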

5
Q

What is a condition for the indirect effect of a system?

A

The coefficient β on the lagged dependent variable must be less than 1 in absolute value so the effect can eventually dissipate from the system.

6
Q

What does the coefficient on the lagged dependent variable mean?

A

The coefficient on the lagged dependent variable is the persistence (how much of last period's value carries over).

7
Q

How do you calculate the long-run response, and what does this give you?

A
  1. In the long run all variables settle at their steady-state values x*, y*, etc.
  2. Substitute these into the original equation.
  3. Collect common terms and make y* the subject.
  4. The resulting coefficients are the long-run multipliers.
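A short derivation of the long-run multiplier, under the same hypothetical model as in the earlier sketch:

```latex
% Set y_t = y_{t-1} = y^* and x_t = x_{t-1} = x^* in
% y_t = \alpha + \beta_0 x_t + \beta_1 x_{t-1} + \rho y_{t-1} (ignoring the error):
\begin{align*}
y^* &= \alpha + \beta_0 x^* + \beta_1 x^* + \rho y^* \\
(1-\rho)\,y^* &= \alpha + (\beta_0 + \beta_1)\,x^* \\
y^* &= \frac{\alpha}{1-\rho} + \frac{\beta_0 + \beta_1}{1-\rho}\,x^*
\end{align*}
% The long-run multiplier of x is (\beta_0 + \beta_1)/(1-\rho), which requires |\rho| < 1.
```
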
8
Q

What is the issue with OLS estimates in time series models?

A

The lagged dependent variable on the RHS is correlated with the error term, so the OLS estimator is biased.

9
Q

When you have serial correlation in a time series model, what is the way to make the OLS estimates consistent again?

A

You add in a lagged variable for every variable in the model

10
Q

How do you fix the model in the presence of serial correlation in order to make the OLS estimates consistent?

1. What is special about these added lags?

A

You add a lag of every variable in the model for each extra lagged error term.

  1. The added lags must be unique.
11
Q

1. What is the ACF and what does it do?

2. What can the ACF be computed from?

A

1. The ACF (autocorrelation function) allows us to extract more information from a time series model.

It is the correlation ρ(zt, zt−j).

2. zt can be any series, or residuals/derived series.
12
Q

What is the formula for the autocorrelation function (in its simplest form)?

How and why do you get to this simplest form?

A

γj = cov(zt, zt−j)
γ0 = V(zt)

ρ(zt, zt−j) = γj / γ0

You get to this simplest form because of the conditions of stationarity: the variance and the autocovariances depend only on the lag j, not on t.
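A minimal sketch of computing the sample ACF (assuming NumPy is available; sample_acf is a hypothetical helper name, not from the cards):

```python
import numpy as np

def sample_acf(z, max_lag=20):
    """Sample autocorrelations rho_j = gamma_j / gamma_0 for j = 0..max_lag."""
    z = np.asarray(z, dtype=float)
    zbar = z.mean()
    gamma0 = np.mean((z - zbar) ** 2)  # sample variance, gamma_0
    rhos = []
    for j in range(max_lag + 1):
        # sample autocovariance gamma_j = mean of (z_t - zbar)(z_{t-j} - zbar)
        gamma_j = np.mean((z[j:] - zbar) * (z[: len(z) - j] - zbar))
        rhos.append(gamma_j / gamma0)
    return np.array(rhos)
```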

13
Q

1. What does it mean if zt is stationary?

2. What is the condition of weak dependence?

A

1. E(zt) = µ for all time
V(zt) = σ^2 for all time
cov(zt, zt−h) = γh (depends only on h)

2. ρ(zt, zt−j) = ρj tends to 0 as j gets bigger

ρ0 = ρ(zt, zt) = 1

14
Q

How do you interpret the autocorrelations ρj of an AR process?

A

ρj tells you how much of the shock is remembered in period j.

Plotted, it is the graphical memory of the series.

15
Q

What is particular about the white noise process and how do you set it up?

A
  • The white noise process cannot be forecast.

Set-up:

zt = εt

E(εt) = 0
V(εt) = σ^2
cov(εt, εt−j) = 0 for j ≠ 0
εt ~ N(0, σ^2)
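A quick simulation sketch of this set-up (assuming NumPy, and reusing the sample_acf helper sketched earlier):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0

# White noise: z_t = eps_t with eps_t ~ N(0, sigma^2), uncorrelated across periods
z = rng.normal(loc=0.0, scale=sigma, size=1000)

rho = sample_acf(z, max_lag=5)   # reuses the sample_acf sketch above
print(rho.round(2))              # rho_0 = 1, rho_j for j >= 1 roughly 0
```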

16
Q

What does the white noise process look like graphically (in its ACF)?

A

100% of the shock is in period 0.

If you shock the system today, 100% of the effect happens today and it then immediately dissipates from the system.

17
Q

What does the ACF of an AR(1) look like if φ is less than 0?

A

A zig-zag (alternating-sign) decay from left to right, all the way to zero.

17
Q

What does the ACF of an AR(1) look like graphically if φ > 0?

A

Smooth decay from top left to bottom right

18
Q

How does the AR(1) memory mechanism work with φ = 0.5?

A

The 1st period remembers φ (= 0.5) of the shock.
The 2nd period remembers φ² (= 0.25) of the shock.
It keeps remembering smaller and smaller fractions all the way down to 0.

19
Q

What is the relationship between φ and how much is remembered?

A
  • The larger φ is, the more persistent the shock will be and the longer it will take to dissipate.
20
Q

How do you set up the AR(1)?

A

zt = φ zt−1 + εt
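A minimal simulation sketch of this set-up (assuming NumPy and reusing the sample_acf helper from earlier; simulate_ar1 is a hypothetical name):

```python
import numpy as np

def simulate_ar1(phi, n=1000, sigma=1.0, seed=0):
    """Simulate z_t = phi * z_{t-1} + eps_t with eps_t ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, size=n)
    z = np.zeros(n)
    for t in range(1, n):
        z[t] = phi * z[t - 1] + eps[t]
    return z

# For |phi| < 1 the theoretical ACF is rho_j = phi**j:
#   phi = 0.5  -> smooth decay   1, 0.5, 0.25, 0.125, ...
#   phi = -0.5 -> zig-zag decay  1, -0.5, 0.25, -0.125, ...
z = simulate_ar1(phi=0.5)
print(sample_acf(z, max_lag=4).round(2))
```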

21
Q

What is the trend in data?

A

Trend refers to the general upward or downward movement of the data over time.

22
Q

What is seasonality?

A

Seasonality = fluctuations in the data that recur at regular time intervals (e.g. monthly or quarterly patterns).

23
Q

What are dynamic models?

A

These are models that include lagged variables, capturing variables that respond sluggishly over time.

24
Q

How do you interpret the coefficients of dynamic models?

A

There is a direct and an indirect effect.

For the current period, dyt/dxt is just β1.

To find the next coefficient you roll forward to the next period and take dyt+1/dxt.

This gives a direct effect from x and then an indirect effect through the lagged y.

25
Q

How do you work out the total change?

A

You add up the partial effects across all the periods needed.

25
Q

How do you work out the long term effect?

A

For the x variables you sub in x*.
For the y variables you sub in y*.

Then you collect like terms and solve for y*.
This gives the long-run multiplier.
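A small numeric sketch with hypothetical coefficients (β0 = 0.4, β1 = 0.2, ρ = 0.5), checking that the period-by-period partial effects sum to the same long-run multiplier as the substitution method gives:

```python
# Hypothetical model: y_t = beta0*x_t + beta1*x_{t-1} + rho*y_{t-1}
beta0, beta1, rho = 0.4, 0.2, 0.5

multipliers = [beta0]                    # period 0: dy_t/dx_t = beta0
multipliers.append(beta1 + rho * beta0)  # period 1: direct beta1 + indirect rho*beta0
for _ in range(2, 20):
    # each later period: rho times the previous period's effect
    multipliers.append(rho * multipliers[-1])

total_effect = sum(multipliers)               # cumulative (total) change
long_run = (beta0 + beta1) / (1 - rho)        # long-run multiplier from substitution
print(round(total_effect, 3), round(long_run, 3))   # both approximately 1.2
```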

27
Q

What are the usual CLRM assumptions?
How do these change when you have a lagged dependent variable?

A

In the CLRM we assume:

1. That the errors εt are not correlated with the regressors.
However, this is not sensible here, as cov(yt, εt) ≠ 0 and yt appears as a regressor in the next period.
This changes to the weaker assumption of contemporaneous exogeneity (which implies the errors are uncorrelated with the past dependent variables).

This yields a biased estimator.

2. We need to replace the assumption of random sampling, as in time series the observations are related over time.

To address this we replace the assumption of random sampling with stationarity and weak dependence. Stationarity requires:

  1. E(yt) = µ (mean is constant at all points in time)
  2. V(yt) = σ^2 (variance is constant at all points in time)
  3. cov(yt, yt−h) = γh (covariance depends only on h and not on t)
28
Q

What does weak dependence require?

A

Weak Dependence
This requires:
cov(yt, yt−h) → 0 as h → ∞.

29
Q

What are the implications of using weak dependence and stationarity?

A
  1. Our OLS estimator is not unbiased, but it is consistent (implying that as the number of
    observations, T, gets large our OLS estimator gets “close” to the true coefficient) (see
    Appendix A for a demonstration of this idea).
  2. The distribution of the OLS estimator is asymptotically normally distributed (implying
    that as the number of observations, T, gets large the distribution of the OLS estimator
    becomes like a normal distribution). This implies that for hypothesis testing of a single
    restriction you should use the normal distribution, NOT the t-distribution, and for
    testing multiple restrictions you should use the χ²-distribution rather than the
    F-distribution (see Appendix A for a demonstration of this idea).
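A minimal sketch of the single-restriction testing implication, with a hypothetical estimate and standard error (assuming SciPy is available):

```python
from scipy.stats import norm

# Hypothetical OLS output: estimated coefficient and its standard error
beta_hat, se = 0.85, 0.06

# Test H0: beta = 1 with a z-statistic, comparing against the standard normal
# (not the t-distribution) because the estimator is only asymptotically normal.
z = (beta_hat - 1.0) / se
p_value = 2 * (1 - norm.cdf(abs(z)))
print(round(z, 2), round(p_value, 4))   # z = -2.5, p approx 0.012: reject at the 5% level
```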