Week 4 - Classical Assumption Violations Flashcards

1
Q

Classical assumptions are unrealistic

What do we remove from CLRA (2)

A

IN TIME SERIES: We now say errors are autocorrelated (violates CLRA4)

IN CROSS SECTION: We now find errors are heteroscedastic (σ² varies) (violates CLRA5)

2
Q

As well as autocorrelated errors, we should consider autocorrelated variables

A
3
Q

What does it mean if a variable is autocorrelated

(in time series models)

A

It is correlated with itself at different points in time

4
Q

Autocorrelated variables expression:

A

Xt = pXt-1 + εt

Variable X at time t is a function of its own value in the previous period.

p is a parameter called the autocorrelation coefficient.
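The expression above can be simulated directly. This is a minimal sketch (not from the notes; p = 0.9 and n = 5000 are assumed for illustration):

```python
import numpy as np

# Simulate the AR(1) process Xt = p*Xt-1 + eps_t
def simulate_ar1(p, n, seed=0):
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n)        # white-noise shocks
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = p * x[t - 1] + eps[t]    # today's X depends on yesterday's X
    return x

x = simulate_ar1(0.9, 5000)
lag1_corr = np.corrcoef(x[:-1], x[1:])[0, 1]
print(round(lag1_corr, 2))  # sample lag-1 correlation, close to p = 0.9
```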

5
Q

Example of a regression model of current variables affecting future dependent variable

A

Yt+1 = β₀ + β₁Xt+1 + β₂Zt + εt+1

Here we can see Zt (current Z) influences Yt+1 (future Y)

6
Q

What happens when we include lags of variables in our models (2)

A

We lose observations, so the sample size n shrinks. (Smaller sample = less precise estimates.)

More lags = more parameters (β) to estimate, so k increases too, which lowers the degrees of freedom n-(k+1), making estimates less precise.

So both lower precision!

7
Q

Covariance formula for:
Xt and its first lag Xt-1 expression
(Pg3 blue highlighter)

B) What happens to correlation as j gets bigger?

A

cov(Xt, Xt-1) = pσ²x = pσ²ε/(1-p²)

This is the variance of Xt, σ²ε/(1-p²), multiplied by p.

For cov(Xt, Xt-j), the p in the formula becomes p^j (j = the number of lags); above it is p¹ since we are only looking at Xt and its first lag Xt-1!

B)
As j gets larger we are looking further into the past, so the correlation (p^j) gets smaller, which makes sense!

(a series is likely to correlate with its near past rather than with values from ages ago)
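The decay of correlation with j can be checked numerically. A sketch, with p = 0.8 and the sample size assumed for illustration:

```python
import numpy as np

# For an AR(1) series, the sample correlation between Xt and Xt-j
# should shrink roughly like p**j
p, n = 0.8, 20000
rng = np.random.default_rng(1)
x = np.zeros(n)
for t in range(1, n):
    x[t] = p * x[t - 1] + rng.standard_normal()

sample_corrs = {j: np.corrcoef(x[:-j], x[j:])[0, 1] for j in (1, 2, 5)}
for j, c in sample_corrs.items():
    print(j, round(c, 2), round(p**j, 2))  # sample vs theoretical p^j
```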

8
Q

What do plots look like for different values of p
(p= autocorrelation coefficient) (pg4)

A

p=0: plot is random (no pattern)

p=0.9: strong positive autocorrelation, so smoother runs of positive and negative values

p=-0.9: strong negative autocorrelation, so the series keeps flipping sign: a spikier plot

9
Q

This was autocorrelation of variables, which does not violate the classical assumptions. What does violate them?

A

Autocorrelated ERRORS

10
Q

So errors are autocorrelated (each error is correlated with the errors in other time periods)

Expression

A

εt = pεt-1 + ut

p is the autocorrelation coefficient
ut is an error term assumed to satisfy the classical assumptions (important)

(Remember the autocorrelated variable expression is
Xt = pXt-1 + εt)

11
Q

So using this expression, when are classical assumptions violated?

A

The classical assumptions are violated unless p=0

(Since if p=0, only ut remains, and ut is an error term that does satisfy the classical assumptions)

12
Q

Sources of error autocorrelation (2)

A

Omission of explanatory variables - if the omitted variable is autocorrelated, the error term is autocorrelated. (Known as false autocorrelation, since it is not due to the error itself!)

Dynamic structure

13
Q

So omitting autocorrelated variables can mean errors are autocorrelated.

Why might they be omitted in the first place? (2)

A

We do not realise their significance.

They are not measurable, e.g. ability, so the error term reflects them instead.

14
Q

If omitted, the variable is contained in ε. What does this look like as an example?

A

True model:
Yt = β₀ + β₁X₁ + β₂X₂ + ωt, where ω is the disturbance.

If, say, X₁ is unobservable, then the model we estimate is
Yt = β₀ + β₂X₂ + εt

The omitted X₁ is soaked up by the error, meaning that εt = β₁X₁ + ωt

15
Q

2nd source of autocorrelated errors:
Dynamic structure

A

It is not possible to model every factor influencing the dependent variable, so those left-out influences sit in the error term and carry over from one period to the next.

16
Q

Consequences of autocorrelated errors, when using OLS as it is.

A

OLS estimators are still unbiased (TR1): E(β̂₁) = β₁

The variance formulas are incorrect.
(Remember TR2, where we used cov=0 to remove all the other covariances. Now covariance terms are present…)
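The unbiasedness claim can be checked with a small Monte Carlo. A hedged sketch; all parameter values (p = 0.7, β₁ = 2, etc.) are assumed for illustration:

```python
import numpy as np

# With AR(1) errors eps_t = p*eps_t-1 + u_t, the OLS slope estimate
# still averages out to the true beta1 across many replications
rng = np.random.default_rng(2)
beta0, beta1, p, n, reps = 1.0, 2.0, 0.7, 200, 500
x = rng.standard_normal(n)              # fixed regressor across replications
slopes = []
for _ in range(reps):
    eps = np.zeros(n)
    for t in range(1, n):
        eps[t] = p * eps[t - 1] + rng.standard_normal()
    y = beta0 + beta1 * x + eps
    slopes.append(np.polyfit(x, y, 1)[0])   # OLS slope estimate
mean_slope = np.mean(slopes)
print(round(mean_slope, 2))  # close to beta1 = 2.0
```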

17
Q

Even if we use an adjusted variance formula, OLS is still not the best estimator, as it no longer has the smallest variance.

What is the best?

A

Generalised least squares (GLS)