GLS Flashcards
(12 cards)
If εi is spherical, then it is …, and use …
homoskedastic and serially uncorrelated, then use the Standard Linear Model (SLM)
If εi is nonspherical, then it is …, and use …
heteroskedastic and/or autocorrelated, then use the Generalized Linear Model (GLM)
GLM assumptions:
1: y=Xβ+ε (Linearity)
2: E(ε|X)=0 (exogeneity, regressors contain no information on the deviation of Yi from its conditional expectation)
3: Var(εi|X)=σ^2ωi (heteroskedasticity); Note: since E(ε|X)=0, the variance-covariance matrix is Var(ε|X)=E(εε’|X)=σ^2Ω
4: rank(X)=rank(X’)=rank(XX’)=rank(X’X)=k
5: ε|X~N(0,σ^2Ω) (normality, consistent with assumption 3)
6: {(Yi,Xi):i=1,…,n} are independent and identically distributed
Properties of βOLS^ in GLM:
1: β^ is still unbiased if E(ε|X)=0
2: β^ is no longer the best linear unbiased estimator (BLUE) and is not asymptotically efficient
3: β^ is multivariate normal
4: β^ is consistent
5: β^ is asymptotically normal
Heteroskedasticity-consistent estimator (HCE).
HCE=(X’X)^-1(Σ(εi^)^2 xixi’)(X’X)^-1, where εi^ are the OLS residuals; it estimates Var(βOLS^|X) (see the code sketch below)
Interpretation:
Under homoskedasticity, all diagonal elements of the error variance-covariance matrix are equal (σ^2)
Under heteroskedasticity, the diagonal elements differ across observations, so the usual OLS variance formula is invalid and the sandwich form above is used instead
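A minimal NumPy sketch of the sandwich formula above; the function name hc0_cov and the variables X, y are illustrative placeholders, not part of the cards.

```python
import numpy as np

def hc0_cov(X, y):
    """OLS beta_hat and White's HC0 sandwich estimate of Var(beta_hat)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y
    resid = y - X @ beta_hat                   # OLS residuals eps_hat_i
    meat = (X * resid[:, None] ** 2).T @ X     # sum_i eps_hat_i^2 * x_i x_i'
    return beta_hat, XtX_inv @ meat @ XtX_inv  # (X'X)^-1 meat (X'X)^-1
```

The square roots of the diagonal of the returned covariance matrix are the heteroskedasticity-robust standard errors.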
When do we use the Generalized Least Squares (GLS) estimator?
When the errors are nonspherical (εi is heteroskedastic and/or autocorrelated) and Ω is known (up to the scale factor σ^2)
GLS estimator form:
βGLS^=(X’P’PX)^-1X’P’Py, where P is chosen so that P’P=Ω^-1 (i.e., OLS on the transformed data PX and Py)
Equivalently, if Ω is known, βGLS^=(X’Ω^-1X)^-1X’Ω^-1y
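A minimal NumPy sketch of both forms, assuming Ω is known and positive definite; the names gls, X, y, Omega are illustrative, and P is taken as a Cholesky-based factor satisfying P'P=Ω^-1.

```python
import numpy as np

def gls(X, y, Omega):
    Omega_inv = np.linalg.inv(Omega)
    # Closed form: (X' Omega^-1 X)^-1 X' Omega^-1 y
    beta_direct = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)
    # Equivalent: OLS on the transformed data P X, P y with P'P = Omega^-1
    P = np.linalg.cholesky(Omega_inv).T
    beta_transformed, *_ = np.linalg.lstsq(P @ X, P @ y, rcond=None)
    return beta_direct, beta_transformed
```

Both return values coincide (up to numerical error), which illustrates why the two formulas on the card are the same estimator.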
Properties of GLS estimator:
1: GLS is unbiased if E(ε|X)=0
2: GLS is the best linear unbiased estimator (BLUE)
3: GLS is consistent
4: GLS is normal
5: GLS is asymptotically normal
When do we construct βFGLS^ and what is its form? (Feasible GLS)
If Ω depends on unknown parameters, we cannot compute βGLS^. However, if Ω depends on only a small set of unknown parameters, we can estimate them and construct Ω^:
βFGLS^=(X’(Ω^)^-1X)^-1X’(Ω^)^-1y
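A minimal two-step FGLS sketch. The skedastic model Var(εi)=σ^2·exp(zi'α) and the names fgls, X, y, Z are illustrative assumptions, not from the cards; the point is the general recipe: estimate the few parameters of Ω from OLS residuals, then plug Ω^ into the GLS formula.

```python
import numpy as np

def fgls(X, y, Z):
    # Step 1: OLS residuals
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_ols
    # Step 2: estimate the skedastic parameters by regressing log(resid^2) on Z
    alpha, *_ = np.linalg.lstsq(Z, np.log(resid ** 2), rcond=None)
    omega_hat = np.exp(Z @ alpha)        # estimated diagonal of Omega (up to scale)
    # Step 3: plug Omega_hat into the GLS formula (here a weighted least squares)
    Xw = X / omega_hat[:, None]          # rows of X scaled by 1/omega_hat_i
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)
```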
Properties of FGLS:
1: FGLS is consistent
2: FGLS is asymptotically efficient and normal, under regularity conditions
When do we use the White test?
When we suspect heteroskedasticity of unknown form; test H0: σi^2=σ^2 for all i (homoskedasticity) vs. H1: σi^2≠σ^2 for some i (heteroskedasticity)
How to perform the White test?
1: Compute β^ and ε^
2: square each term of vector ε^ and make this the dependent variable
3: Carry out an OLS regression of (εi^)^2 on the elements of xi, their squares, and their cross products. If xi contains a dummy variable, its square is the same dummy, so do not add the square in the test regression.
4: Compute LM=nR^2~χ_(J-1)^2, where J is the number of regressors in this auxiliary regression, including the column of ones. Reject H0 if LM exceeds the χ_(J-1)^2 critical value.
5: (Instead of step 4) Carry out an F-test of the joint significance of the auxiliary regressors; a code sketch of the LM version follows below
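A minimal NumPy/SciPy sketch of the LM version of the test, assuming the first column of X is the column of ones and X contains no dummy regressors; the names white_test, X, y are illustrative.

```python
import numpy as np
from scipy import stats

def white_test(X, y):
    n, k = X.shape
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid_sq = (y - X @ beta_ols) ** 2            # squared OLS residuals
    # Auxiliary regressors: constant, the x's, their squares and cross products
    cols = [np.ones(n)]
    for i in range(1, k):                         # skip the constant column itself
        cols.append(X[:, i])
        for j in range(i, k):
            cols.append(X[:, i] * X[:, j])
    Z = np.column_stack(cols)
    gamma, *_ = np.linalg.lstsq(Z, resid_sq, rcond=None)
    ssr = np.sum((resid_sq - Z @ gamma) ** 2)
    sst = np.sum((resid_sq - resid_sq.mean()) ** 2)
    r2 = 1.0 - ssr / sst
    J = Z.shape[1]                                # regressors incl. the column of ones
    lm = n * r2                                   # LM = n R^2 ~ chi2(J-1) under H0
    return lm, stats.chi2.sf(lm, J - 1)           # test statistic and p-value
```

Reject H0 (homoskedasticity) when the p-value is below the chosen significance level.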