The General Linear Model Flashcards

1
Q

Outline the General Linear Model

A

The General Linear Model is of the form y = Xβ + ε where ε ~ N(0, σ^2 I) and σ^2 > 0 is usually unknown

It is a GLM with normal response y ~ N(µ, σ^2), where µ = x'β, and thus with identity link g(µ) = µ
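The model above can be simulated directly. A minimal NumPy sketch (the design matrix, coefficients, and noise level are illustrative choices, not from the source): draw ε ~ N(0, σ^2 I), form y = Xβ + ε, and check that least squares recovers β up to sampling noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
sigma = 0.5  # illustrative noise standard deviation

# Design matrix: intercept column plus two covariates.
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, 2.0, -0.5])  # illustrative "true" coefficients

# y = X beta + eps, with eps ~ N(0, sigma^2 I)
y = X @ beta + sigma * rng.normal(size=n)

# A least-squares fit recovers beta approximately.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```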

2
Q

What happens to the IRLS if we fit the General Linear Model as a GLM?

A

The matrix of weights W does not depend on β, since g'(µ) = 1 and V(µ) = 1, so the IRLS update becomes β = (X'X)^-1 X'y

Hence it converges after one iteration, giving ^β = (X'X)^-1 X'y with E[^β] = β and Var{^β} = σ^2 (X'X)^-1
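The one-step convergence can be verified numerically. Below is a sketch of the IRLS loop specialised to the normal/identity case (the function name and simulated data are my own, for illustration): because g'(µ) = 1 the working response simplifies to z = y and the weights are constant, so every update is the same OLS solve, regardless of the starting value.

```python
import numpy as np

def irls_normal(X, y, beta0, n_iter):
    """IRLS for the normal/identity GLM. With g'(mu) = 1 and V(mu) = 1
    the weights are constant (W = I) and the working response z = y,
    so every update is the ordinary least-squares solution."""
    beta = beta0
    for _ in range(n_iter):
        mu = X @ beta            # eta = mu under the identity link
        z = mu + (y - mu)        # working response; with g'(mu) = 1 this is just y
        beta = np.linalg.solve(X.T @ X, X.T @ z)
    return beta

rng = np.random.default_rng(1)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([0.5, 1.5]) + rng.normal(size=n)

start = np.zeros(2)              # arbitrary starting value
one_step = irls_normal(X, y, start, n_iter=1)
many_steps = irls_normal(X, y, start, n_iter=5)
ols = np.linalg.solve(X.T @ X, X.T @ y)
```

One iteration and five iterations give identical answers, both equal to (X'X)^-1 X'y.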

3
Q

Describe the residuals for The General Linear Model

A

Our residuals are observed minus fitted, so e = y - ^y = (I - H)y, where H = X(X'X)^-1 X' is the hat matrix

With E[e] = 0 and Var{e} = σ^2 (I - H)
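A short NumPy check of these residual facts (simulated data is illustrative): form the hat matrix H, compute e = (I - H)y, and confirm that the residuals are orthogonal to the column space of X and that H is idempotent.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 40
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

H = X @ np.linalg.solve(X.T @ X, X.T)  # hat matrix: ^y = H y
e = (np.eye(n) - H) @ y                # residuals e = (I - H) y
```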

4
Q

How can we relate residuals, σ^2 and deviance?

A

^σ^2 = (1/n) y'(I-H)y is our MLE for σ^2, which differs from the unbiased estimator (1/(n-p)) y'(I-H)y

The residual sum of squares is e'e = SSE, so the scaled deviance is D* = SSE/σ^2

NB: SSE, the sum of squared errors, is also the sum of the e_i^2
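These identities are easy to verify numerically. A sketch with simulated data (the dimensions are illustrative): SSE computed as e'e equals y'(I - H)y, and the MLE divides by n while the unbiased estimator divides by n - p, so the MLE is always the smaller of the two.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 60, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, -1.0, 0.5]) + rng.normal(size=n)

H = X @ np.linalg.solve(X.T @ X, X.T)
e = y - H @ y
SSE = e @ e                       # e'e, also y'(I - H)y

sigma2_mle = SSE / n              # MLE of sigma^2 (biased downward)
sigma2_unbiased = SSE / (n - p)   # divides by n - p instead of n
```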

5
Q

Outline how to test whether a subset of the parameters is zero vs not necessarily zero.

A

We use H_o : β_1 = … = β_q = 0 vs H_1 : β_1, …, β_q not necessarily zero.

This is tested with the F statistic F = [(D_o - D_1)/q] / [D_1/(n-p)], compared against the F_(q, n-p) distribution
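A sketch of the nested-model F test (the simulated design and which columns are dropped under H_o are illustrative choices): for the normal GLM the deviance of each model is just its residual sum of squares, so D_o and D_1 come from fitting the null and full models and the F statistic follows directly.

```python
import numpy as np

def deviance(X, y):
    """Residual sum of squares of the least-squares fit
    (the deviance of a normal GLM, up to the sigma^2 scaling)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta
    return e @ e

rng = np.random.default_rng(4)
n = 80
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])   # p = 4
y = X @ np.array([1.0, 0.0, 0.0, 2.0]) + rng.normal(size=n)  # two slopes truly 0

q, p = 2, X.shape[1]
X0 = X[:, [0, 3]]               # null model: drop the q tested columns
D0, D1 = deviance(X0, y), deviance(X, y)

# Compare with the F_(q, n-p) distribution; a large F rejects H_o.
F = ((D0 - D1) / q) / (D1 / (n - p))
```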

6
Q

What is total variation and how is it composed? Also give the coefficient of determination

A

SSR is the regression (explained) sum of squares and SSE is the residual (error) sum of squares, so our total variation is SST = SSE + SSR

The coefficient of determination is R^2 = SSR/SST, with 0 ≤ R^2 ≤ 1
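The decomposition SST = SSE + SSR (which holds when the model includes an intercept) can be checked directly. A sketch with illustrative simulated data:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept included
y = X @ np.array([2.0, 1.0]) + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat = X @ beta_hat
ybar = y.mean()

SST = np.sum((y - ybar) ** 2)      # total variation
SSE = np.sum((y - y_hat) ** 2)     # residual (error) sum of squares
SSR = np.sum((y_hat - ybar) ** 2)  # regression (explained) sum of squares

R2 = SSR / SST                     # coefficient of determination
```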

7
Q

Describe one way analysis of variance (ANOVA)

A

Suppose we have a factor for an experimental parameter (e.g. temperature) with J levels (10°C, 20°C, …); then our model is y_jk = µ + α_j + ε_jk as usual, but where α_j is the extra effect of level j

We test H_o : α_1 = … = α_J = 0 (no effect) vs H_1 : α_j ≠ 0 for some j.

Tested by F ~ F_(J-1),(n-J) where F = [(D_o - D_1)/(J-1)] / [D_1/(n-J)]. This tends to be large if H_o is not true.
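A sketch of the one-way ANOVA F statistic on simulated balanced data (the number of levels, replicates, and level effects are illustrative): D_o is the deviance of the common-mean model and D_1 that of the one-mean-per-level model.

```python
import numpy as np

rng = np.random.default_rng(6)
J, K = 3, 20                         # J levels, K replicates each
n = J * K
alpha = np.array([0.0, 1.0, -1.0])   # illustrative level effects
groups = np.repeat(np.arange(J), K)
y = alpha[groups] + rng.normal(size=n)

# D_o: null model (common mean); D_1: one mean per level.
D0 = np.sum((y - y.mean()) ** 2)
level_means = np.array([y[groups == j].mean() for j in range(J)])
D1 = np.sum((y - level_means[groups]) ** 2)

F = ((D0 - D1) / (J - 1)) / (D1 / (n - J))  # ~ F_(J-1, n-J) under H_o
```

Because the simulated effects really differ between levels, F comes out well above 1 here.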

8
Q

Outline a Two-Way Analysis of Variance

A

We have a model of the form y_ijk = µ + α_i + β_j + γ_ij + ε_ijk, where α_i and β_j are the main effects of factors A and B and γ_ij is the interaction between them

We can test four different hypotheses

  • H_o : all the α, β, γ are zero vs H_1 : not all zero
  • For the other three, test whether each set of effects is zero in turn

Our F value is MS(.)/MSE, which we then compare against the appropriate F distribution.
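The sums of squares for a balanced two-way layout can be sketched as follows (level counts, effect sizes, and variable names are illustrative; the data here has no true interaction). Each F value is the mean square of an effect divided by MSE, and the four sums of squares add up to the total variation.

```python
import numpy as np

rng = np.random.default_rng(7)
I, J, K = 2, 3, 10                 # levels of A, levels of B, replicates
a = np.array([0.0, 1.0])           # illustrative main effects of A
b = np.array([0.0, 0.5, -0.5])     # illustrative main effects of B
y = a[:, None, None] + b[None, :, None] + rng.normal(size=(I, J, K))

grand = y.mean()
mean_A = y.mean(axis=(1, 2))       # means by level of A
mean_B = y.mean(axis=(0, 2))       # means by level of B
mean_AB = y.mean(axis=2)           # cell means

SSA = J * K * np.sum((mean_A - grand) ** 2)
SSB = I * K * np.sum((mean_B - grand) ** 2)
SSAB = K * np.sum((mean_AB - mean_A[:, None] - mean_B[None, :] + grand) ** 2)
SSE = np.sum((y - mean_AB[:, :, None]) ** 2)

MSE = SSE / (I * J * (K - 1))
F_A = (SSA / (I - 1)) / MSE                  # MS(A)/MSE vs F_(I-1), IJ(K-1)
F_AB = (SSAB / ((I - 1) * (J - 1))) / MSE    # interaction F
```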

9
Q

How do we test for Covariance?

A

te
