L9 - The Regression Model in Matrix Form Flashcards

1
Q

What does the value of β give you in multivariate regression?

A
  • Each element of β is a partial derivative: the marginal effect of that regressor on y, holding the other variables constant
2
Q

When is multivariate regression equivalent to bivariate regression?

A
  • When the bivariate regressions are run on purged data
  • So instead of running one multivariate regression with 3 parameters, you can run 3 bivariate regressions on purged data and get the same answer
  • This is proved by the Frisch-Waugh-Lovell theorem
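The equivalence can be checked numerically. Below is a hedged numpy sketch (the variable names and simulated data are illustrative, not from the lecture): the coefficient on x1 from the full regression matches the slope from a bivariate regression of purged y on purged x1.

```python
# Numerical check of the Frisch-Waugh-Lovell theorem (illustrative data).
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n) + 0.5 * x1          # correlated regressors
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(size=n)

# Full multivariate regression: y on [1, x1, x2]
X = np.column_stack([np.ones(n), x1, x2])
beta_full = np.linalg.lstsq(X, y, rcond=None)[0]

# FWL: purge x1 and y of the other regressors [1, x2], then run bivariate OLS
Z = np.column_stack([np.ones(n), x2])
P = Z @ np.linalg.inv(Z.T @ Z) @ Z.T        # projection onto the columns of Z
x1_res = x1 - P @ x1                        # purged x1
y_res = y - P @ y                           # purged y
beta_fwl = (x1_res @ y_res) / (x1_res @ x1_res)

assert np.isclose(beta_full[1], beta_fwl)   # same coefficient on x1
```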
3
Q

What is the least squares model of regression in matrix form?

A

y = Xβ + u

where y and u are N × 1 vectors and X is the N × k matrix of regressors.

The Gauss-Markov assumptions are:

E(u) = 0

E(uuᵀ) = σᵤ²I_N

X is fixed in repeated samples

The OLS estimator is:

β̂ = (XᵀX)⁻¹Xᵀy
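A minimal numpy check of the estimator formula, on simulated data (all names and numbers are illustrative; the first column of X is a constant):

```python
# Compute beta_hat = (X'X)^-1 X'y directly and compare with numpy's solver.
import numpy as np

rng = np.random.default_rng(1)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true = np.array([1.0, 0.5, -2.0])
y = X @ beta_true + rng.normal(size=n)

beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y      # the textbook formula
beta_ref = np.linalg.lstsq(X, y, rcond=None)[0]  # library reference
assert np.allclose(beta_hat, beta_ref)
```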

4
Q

What is the proof that OLS is unbiased under the Gauss-Markov assumptions?

A
  • Substitute y = Xβ + u into β̂ = (XᵀX)⁻¹Xᵀy to get β̂ = β + (XᵀX)⁻¹Xᵀu
  • Take expectations of both sides; β sits outside the expectation because it is a constant, and the expected value of a constant is itself
  • From the GM assumptions the X values are fixed, so they too come out of the expectation, leaving (XᵀX)⁻¹XᵀE(u); since E(u) = 0, we get E(β̂) = β, proving OLS is unbiased
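Unbiasedness can be illustrated by simulation: hold X fixed, redraw u many times, and average the resulting β̂'s. A sketch with illustrative numbers:

```python
# "Fixed in repeated samples": same X every draw, only u is redrawn.
import numpy as np

rng = np.random.default_rng(2)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # fixed across samples
beta = np.array([1.0, 2.0])

XtX_inv_Xt = np.linalg.inv(X.T @ X) @ X.T              # fixed, so precompute
draws = np.array([XtX_inv_Xt @ (X @ beta + rng.normal(size=n))
                  for _ in range(20_000)])

# The average of beta_hat over repeated samples is (approximately) beta.
assert np.allclose(draws.mean(axis=0), beta, atol=0.01)
```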
5
Q

How do you calculate the variance of the OLS estimator in matrix form?

A
  • Start from the definition Var(β̂) = E[(β̂ − E(β̂))(β̂ − E(β̂))ᵀ], the outer product of β̂ − E(β̂) with its own transpose
  • Substituting β̂ − β = (XᵀX)⁻¹Xᵀu and applying the GM assumptions gives Var(β̂) = σᵤ²(XᵀX)⁻¹
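Under the GM assumptions the result is Var(β̂) = σᵤ²(XᵀX)⁻¹, and a Monte Carlo sketch can confirm it (all numbers are illustrative):

```python
# Compare the theoretical v-cov of beta_hat with its simulated counterpart.
import numpy as np

rng = np.random.default_rng(3)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # fixed across samples
beta = np.array([1.0, 2.0])
sigma = 1.5

vcov_theory = sigma**2 * np.linalg.inv(X.T @ X)        # sigma_u^2 (X'X)^-1

XtX_inv_Xt = np.linalg.inv(X.T @ X) @ X.T
draws = np.array([XtX_inv_Xt @ (X @ beta + sigma * rng.normal(size=n))
                  for _ in range(20_000)])
vcov_mc = np.cov(draws, rowvar=False)                  # empirical v-cov

assert np.allclose(vcov_mc, vcov_theory, atol=0.01)
```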
6
Q

What is a variance-covariance matrix?

A
  • Variance is a measure of the variability or spread in a set of data. Mathematically, it is the average squared deviation from the mean score.
  • Covariance is a measure of the extent to which corresponding elements from two sets of ordered data move in the same direction.
  • Variance and covariance are often displayed together in a variance-covariance matrix (aka a covariance matrix). The variances appear along the diagonal and the covariances appear in the off-diagonal elements.
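A small numpy illustration of that layout, on made-up data:

```python
# Build a 2x2 variance-covariance matrix from two columns of data.
import numpy as np

data = np.array([[2.0, 8.0],
                 [4.0, 6.0],
                 [6.0, 4.0],
                 [8.0, 2.0]])
V = np.cov(data, rowvar=False)   # columns are variables, rows are observations

# Diagonal entries are the variances of each column...
assert np.isclose(V[0, 0], np.var(data[:, 0], ddof=1))
# ...and the off-diagonal covariance is symmetric.
assert np.isclose(V[0, 1], V[1, 0])
```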
7
Q

What is the distribution of the OLS estimator in matrix form?

A
  • Under the GM assumptions, if the errors are also normally distributed, β̂ ~ N(β, σᵤ²(XᵀX)⁻¹)
8
Q

How do you find the sample variance-covariance matrix of the OLS estimator?

A
  • Just as in the bivariate model, we don't know the actual value of the variance, so we can only use an estimator of it
  • σ̂ᵤ² is the residual sum of squares divided by its degrees of freedom, N − k
  • k is the number of parameters we estimate (or restrictions used); the estimated variance-covariance matrix is then σ̂ᵤ²(XᵀX)⁻¹
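Putting the pieces together in a numpy sketch (the data are simulated and the names illustrative):

```python
# Estimate sigma_u^2 = RSS/(N-k) and the sample v-cov matrix of beta_hat.
import numpy as np

rng = np.random.default_rng(4)
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ np.array([1.0, 0.5, -2.0]) + rng.normal(size=n)  # true sigma_u^2 = 1

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - k)            # RSS / degrees of freedom
vcov_hat = sigma2_hat * np.linalg.inv(X.T @ X)  # estimated v-cov of beta_hat
se = np.sqrt(np.diag(vcov_hat))                 # standard errors
```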
9
Q

How would you set up a hypothesis test of a multivariate regression model?

A
  • e.g. H₀: βⱼ = βⱼ,₀ against H₁: βⱼ ≠ βⱼ,₀, using the t statistic t = (β̂ⱼ − βⱼ,₀)/se(β̂ⱼ), which has a t distribution with N − k degrees of freedom under H₀ (given normal errors)
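A hedged numpy sketch of such a test, using the t statistic β̂ⱼ/se(β̂ⱼ) for H₀: βⱼ = 0 (the data are simulated; no p-value is computed, only the statistic itself):

```python
# t statistics for each coefficient in a multivariate regression.
import numpy as np

rng = np.random.default_rng(5)
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ np.array([1.0, 0.0, -2.0]) + rng.normal(size=n)  # middle beta is 0

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - k)
se = np.sqrt(sigma2_hat * np.diag(XtX_inv))

t_stats = beta_hat / se   # each tests H0: beta_j = 0, ~ t_{N-k} under H0
```

With a strong true effect (here −2), the corresponding |t| comfortably exceeds the usual critical values.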
10
Q

What does the Gauss-Markov Theorem tell you about the variance-covariance matrix of any linear unbiased estimator?

A

The Gauss-Markov Theorem shows that, under the GM assumptions, the variance-covariance matrix of any linear unbiased estimator exceeds the OLS v-cov matrix by a positive semi-definite matrix (i.e. Var(β̃) − Var(β̂) is positive semi-definite).

Hence OLS is BLUE.
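The theorem can be probed numerically: build another linear unbiased estimator (here a weighted least-squares estimator with arbitrary weights, chosen purely for illustration) and check that its v-cov matrix minus the OLS one has no negative eigenvalues:

```python
# Var(any linear unbiased estimator) - Var(OLS) is positive semi-definite.
import numpy as np

rng = np.random.default_rng(6)
n = 40
X = np.column_stack([np.ones(n), rng.normal(size=n)])
sigma2 = 1.0

V_ols = sigma2 * np.linalg.inv(X.T @ X)          # OLS v-cov matrix

# Another linear unbiased estimator: beta_tilde = A y with A X = I.
W = np.diag(rng.uniform(0.5, 2.0, size=n))       # arbitrary positive weights
A = np.linalg.inv(X.T @ W @ X) @ X.T @ W         # A @ X = I, so unbiased
V_wls = sigma2 * A @ A.T                         # Var(Ay) = sigma2 * A A'

D = V_wls - V_ols
assert np.all(np.linalg.eigvalsh(D) >= -1e-10)   # D is psd (up to rounding)
```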
