L10 - Testing Linear Restrictions Flashcards

1
Q

Why do we need to impose restrictions onto a regression model?

A
  • in econometrics, we try to test whether economic theory holds for a specific data set or not, e.g. for certain time periods or countries
  • The effect of these restrictions is measured by the decrease in how well the restricted model describes the data. In regression analysis this decrease in effectiveness is measured by the increase in the error sum of squares
2
Q

What is the general form of linear restriction?

A
  • the general form is Rβ = r, where R is a matrix of known constants that picks out the coefficients being restricted and r is the vector of values they are restricted to
  • in the case where all slope parameters are restricted to equal zero, R (on the slope coefficients) is the identity matrix and r is a vector of zeros, as illustrated below
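A minimal sketch of the Rβ = r form for the "all slopes equal zero" restriction (the model, names and numbers here are illustrative assumptions, not from the card):

```python
import numpy as np

# Hypothetical model: y = b0 + b1*x1 + b2*x2 + u, so beta = [b0, b1, b2]'
# H0: b1 = 0 and b2 = 0, i.e. two linear restrictions R @ beta = r.
R = np.array([[0.0, 1.0, 0.0],   # row picking out b1
              [0.0, 0.0, 1.0]])  # row picking out b2
r = np.zeros(2)                  # the restricted values: both slopes zero

beta_null = np.array([1.5, 0.0, 0.0])  # a parameter vector satisfying H0
print(np.allclose(R @ beta_null, r))   # True: the restriction holds
```

Restricted to the slope coefficients alone, R is exactly the identity matrix, which is the case the card describes.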
3
Q

What is RRSS?

A

RRSS is the restricted residual sum of squares. When the restriction is that all slope parameters are zero, it is just the sum of squared deviations of the dependent variable around its mean, as we don't need to account for anything else: the slopes have been restricted away. In other words RRSS = TSS, so

R2 = 1 - RSS/TSS = 1 - URSS/RRSS

since TSS is the sum of squares around the mean, which is exactly what we are calling the RRSS here. Rearranging:

RRSS = URSS/(1 - R2)
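A quick numpy check of this identity, under the assumption that the restricted model is the intercept-only model (the simulated data and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 2.0 + 0.5 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
urss = np.sum((y - X @ beta_hat) ** 2)      # unrestricted RSS
rrss = np.sum((y - y.mean()) ** 2)          # restricted RSS = TSS (slopes = 0)
r2 = 1.0 - urss / rrss

print(np.isclose(rrss, urss / (1.0 - r2)))  # True: RRSS = URSS/(1 - R2)
```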

4
Q

What is URSS?

A

URSS is the unrestricted residual sum of squares. This is simply the RSS of the original model, since we don't restrict any of its parameter values.

5
Q

What is the general way to test linear restrictions?

A

Estimate the model twice: once without the restrictions imposed (giving URSS) and once with them imposed (giving RRSS), then compare the two fits with an F-test on the resulting increase in the residual sum of squares (see the next card for the statistic).
6
Q

What is another way of writing the F statistic when testing linear restrictions?

A

F = ((RRSS - URSS)/r) / (URSS/(N - k)) ~ F(r, N-k)

where r is the number of restrictions

Another way to write the F statistic, for the special case of testing that all slope coefficients are zero (so r = k - 1), is:

F = (R2/(1 - R2)) * ((N - k)/(k - 1))

Note that k is the number of estimated parameters for the model, which includes the constant or intercept.

Therefore k - 1 is the number of slope coefficients for the model, i.e. the number of X variables on the right-hand side of the equation
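A small numpy sketch confirming the two forms agree when testing that all slope coefficients are zero (the data are simulated and the names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 100, 3                                  # k parameters incl. intercept
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 0.8, -0.5]) + rng.normal(size=n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
urss = np.sum((y - X @ beta_hat) ** 2)
rrss = np.sum((y - y.mean()) ** 2)             # restricted: all k-1 slopes = 0
r = k - 1                                      # number of restrictions

f_rss = ((rrss - urss) / r) / (urss / (n - k))
r2 = 1.0 - urss / rrss
f_r2 = (r2 / (1.0 - r2)) * ((n - k) / (k - 1))

print(np.isclose(f_rss, f_r2))                 # True: both forms agree
```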

7
Q

What are some issues in multivariate regression?

A
  1. Increasing the number of right-hand-side variables will always increase the R-squared for the regression –> this doesn't mean you are improving the model by including the variables, so R-squared can be misleading
  2. If the right-hand-side variables are highly correlated with each other, then the standard errors of the OLS coefficients will become large.
  3. If we omit relevant variables, then the OLS estimates will be biased (see the sketch after this list).
  4. If we include irrelevant variables, the OLS estimates will be unbiased but inefficient –> R-squared increases but that doesn't actually prove anything
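A hedged simulation of point 3, omitted variable bias (the coefficients and correlation are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(size=n)           # x2 correlated with x1
y = 1.0 + 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)

# Full model: slope on x1 is estimated around its true value 0.5.
X_full = np.column_stack([np.ones(n), x1, x2])
b_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)

# Omitting the relevant x2: the x1 slope absorbs x2's effect -> biased.
X_short = np.column_stack([np.ones(n), x1])
b_short, *_ = np.linalg.lstsq(X_short, y, rcond=None)

print(b_full[1])   # close to 0.5
print(b_short[1])  # close to 0.5 + 0.5*0.7 = 0.85, biased upwards
```

The short regression's slope picks up part of the omitted variable's effect because x2 is correlated with x1.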
8
Q

What is Adjusted R-squared?

A

Adjusted R-squared penalises the loss of degrees of freedom from adding parameters:

Adjusted R2 = 1 - (1 - R2) * (N - 1)/(N - k)

Unless the variables added to the model are relevant, they will reduce the value of adjusted R-squared.
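A minimal sketch of the penalty at work, assuming a pure-noise added regressor (all names and numbers are illustrative):

```python
import numpy as np

def r2_and_adj(y, X):
    # OLS fit; plain and adjusted R-squared, k = columns of X incl. intercept
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    tss = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - rss / tss
    adj = 1.0 - (1.0 - r2) * (n - 1) / (n - k)
    return r2, adj

rng = np.random.default_rng(3)
n = 60
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)

X1 = np.column_stack([np.ones(n), x])
X2 = np.column_stack([X1, rng.normal(size=n)])  # add a pure-noise regressor

print(r2_and_adj(y, X1))  # baseline
print(r2_and_adj(y, X2))  # R2 can only rise; adjusted R2 usually falls here
                          # (it falls whenever the added t-ratio is below 1)
```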

9
Q

How can we define a regression model in mean deviation form?

A
  • each variable is expressed as a deviation from its sample mean (e.g. y - ȳ and x - x̄), which gets rid of the intercept from the equation
  • this works because the regression line always passes through the sample means of the data, as the sketch below illustrates
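A small numpy illustration (simulated data) that demeaning removes the intercept without changing the slope:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x = rng.normal(loc=3.0, size=n)
y = 2.0 + 0.5 * x + rng.normal(size=n)

# Ordinary form: intercept plus slope.
X = np.column_stack([np.ones(n), x])
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]

# Mean deviation form: demean both variables, no intercept needed.
slope = np.linalg.lstsq((x - x.mean()).reshape(-1, 1),
                        y - y.mean(), rcond=None)[0][0]

print(np.isclose(b1, slope))                     # same slope either way
print(np.isclose(y.mean(), b0 + b1 * x.mean()))  # line passes through the means
```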
10
Q

What is Multicollinearity?

A
  • Multicollinearity is an issue whose severity depends on the degree of collinearity: with perfect collinearity, two or three regressors effectively collapse into one, and the regression cannot estimate their coefficients separately; with high but imperfect collinearity the OLS estimates remain unbiased but become very imprecise
    1. in the first step we find the inverse of the variance matrix of the mean-deviated regression form, using the standard formula for inverting a 2x2 matrix, where Δ = ad - bc is the determinant
  • ρ is the correlation coefficient of the two variables
  • perfect negative correlation is -1 and perfect positive correlation is 1
  • we want the variances to be as small as possible, so we do not want Δ to be close to 0, i.e. we do not want the variables to be highly correlated (without them being completely unrelated either); as ρ → ±1, Δ → 0 and the variances blow up, as the sketch below shows
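A sketch of the blow-up as ρ → 1, using the 2x2 (X'X)⁻¹ for two demeaned regressors with unit variance and correlation ρ (the scaling is an assumption for illustration):

```python
import numpy as np

def slope_variance_factor(rho, n=100):
    # Var(beta_hat) = sigma^2 * (X'X)^{-1} for demeaned regressors with
    # unit variance and correlation rho; return the (1,1) element's factor.
    xtx = n * np.array([[1.0, rho],
                        [rho, 1.0]])
    return np.linalg.inv(xtx)[0, 0]     # = 1 / (n * (1 - rho^2))

for rho in [0.0, 0.5, 0.9, 0.99]:
    print(rho, slope_variance_factor(rho))
# As rho -> 1 the determinant Delta = n^2 * (1 - rho^2) -> 0 and the
# variance factor explodes; at rho = 1 the matrix is singular.
```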
11
Q

What are the usual signs of multicollinearity?

A

The usual signs of multicollinearity are:

  • For individual variables the standard errors are large and the t-ratios are low.
  • For the regression as a whole the F-statistic is highly significant.
  • The R2 will tend to be high –> e.g. 99.9%, i.e. nearly 100%
  • Some degree of multicollinearity is almost always present in econometric models. –> the aim is to reduce this issue as much as possible
12
Q

What is the Generalised Least Squares Estimator?

A
  • GLS is used when the original error terms (u) are heteroscedastic, so the OLS assumption Var(u) = σ²I fails; it reweights the model so that the transformed errors (v) are homoscedastic, giving the estimator β̂ = (X'Ω⁻¹X)⁻¹X'Ω⁻¹y, where Ω is the error covariance matrix (sketched below)
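A minimal numpy sketch of the GLS formula under an assumed diagonal Ω, with error variance growing in x (everything here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
x = rng.uniform(1.0, 5.0, size=n)
X = np.column_stack([np.ones(n), x])

# Heteroscedastic errors: variance grows with x, so Var(u) = sigma^2 * Omega.
omega_diag = x ** 2
u = rng.normal(size=n) * np.sqrt(omega_diag)
y = X @ np.array([1.0, 0.5]) + u

# GLS: beta = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y
W = np.diag(1.0 / omega_diag)            # Omega^{-1} for a diagonal Omega
beta_gls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta_ols, beta_gls)   # both near [1.0, 0.5]; GLS is the more efficient
```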