Section 4 Flashcards

1
Q

In a multiple regression model, how many explanatory variables are there, and how many parameters must be estimated?

A

k-1 explanatory variables

k parameters to estimate
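
A worked form of the model (standard notation, assumed rather than quoted from the notes):

Yi = β1 + β2X2i + β3X3i + … + βkXki + εi

so β1 is the intercept, X2,…,Xk are the k-1 explanatory variables, and β1,…,βk are the k parameters.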

2
Q

What does βj where j=2…k represent?

A

Each βj (other than β1) is a PARTIAL slope coefficient

Tf β2 measures the change in mean Y per unit change in X2, with all other regressors held constant (ceteris paribus)
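
In symbols (a standard restatement of the same idea, assumed): βj = ∂E[Y | X2,…,Xk]/∂Xj, the effect of a one-unit change in Xj on the conditional mean of Y with every other regressor held fixed.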

3
Q

What is the technique for minimising the sum of squared residuals in multiple regression analysis?

A

Take the partial derivatives ∂S/∂βi for each of the k parameters, set them all equal to 0 and solve the resulting system of equations simultaneously
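
Written out (standard OLS setup, assumed rather than quoted from the notes):

S = Σi (Yi - β1 - β2X2i - … - βkXki)^2

Setting ∂S/∂βj = 0 for j = 1,…,k gives k normal equations in the k unknown βs.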

4
Q

Which assumption must be modified to ensure that the multiple regression OLS estimators are still BLUE?

A

Assumption 4 must be modified: it must now extend across all explanatory variables in the model, so that EACH regressor is uncorrelated with the error term (Cov(Xji, εi) = 0 for every regressor Xj)

5
Q

Which assumption must be added to ensure that the multiple regression OLS estimators are still BLUE?

A

No exact collinearity can exist between any of the variables (tf no exact linear relationship between any of the regressors)
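
A quick illustration of what is ruled out (illustrative example, not from the card): if X3i = 2X2i for every observation, then β2X2i + β3X3i = (β2 + 2β3)X2i, so OLS cannot separate β2 from β3 and the estimators are not defined.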

6
Q

Another name for exact collinearity?

A

Perfect collinearity

Note: exact collinearity is very rare

7
Q

What is the sampling distribution of each OLS estimator?

A

Normally:

β(hat)j ~ N(βj, Var(β(hat)j)), where Var(β(hat)j) is proportional to the error variance σ^2
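
In matrix form (standard result, assumed here rather than taken from the notes): β(hat) ~ N(β, σ^2(X'X)^-1), so Var(β(hat)j) = σ^2[(X'X)^-1]jj.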

8
Q

How do we know the OLS estimators are unbiased?

A

The means of their sampling distributions equal the true (but unknown) parameter values, i.e. E[β(hat)j] = βj

9
Q

What is the use of the R(bar)^2 statistic?

A

The R^2 statistic is often used to compare models with the same dependent variable, to see which model explains it better. However, R^2 never decreases as more explanatory variables are added to a model, so it can lead to incorrect conclusions about which model is better. To penalise the use of extra explanatory variables we use the R(bar)^2 (adjusted R^2) statistic

The R(bar)^2 statistic will only increase if new variables ADD to the analysis
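
Its standard formula (assumed; here k counts all parameters, including the intercept):

R(bar)^2 = 1 - (1 - R^2)(n - 1)/(n - k)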

10
Q

What test statistic, with how many degrees of freedom, would be used for a test involving one parameter (e.g. H0: βj = βj*)? How would you do a test of significance?

A

t statistic with n-k degrees of freedom

For a test of significance, sub in βj* = 0

See 4.4.1 to learn how to do it
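
The standard form of the statistic (assumed, consistent with the card rather than quoted from 4.4.1): t = (β(hat)j - βj*)/se(β(hat)j) ~ t(n-k) under H0; for a test of significance, set βj* = 0.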

11
Q

Give two examples of testing a linear restriction?

A

Testing if the parameters sum to 1 (Σβj = 1)

Or

Testing if β2=β3

12
Q

How would you go about testing if β2=β3 (LINEAR RESTRICTION) via a hypothesis test? (H0? Distribution of it? Test statistic?)

A

H0: β2-β3=0
Distribution: β(hat)2 - β(hat)3 ~ N(β2 - β3, Var(β(hat)2) + Var(β(hat)3) - 2Cov(β(hat)2, β(hat)3))

SEE 4.4.2
(T statistic)
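
The resulting statistic (standard form, assumed): t = (β(hat)2 - β(hat)3)/se(β(hat)2 - β(hat)3) ~ t(n-k) under H0, where the standard error is the square root of the variance given above.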

13
Q

What test statistic, with how many degrees of freedom, would be used for testing joint restrictions? (e.g. β2=β4=0)

A

F statistic with q degrees of freedom in the numerator and n-k in the denominator, where q = number of restrictions (number of βs involved), n = sample size and k = total number of βs

See 4.4.3 and learn it!
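
One standard form of the statistic (assumed, consistent with the card rather than quoted from 4.4.3):

F = [(RSSR - RSSU)/q] / [RSSU/(n-k)] ~ F(q, n-k) under H0

where RSSR and RSSU are the residual sums of squares of the restricted and unrestricted models.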

14
Q

See more complex example for tests of joint restrictions in notes

A

Now

15
Q

Thing to remember when doing an F-test on joint restrictions?

A

Don’t worry about choosing a two-tailed or one-tailed test: the F test always rejects in the upper tail, so if it says 5% level of significance, use the 5% table/graph directly

16
Q

What is the test of overall significance (the test of significance of the regression), and what are the H0 and H1 for it?

A

The test that all k-1 slope parameters are jointly equal to 0

H0: β2=β3=…=βk=0
H1: at least one is not equal to 0

17
Q

What form does the restricted model take for the test of overall significance?

A

Yi=β1+εi

18
Q

How do you then do the test of overall significance?

A

Same as the joint restrictions test, with q=k-1 (it doesn’t test the significance of the intercept β1)

19
Q

What is the alternative test to the test of overall significance?

A

Test:
H0: R^2=0
H1: R^2 not equal to 0
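
The two tests coincide (standard identity, assumed): the restricted model Yi = β1 + εi has R^2 = 0, and the overall-significance F statistic can be rewritten in terms of R^2 as

F = [R^2/(k-1)] / [(1-R^2)/(n-k)] ~ F(k-1, n-k)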

20
Q

How can you test a single restriction hypothesis using F-statistic? (And note)

A

Use the same method and equation as the normal F statistic test, with q=1

Note: you will find F-value = (t-value)^2, since an F(1, n-k) random variable is the square of a t(n-k) random variable

21
Q

Define multicollinearity?

A

The situation where explanatory variables are highly correlated BUT not exactly

22
Q

What is a consequence of multicollinearity?

A

The assumption for BLUE is that there should be no EXACT MC.
We are considering ‘close’ but not exact MC, tf it doesn’t strictly violate the assumptions BUT there are still consequences for the OLS estimators: although they may still be the ‘best’ (minimum variance among linear unbiased estimators), that doesn’t necessarily mean they are good (see page 14)

23
Q

5 signs of multicollinearity?

A

1) parameter estimates have large variances/standard errors
2) may find very small t statistics
3) despite insignificant variables in the model, R^2 may still be high (contradiction between the t stats and R^2)
4) coefficient estimates may have signs contrary to what theory predicts
5) estimators may be highly sensitive (small Δ in data values -> large Δ in OLS estimates)

24
Q

Best 3 ways to detect multicollinearity?

A

1) compare R^2 with the t stats
2) examine the correlation coefficients between pairs of explanatory variables
3) run additional (auxiliary) regressions of each explanatory variable on the others, then look at the R^2 values (see page 15)

25
Q

4 ways to deal with multicollinearity?

A

1) remove the problem variable(s)
2) change/extend the sample data
3) change the functional form of the model
4) use previous studies to get some β values, then use the model to calculate the other β values

26
Q

Problem with removing variable(s) to deal with MC?

A

Could lead to misspecification of the model (omitting a relevant variable can bias the remaining estimates)

27
Q

Problem with using other study’s β values to deal with MC?

A

Their data may be inaccurate/different to yours

28
Q

SEE alternative functional forms in notes

A

Now

29
Q

What is a log linear model and what is a semi log linear model? What is a special feature of a log linear model?

A

Log linear model - BOTH sides in log form (ln(Y) on the left, βln(X) terms on the right)
Special feature: the parameters can be interpreted as elasticities!

Semi log linear - only the left side (ln(Y)) is in log form
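
Worked forms (standard textbook examples, assumed):

Log linear: ln(Yi) = β1 + β2ln(X2i) + εi, where β2 is the elasticity of Y with respect to X2 (the % change in Y for a 1% change in X2)

Semi log linear: ln(Yi) = β1 + β2X2i + εi, where 100β2 is approximately the % change in Y per one-unit change in X2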

30
Q

Explain what the difference is between doing a joint test of restrictions and doing several individual ones?

A

Joint test: tests whether β2 and β4 are BOTH 0 at the same time

Individual tests: test whether β2 is 0 while β4 is free to be whatever it is estimated to be, and vice versa

31
Q

Explain how doing a joint test might lead to a different conclusion to doing 2 individual ones?

A

If two regressors X2 and X4 are highly correlated with each other (multicollinearity), then when doing a t test on X2 with X4 already in the model, X2 is likely to appear insignificant, since its ADDITIONAL explanatory power is negligible, and vice versa. Therefore both may come out as individually insignificant and be excluded from the model, even though a joint test of β2 = β4 = 0 could strongly reject H0
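
A minimal simulation sketch of this effect (illustrative Python using numpy and statsmodels; not from the notes, and all variable names are invented):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 200
    x2 = rng.normal(size=n)
    x4 = x2 + rng.normal(scale=0.01, size=n)  # x4 is almost identical to x2: severe multicollinearity
    y = 1.0 + 0.5 * x2 + 0.5 * x4 + rng.normal(size=n)

    X = sm.add_constant(np.column_stack([x2, x4]))  # columns: const, x2, x4
    res = sm.OLS(y, X).fit()

    print(res.tvalues)  # individual t stats on x2 and x4: typically small, each looks insignificant
    R = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])  # joint restriction: beta2 = 0 AND beta4 = 0
    print(res.f_test(R))  # joint F test: strongly rejects H0 that both are 0

The individual t tests cannot attribute the shared explanatory power to either regressor separately, while the joint F test sees their combined contribution.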

32
Q

For a complex example of joint restriction testing, what do you have to remember?

A

When rearranging the model, the final form of the rearrangement should only contain terms with a β (or the error) attached; any variables with neither in front must be moved to the left-hand side to create a new dependent variable Z (see notes example)
Also some X variables may end up attached to the same β; these must be replaced with a new combined variable W
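
A standard textbook illustration of the technique (assumed, not the notes’ own example): to impose β2 + β3 = 1 on Yi = β1 + β2X2i + β3X3i + εi, substitute β3 = 1 - β2:

Yi - X3i = β1 + β2(X2i - X3i) + εi, i.e. Zi = β1 + β2Wi + εi

where Zi = Yi - X3i (the no-β term X3i moved to the left) and Wi = X2i - X3i (the Xs sharing the same β combined). Estimate this restricted model and compare it with the unrestricted one via the F test.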