Midterm Flashcards

1
Q

The average of the OLS fitted values for any sample is always zero.

A

False

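Card 1 turns on a mechanical property of OLS: with an intercept, the residuals average to exactly zero, while the fitted values average to the sample mean of y. A minimal Python sketch (not part of the deck; simulated data, numpy assumed) checks this:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)

# OLS with an intercept via least squares
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
resid = y - fitted

print(resid.mean())                 # ~0 up to machine precision
print(fitted.mean(), y.mean())      # equal to each other, not to zero
```
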
2
Q

When one includes an irrelevant independent variable in a regression, we call it “over controlling.”

A

False

3
Q

If we were to change the units of measurement for one of the independent variables, the coefficient estimates for all independent variables would change.

A

False

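A quick numeric check of card 3 (toy data, numpy assumed): rescaling one regressor, say from meters to centimeters, rescales only that regressor's own coefficient and leaves the other slopes unchanged.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 300
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
X_rescaled = np.column_stack([np.ones(n), 100 * x1, x2])  # e.g. meters -> cm

b, *_ = np.linalg.lstsq(X, y, rcond=None)
b_r, *_ = np.linalg.lstsq(X_rescaled, y, rcond=None)

print(b)    # roughly [1, 2, 3]
print(b_r)  # roughly [1, 0.02, 3]: only x1's coefficient is scaled by 1/100
```
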
4
Q

If we run a regression in Stata and obtain a p-value equal to .06, we would reject the null hypothesis that the coefficient is equal to zero at the 5 percent level, but would fail to reject that null at the 10 percent level.

A

False

5
Q

Multicollinearity refers to the situation where the independent variables are highly correlated. Multicollinearity does not cause the OLS estimator to be biased, but it does generally increase the standard errors.

A

True

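A small Monte Carlo sketch of card 5 (invented data-generating process, numpy assumed): with highly correlated regressors, the slope estimates stay centered on the truth, but their sampling spread grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def slope_estimates(rho, reps=2000, n=100):
    """Estimates of the coefficient on x1 when corr(x1, x2) = rho."""
    est = []
    for _ in range(reps):
        x1 = rng.normal(size=n)
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
        y = 1.0 + 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x1, x2])
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        est.append(b[1])
    return np.array(est)

for rho in (0.0, 0.95):
    e = slope_estimates(rho)
    # mean ~1 in both cases (no bias); std much larger when rho = 0.95
    print(rho, round(float(e.mean()), 3), round(float(e.std()), 3))
```
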
6
Q

If an estimator is consistent, then as the size of the random sample increases the estimator moves towards the true population parameter value.

A

True

7
Q

The zero conditional mean assumption implies that u_i = 0 for all i, regardless of the value of x_i.

A

False

8
Q

We do not need the normality of the error term assumption to perform valid statistical inference if the other multiple linear regression model assumptions hold and we have a large sample.

A

True

9
Q

Omitting an independent variable that is correlated with the dependent variable from an OLS regression always causes the estimated coefficients on the included independent variables to be biased.

A

False

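A hedged simulation of the omitted-variable logic behind card 9 (made-up DGP, numpy assumed): omitting z biases the coefficient on x only when z is correlated with x, even though z always affects y.

```python
import numpy as np

rng = np.random.default_rng(2)

def short_regression_slope(corr_xz, reps=2000, n=200):
    """Average slope on x when the relevant variable z is omitted."""
    est = []
    for _ in range(reps):
        x = rng.normal(size=n)
        z = corr_xz * x + rng.normal(size=n)   # omitted variable
        y = 1.0 + 2.0 * x + 1.5 * z + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x])   # z left out
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        est.append(b[1])
    return np.mean(est)

print(short_regression_slope(0.0))  # ~2.0: no bias when z is unrelated to x
print(short_regression_slope(0.8))  # ~3.2: bias of 1.5 * 0.8 = 1.2
```
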
10
Q

A confidence interval for a prediction is always at least as narrow as the corresponding prediction interval.

A

True

11
Q

One of the most important differences between an applied regression analysis course from the Statistics Department and an econometrics course from the Economics Department is the degree to which the course focuses on estimation bias caused by endogenous variables.

A

True

12
Q

The central limit theorem states that the sample mean of a variable, when it is standardized (by the population standard deviation), has a standard normal distribution, even if the variable itself is not normally distributed.

A

True

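A quick illustration of card 12 (arbitrary skewed distribution, numpy assumed): standardized sample means of exponential draws behave like standard normal draws even though the underlying variable is far from normal.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 200, 10000
mu, sigma = 1.0, 1.0  # mean and standard deviation of Exponential(1)

draws = rng.exponential(scale=1.0, size=(reps, n))
z = (draws.mean(axis=1) - mu) / (sigma / np.sqrt(n))  # standardized means

print(round(float(z.mean()), 3), round(float(z.std()), 3))  # ~0 and ~1
print(round(float(np.mean(np.abs(z) < 1.96)), 3))           # ~0.95, as N(0,1) implies
```
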
13
Q

Under assumptions SLR.1–SLR.4, the OLS estimates equal the true population parameters.

A

False

14
Q

A violation of the zero conditional mean assumption would cause the OLS estimator to be biased.

A

True

15
Q

Overspecifying a model (by adding irrelevant control variables) would cause the OLS estimator to be biased.

A

False

16
Q

The sum of the squared residuals (SSR) is equal to the difference between the total sum of squares (SST) and the explained sum of squares (SSE).

A

True

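A one-off numeric check of the decomposition in card 16 (toy data, numpy assumed). Note the convention used on the card: SSE is the explained sum of squares and SSR the residual sum of squares.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ b

sst = np.sum((y - y.mean()) ** 2)        # total sum of squares
sse = np.sum((fitted - y.mean()) ** 2)   # explained sum of squares
ssr = np.sum((y - fitted) ** 2)          # residual sum of squares

print(np.isclose(sst, sse + ssr))        # True: SSR = SST - SSE
```
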
17
Q

A sample correlation coefficient of 0.95 between the regressor of interest and another regressor in the model would cause the OLS estimator to be biased.

A

False

18
Q

The reason OLS is commonly used is because it is the most computationally efficient unbiased linear estimator.

A

False

19
Q

Omitting a variable that is correlated with the regressor of interest would cause the OLS estimator to be biased if the omitted variable is uncorrelated with the outcome variable.

A

False

20
Q

Imperfect multicollinearity causes OLS estimates to be biased.

A

False

21
Q

Multicollinearity increases the…

A

Variance of the estimator

22
Q

Heteroskedasticity causes the OLS estimator to be biased.

A

False

23
Q

Heteroskedasticity causes the OLS estimator's…

A

standard errors to be wrong
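
A sketch of cards 22 and 23 together (simulated heteroskedastic errors; statsmodels assumed to be available): the slope estimate remains unbiased, but the default standard errors are no longer valid, so robust ones differ.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 1000
x = rng.uniform(0, 2, size=n)
y = 1.0 + 2.0 * x + x * rng.normal(size=n)  # error spread grows with x

X = sm.add_constant(x)
default = sm.OLS(y, X).fit()               # classical (homoskedastic) SEs
robust = sm.OLS(y, X).fit(cov_type="HC1")  # heteroskedasticity-robust SEs

print(default.params[1])               # ~2: slope still unbiased (card 22)
print(default.bse[1], robust.bse[1])   # default SE invalid here; robust differs (card 23)
```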

24
Q

Multicollinearity causes the OLS estimates to have large standard errors.

A

True

25
Q

If Stata reports a p-value equal to .04, then we would reject the null hypothesis that the coefficient is equal to zero at the 1 percent level, but would fail to reject that null at the 5 percent level.

A

False

26
Q

The p-value is the probability of the null hypothesis being true, given the observations.

A

False

27
Q

What is the probability of obtaining an estimate as extreme as or more extreme than the one obtained, if the null hypothesis were true?

A

P-Value
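
A minimal sketch of that definition (scipy assumed; the t statistic and degrees of freedom are hypothetical numbers): the two-sided p-value is the tail probability under the null, which also drives the decision rules in cards 4, 25, and 28.

```python
from scipy import stats

t_stat = 1.9   # hypothetical observed t statistic
df = 120       # hypothetical residual degrees of freedom

# Two-sided p-value: probability of a statistic at least this extreme under H0
p_value = 2 * stats.t.sf(abs(t_stat), df)
print(round(p_value, 3))  # ~0.06: reject H0 at the 10% level, fail to reject at 5%
```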

28
Q

If we run a regression using Stata and obtain a p-value equal to .06, then we would reject the null hypothesis that the coefficient is equal to zero at the 5 percent level, but would fail to reject that null at the 10 percent level.

A

False

29
Q

The t-statistic is the ratio of the parameter estimate to the variance of the parameter estimate.

A

False

30
Q

The standard error of a parameter estimate is also called the root mean squared error of the regression.

A

False

31
Q

If a parameter estimate is consistent, then as the size of the random sample increases to infinity the parameter estimate will move towards the true population parameter value.

A

True

32
Q

If an estimator of a population parameter is unbiased, then in repeated sampling from the population, the average value of the population parameter estimates will equal the true population parameter as the number of random samples goes to infinity.

A

True

33
Q

If the sample size is large, we can perform statistical inference even if assumption MLR.6 (Normality) is violated.

A

True

34
Q

Adding an additional independent variable to the regression cannot cause a decrease in the value of R-squared.

A

True

35
Q

Adding an additional independent variable to the regression cannot cause a decrease in the value of adjusted R-squared.

A

False
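
A quick demonstration of cards 34 and 35 together (fabricated data; statsmodels assumed): appending a pure-noise regressor never lowers R-squared, but it can lower adjusted R-squared.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 200
x = rng.normal(size=n)
junk = rng.normal(size=n)                # irrelevant regressor
y = 1.0 + 2.0 * x + rng.normal(size=n)

small = sm.OLS(y, sm.add_constant(x)).fit()
big = sm.OLS(y, sm.add_constant(np.column_stack([x, junk]))).fit()

print(small.rsquared <= big.rsquared)        # always True (card 34)
print(small.rsquared_adj, big.rsquared_adj)  # adjusted R-squared can fall (card 35)
```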

36
Q

Over controlling in a multiple regression model occurs when one includes an explanatory variable that is a pathway through which the independent variable of interest affects the dependent variable.

A

True

37
Q

The prediction interval is always wider than the confidence interval for the prediction.

A

True
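
A sketch of cards 10 and 37 (toy data; statsmodels assumed): at the same point, the prediction interval contains the confidence interval for the mean prediction, because it also accounts for the error variance of a single new observation.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.normal(size=100)

res = sm.OLS(y, sm.add_constant(x)).fit()
x_new = np.array([[1.0, 1.0]])  # [intercept, x] evaluated at x = 1
frame = res.get_prediction(x_new).summary_frame(alpha=0.05)

print(frame[["mean_ci_lower", "mean_ci_upper"]])  # CI for the mean prediction
print(frame[["obs_ci_lower", "obs_ci_upper"]])    # prediction interval: wider
```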

38
Q

To model whether there are increasing or decreasing returns to a particular independent variable, one should include an interaction term.

A

False

39
Q

If the dependent variable is a binary variable, the error term is obviously not normally distributed. This may result in biased OLS estimates.

A

False

40
Q

Overspecifying the model causes bias.

A

False

41
Q

Overspecifying the model can increase the variance of the OLS estimator.

A

True

42
Q

Measures how many estimated standard deviations (standard errors) beta j hat is away from the hypothesized value of beta j.

A

T statistic
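
A tiny worked example of card 42 with hypothetical numbers (none of these values come from the deck):

```python
beta_hat = 0.85   # hypothetical coefficient estimate, beta j hat
beta_null = 0.0   # hypothesized value under the null
se = 0.40         # hypothetical standard error of beta j hat

t_stat = (beta_hat - beta_null) / se
print(t_stat)  # 2.125: the estimate sits about 2.1 standard errors from zero
```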

43
Q

The correlation between the residuals from a regression and each of the X variables is positive.

A

False

44
Q

What is the percent change in one variable given a 1% ceteris paribus increase in another variable?

A

Elasticity
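
A brief sketch of card 44 (invented log-log data, numpy assumed): in a regression of log(y) on log(x), the slope is the elasticity.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 500
log_x = rng.normal(size=n)
log_y = 0.5 + 1.2 * log_x + rng.normal(scale=0.1, size=n)  # true elasticity 1.2

X = np.column_stack([np.ones(n), log_x])
b, *_ = np.linalg.lstsq(X, log_y, rcond=None)
print(b[1])  # ~1.2: a 1% increase in x goes with about a 1.2% increase in y
```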

45
Q

What is it called when, as we get more and more data, the standard error of beta j hat gets smaller and smaller at the rate of 1 over the square root of n?

A

Root-n convergence
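
A closing simulation of root-n convergence (made-up DGP, numpy assumed): quadrupling the sample size roughly halves the sampling spread of the slope estimate, consistent with a 1/sqrt(n) rate.

```python
import numpy as np

rng = np.random.default_rng(8)

def slope_spread(n, reps=2000):
    """Monte Carlo standard deviation of the OLS slope at sample size n."""
    est = []
    for _ in range(reps):
        x = rng.normal(size=n)
        y = 1.0 + 2.0 * x + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x])
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        est.append(b[1])
    return np.std(est)

for n in (100, 400, 1600):
    print(n, round(float(slope_spread(n)), 4))  # each 4x in n roughly halves the spread
```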