OLS Flashcards

OLS (51 cards)

1
Q

Why use Ordinary Least Squares?

A
1) OLS is relatively easy to use.
2) The goal of minimizing the sum of squared errors is appropriate from a theoretical point of view.
3) OLS estimates have a number of useful properties.
2
Q

Ordinary Least Squares (OLS)

A

is a regression estimation technique that calculates the estimated slope coefficients so as to minimize the sum of the squared residuals

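
As an illustration of the definition in card 2 (a minimal numpy sketch with made-up data, not part of the deck), the OLS coefficients can be computed by solving the normal equations X'Xb = X'y, and any other coefficient vector gives a larger sum of squared residuals:

```python
import numpy as np

# Toy data: y depends linearly on one regressor plus noise
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 3.0 * x + rng.normal(size=100)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# Solve the normal equations X'X b = X'y for the OLS coefficients
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# The residuals from beta_hat give the minimized sum of squared residuals
residuals = y - X @ beta_hat
ssr = residuals @ residuals

# Perturbing the coefficients can only increase the SSR
other = beta_hat + np.array([0.1, -0.1])
ssr_other = np.sum((y - X @ other) ** 2)
print(beta_hat, ssr < ssr_other)
```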
3
Q

In a simple linear regression model, the standard error of the slope coefficient is expected to shrink at a rate that is equal to the inverse of the:

A) Square root of the number of parameters in the model
B) Square of the sample size
C) The sample size minus the number of parameters in the model
D) Square root of the sample size

A

D) Square root of the sample size

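
A quick Monte Carlo check of card 3 (illustrative simulation, with assumed sample sizes and coefficients): quadrupling the sample size should roughly halve the empirical standard error of the OLS slope, consistent with a 1/sqrt(n) rate.

```python
import numpy as np

rng = np.random.default_rng(42)

def slope_sd(n, reps=2000):
    """Empirical standard deviation of the OLS slope across repeated samples."""
    slopes = np.empty(reps)
    for r in range(reps):
        x = rng.normal(size=n)
        y = 1.0 + 2.0 * x + rng.normal(size=n)
        # OLS slope with one regressor: cov(x, y) / var(x)
        slopes[r] = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
    return slopes.std()

sd_small, sd_large = slope_sd(50), slope_sd(200)
print(sd_small / sd_large)  # close to 2 = sqrt(200 / 50)
```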
4
Q

In a multiple linear regression model, the OLS estimates are inconsistent if:

A) There is correlation between the dependent variables and the error term
B) There is correlation between the independent variables and the error term
C) There is correlation between the independent variables
D) The sample size is less than the number of parameters in the model

A

B) There is correlation between the independent variables and the error term

5
Q

If β̂j is an unbiased and consistent estimator of βj, which of the following statements is correct?

A) The distribution of β̂j becomes more and more spread around βj as the sample size grows
B) The distribution of β̂j collapses to the single point βj as the sample size tends to infinity
C) The distribution of β̂j tends toward a standard normal distribution as the sample size grows
D) None of the other statements are correct

A

B) The distribution of β̂j collapses to the single point βj as the sample size tends to infinity

6
Q

In a multiple linear regression model, if the variance of the error term conditional on an explanatory variable is not constant then:

A) The t statistics are invalid and confidence intervals are valid for small sample sizes
B) The t statistics and confidence intervals are invalid no matter how large the sample size is
C) The t statistics are valid and confidence intervals are invalid for small sample sizes
D) The t statistics and confidence intervals are valid no matter how large the sample size is

A

B) The t statistics and confidence intervals are invalid no matter how large the sample size is

7
Q

In a simple linear regression model, a change in the dependent variable’s unit of measurement does not lead to a change in:

A) The confidence intervals of the regression
B) The sum of squared residuals of the regression
C) The goodness-of-fit of the regression
D) The standard error of the regression

A

C) The goodness-of-fit of the regression

8
Q

Measurement error occurs in a simple linear regression model when:

A) The observed value of a variable used in the model differs from its actual value
B) The model includes more than two independent variables
C) The partial effect of an independent variable depends on unobserved factors
D) The dependent variable is binary, but the independent variable is continuous

A

A) The observed value of a variable used in the model differs from its actual value

9
Q

In a hypothesis test, the significance level used is:

A) The probability of rejecting the null hypothesis when it is true
B) One minus the probability of rejecting the null hypothesis when it is true
C) One minus the probability of rejecting the null hypothesis when it is false
D) The probability of accepting the null hypothesis when it is true

A

A) The probability of rejecting the null hypothesis when it is true

10
Q

The F statistic can be used to test non-nested models

True
False

A

False

11
Q

Predictions made for the dependent variable in a multiple linear regression model are subject to sampling variation

True
False

A

True

12
Q

In a multiple regression model with an independent variable that is a dummy variable, the coefficient on the dummy variable for a particular group represents the estimated difference in intercepts between that group and the base group.

True
False

A

True

13
Q

In a multiple linear regression where the Gauss-Markov assumptions hold, why can you interpret each coefficient as a ceteris paribus effect?

A

Because the Ordinary Least Squares (OLS) estimator of the coefficient on variable xj is based on the covariance between the dependent variable and xj after the effects of the other regressors have been removed.

14
Q

In a random sample:

A

All the individuals or units from the population have the same probability of being chosen.

15
Q

What assumption is necessarily violated if the weekly endowment of time (168 hours) is entirely spent either studying, or sleeping, or working, or in leisure activities?

A

No perfect multicollinearity

16
Q

Take an observed (that is, estimated) 95% confidence interval for a parameter of a multiple linear regression. Then:

A

We cannot assign a probability to the event that the true parameter value lies inside that interval.

17
Q

In testing multiple exclusion restrictions in the multiple regression model under the classical assumptions, we are more likely to reject the null that some coefficients are zero if:

A

the R-squared of the unrestricted model is large relative to the R-squared of the restricted model.

18
Q

In the Chow test the null hypothesis is:

A

all the coefficients in a regression model are the same in two separate populations.

19
Q

The significance level of a test is:

A

the probability of rejecting the null hypothesis when it is true.

20
Q

What is true of confidence intervals?

A

Confidence intervals are also called interval estimates.

21
Q

Define the upper bound of a confidence interval for population parameter β

A

The upper bound of a confidence interval for a population parameter β is given by β̂ + critical value · se(β̂).

22
Q

Which of the following is true of the standard error of the OLS slope estimator, se(β̂)?

A

It is an estimate of the standard deviation of the OLS slope estimator.
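
Cards 21-22 can be put together in a short sketch (illustrative data and a large-sample 1.96 critical value assumed): estimate the slope, estimate its standard error under homoskedasticity, and form the 95% interval as estimate ± critical value × se.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Homoskedastic variance estimate of the slope: sigma^2 * (X'X)^{-1}[1,1]
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - 2)
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])

# 1.96 is the normal critical value, a large-sample stand-in for the t value
lower, upper = beta_hat[1] - 1.96 * se, beta_hat[1] + 1.96 * se
print(lower, upper)
```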

23
Q

A restricted model always has fewer free parameters than the unrestricted model

True
False

A

True

24
Q

The F statistic is always nonnegative as SSRr is never smaller than SSRur.

True
False

A

True

25
Q

The population parameter in the null hypothesis…

A) Is always equal to 0
B) Never equal to 0
C) Is not always equal to 0

A

C) Is not always equal to 0
26
Changing the unit of measurement of any independent variables, where the log of the independent variable appears in the regression affects…
only the intercept coefficient.
27
A predicted value of a dependent variable...
represents the expected value of the dependent variable given particular values for the explanatory variables.
28
Which Gauss-Markov assumption is violated by the linear probability model?
The assumption of constant variance of the error term. (Heteroskedaticty)
29
The heteroskedasticity-robust t statistics are justified only if the sample size is large. True False
True
30
What does the regression slope coefficient indicate?
by how many units the conditional mean of y increases, given a one unit increase in x
31
beta 1 hat has a smaller standard error, ceteris paribus, if
there is more variation in the explanatory variable, x
32
What is BLUE?
the OLS estimator with the smallest variance in the class of linear unbiased estimators of the parameters
33
R-squared can never decrease when another independent variable is added to a regression True False
True
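
Card 33 is easy to verify numerically (illustrative data; `noise_var` is a deliberately irrelevant regressor): adding a column cannot lower R-squared, because OLS can always set its coefficient to zero and do no worse.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
x1 = rng.normal(size=n)
noise_var = rng.normal(size=n)  # unrelated to y by construction
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

def r_squared(X, y):
    """R^2 from regressing y on the columns of X (intercept included)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    return 1 - resid @ resid / np.sum((y - y.mean()) ** 2)

r2_one = r_squared(x1.reshape(-1, 1), y)
r2_two = r_squared(np.column_stack([x1, noise_var]), y)
print(r2_one <= r2_two)  # True: R^2 cannot fall
```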
34
When there is high correlation, OLS estimators can still be unbiased, but the estimation of parameters has lower precision when regressors are correlated True False
True
35
Increasing the sample size A) Increases variance B) Keeps variance the same C) Reduces variance
C) Reduces variance
36
What is the p-value?
smallest significance at which the null would be rejected
37
The rejection rule in terms of the p-value is…
if p-value < alpha, we reject null
38
At what point can the null not be rejected?
When the significance level is reduced
39
a large p value is in favour of the null True False
True
40
the larger F is, the larger the SSR restricted relative to SSR unrestricted, so the worse the explanatory power of the restricted model. What does this say about the null H0?
This implies that the null H0 is false
41
the r squared of the restricted model is...
Zero by definition
42
the adjusted r squared takes into account the number of variables in a model and it may… A) Decrease B) Stay unchanged C) Increase
A) Decrease
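
The mechanics behind card 42 can be sketched with the standard adjusted R-squared formula (the R² figures below are hypothetical numbers for illustration):

```python
# Standard definition: adj R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1),
# where k is the number of slope coefficients. The degrees-of-freedom
# penalty means adding a weak regressor can lower the adjusted R^2.
def adjusted_r2(r2, n, k):
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# A regressor that barely moves R^2 lowers the adjusted R^2
n = 50
print(adjusted_r2(0.500, n, k=2))  # ~0.4787
print(adjusted_r2(0.502, n, k=3))  # ~0.4695, lower despite higher R^2
```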
43
What do you do if MLR3 is violated with a perfect linear function
drop one of the independent variables
44
if Fstat > Fcritical…
then x1 and x2 are jointly significant and reject the null
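
The F statistic behind card 44 can be computed from the restricted and unrestricted sums of squared residuals (the SSR values below are hypothetical numbers for illustration):

```python
# Standard exclusion-restrictions F statistic:
# F = ((SSRr - SSRur) / q) / (SSRur / (n - k - 1)),
# where q is the number of restrictions and k the number of regressors
# in the unrestricted model. Reject the null if F > Fcritical.
def f_stat(ssr_r, ssr_ur, q, n, k):
    return ((ssr_r - ssr_ur) / q) / (ssr_ur / (n - k - 1))

# Hypothetical values: dropping 2 regressors raises the SSR from 100 to 120
F = f_stat(ssr_r=120.0, ssr_ur=100.0, q=2, n=103, k=2)
print(F)  # 10.0
```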
45
if there is heteroskedasticity
the ols is not the most efficient estimator and the standard errors are not valid for inference
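
Cards 29 and 45 can be illustrated together (a sketch with an assumed data-generating process whose error variance depends on x): the usual homoskedastic standard error and White's heteroskedasticity-robust (HC0) sandwich estimate disagree when the errors are heteroskedastic.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
x = rng.normal(size=n)
u = rng.normal(size=n) * np.abs(x)  # error variance grows with |x|
y = 1.0 + 2.0 * x + u

X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat

# Usual (homoskedastic) slope standard error: sigma^2 * (X'X)^{-1}[1,1]
usual = np.sqrt(resid @ resid / (n - 2) * XtX_inv[1, 1])

# HC0 sandwich: (X'X)^{-1} X' diag(resid_i^2) X (X'X)^{-1}
meat = X.T @ (X * (resid ** 2)[:, None])
robust = np.sqrt((XtX_inv @ meat @ XtX_inv)[1, 1])

print(usual, robust)  # the two disagree under heteroskedasticity
```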
46
The F statistic is always nonnegative because…
The SSRr is smaller than SSRur
47
changing the unit of measurement of any independent variable, where log of the independent variable appears in the regression affects A) Only the beta slope coefficient(s) B) Only the intercept coefficient C) All coefficients
B) Only the intercept coefficient
48
the assumption of constant variance of the error term is violated by the linear probability model True False
True
49
Q

What are the assumptions of MLR? (1-5)

A

```
MLR1: Linear in parameters
MLR2: Random sampling
MLR3: No perfect collinearity
MLR4: Zero conditional mean
MLR5: Homoskedasticity
```
50
Q

What do MLR1-4 ensure?

A

Unbiasedness of the OLS estimators.
51
In a regression Y = beta0 + x1beta1 + x2beta2 + u, if x2 is omitted, which of the following are correct? A) When beta2 > 0 and corr(x1, x2) > 0, there is a positive bias B) When beta2 < 0 and corr(x1, x2) > 0, there is a negative bias C) When beta2 > 0 and corr(x1, x2) < 0, there is a positive bias D) When beta2 < 0 and corr(x1, x2) < 0, there is a negative bias E) A and B are correct F) All of the above are correct
E) A and B are correct
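
Card 51's case A can be checked by simulation (illustrative coefficients; the bias equals β2·δ1, where δ1 is the slope from regressing x2 on x1): with β2 > 0 and corr(x1, x2) > 0, the short regression's slope is biased upward.

```python
import numpy as np

rng = np.random.default_rng(9)
reps, n = 2000, 200
beta1, beta2 = 1.0, 2.0

short_slopes = np.empty(reps)
for r in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.5 * x1 + rng.normal(size=n)   # corr(x1, x2) > 0, delta1 = 0.5
    y = beta1 * x1 + beta2 * x2 + rng.normal(size=n)
    # Short regression of y on x1 alone omits x2
    short_slopes[r] = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)

# Omitted variable bias formula: beta2 * delta1 = 2.0 * 0.5 = 1.0 > 0
print(short_slopes.mean())  # near beta1 + beta2 * 0.5 = 2.0, not beta1 = 1.0
```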