Least Squares Intro Flashcards

1
Q

What are the names of the two variables involved in bivariate regression?

A

On the LHS, y is the dependent variable, also called the regressand, the outcome, or the explained variable
On the RHS, x is the independent variable, also called the regressor, the explanatory variable, the covariate, or the control

2
Q

What is regression to the mean?

A

If the first observation of a variable was extreme, the next observation is likely to be less extreme, and vice versa
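A quick simulation makes this concrete. The setup below is a hypothetical example (not from the deck): two standard-normal measurements with correlation 0.5, such as repeated test scores. Conditioning on an extreme first measurement, the second is on average pulled back toward the mean.

```python
import numpy as np

# Hypothetical example: two correlated standard-normal measurements.
rng = np.random.default_rng(0)
rho = 0.5
n = 100_000
x1 = rng.standard_normal(n)
x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)

extreme = x1 > 2.0                # unusually high first observations
first_mean = x1[extreme].mean()   # well above 2
second_mean = x2[extreme].mean()  # pulled back toward the mean of 0
print(first_mean, second_mean)
```

Since E(X2 | X1) = ρX1 with ρ < 1, the second measurement regresses toward the mean even though nothing causal is happening.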

3
Q

What is the most basic relationship Y could have?

A

Yi = α + εi for i = 1, …, N
Here there is a constant unknown population parameter α and a random deviation εi from it
E(Yi) = α. The estimators for α and Yi are a and Ŷi respectively, and in this case Ŷi = a

4
Q

What is the sum of squared residuals for Yi = α + εi?

A

SSR = Σi(Yi − a)² = Σi ei²
where ei = Yi − Ŷi = Yi − a

5
Q

What is the least squares estimator for Yi = α + εi?

A

From the criterion function mina SSR we can find the solution using ∂SSR/∂a = Σi ∂ei²/∂a, which by the chain rule is Σi (∂ei²/∂ei)(∂ei/∂a) = −2Σi ei
The FOC sets this to zero, so 0 = Σi Yi − Σi a, giving a = (1/N)Σi Yi = Ȳ
Note that Ȳ is a function of the random variable Y and so is itself a random variable with a sampling variance
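A small numerical check (a hypothetical sketch, not part of the deck): for the constant-only model, minimizing SSR over a grid of candidate values of a lands on the sample mean Ȳ.

```python
import numpy as np

# Hypothetical data: Yi = alpha + eps_i with alpha = 3.
rng = np.random.default_rng(1)
y = 3.0 + rng.standard_normal(50)

def ssr(a, y):
    """Sum of squared residuals for the constant fit Yhat_i = a."""
    return np.sum((y - a) ** 2)

# Brute-force minimize SSR over a fine grid of candidate intercepts.
grid = np.linspace(y.min(), y.max(), 10_001)
a_hat = grid[np.argmin([ssr(a, y) for a in grid])]
print(a_hat, y.mean())  # nearly identical
```

The grid minimizer agrees with Ȳ up to the grid spacing, as the FOC predicts.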

6
Q

What is the iterated expectation?

A

Converting a conditional expectation into an unconditional expectation:
EX(E(Y|X)) = E(Y)
For continuous X, this comes from ∫x E(Y|X = x) fX(x) dx
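For discrete X the integral becomes a sum over the support. A hypothetical two-point example (the values 0.7, 0.3 and the conditional means are made up for illustration):

```python
import numpy as np

# Hypothetical discrete example: X takes values 0 and 1 with
# P(X = 1) = 0.3, and E(Y | X = x) = 2 + 5x.
p = np.array([0.7, 0.3])          # f_X(x) for x = 0, 1
cond_mean = np.array([2.0, 7.0])  # E(Y | X = x)

# Law of iterated expectations: E(Y) = sum_x E(Y | X = x) f_X(x)
ey = np.sum(cond_mean * p)
print(ey)  # 0.7*2 + 0.3*7 = 3.5
```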

7
Q

What is a least squares line?

A

An estimate of a linear relationship between two variables
Ŷ = a + bx, where Ŷ is the predicted value paired with a particular x, a is the estimated intercept, and b is the estimated slope

8
Q

What is one assumption imposed by an OLS estimate?

A

Equal variances (homoskedasticity): Var(εi) is the same for all observations

9
Q

How do you find the estimators for slope and intercept of a least squares line?

A

mina, b SSR, with ei = Yi − a − bXi
The FOCs are ∂SSR/∂a = 0 and ∂SSR/∂b = 0
For a: 0 = −2Σi ei, which gives a = Ȳ − bX̄
For b: 0 = Σi Xi ei = ΣXiYi − aΣXi − bΣXi², so substituting a gives ΣXiYi = (Ȳ − bX̄)ΣXi + bΣXi²
Expanding and rearranging: ΣXiYi − ȲΣXi = b(ΣXi² − X̄ΣXi), and since ΣXi = nX̄ this gives b = SXY/SX² (sample covariance over sample variance)
a and b are random variables with expectation equal to the population parameters
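The closed-form solutions can be checked numerically. A hypothetical sketch (data and true parameters α = 1.5, β = 0.8 are made up): compute a and b from the formulas above and compare against NumPy's least-squares polynomial fit, which minimizes the same SSR.

```python
import numpy as np

# Hypothetical data: Yi = 1.5 + 0.8*Xi + eps_i.
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 200)
y = 1.5 + 0.8 * x + rng.standard_normal(200)

# b = S_XY / S_X^2 and a = Ybar - b * Xbar, from the FOCs.
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()

# np.polyfit with deg=1 minimizes the same SSR, so it must agree.
b_ref, a_ref = np.polyfit(x, y, deg=1)
print(a, b)
```

Both routes give the same intercept and slope up to floating-point error.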

10
Q

How can the slope estimator be found from Cov(Y, X)?

A

Cov(Y, X) = Cov(α + βX + ε, X) = Cov(α, X) + βCov(X, X) + Cov(ε, X)
The first term is zero because α is a constant, and the last term is assumed to be zero, so Cov(Y, X) = βVar(X) and β = Cov(X, Y)/Var(X)
The estimator is b = [1/(n−1)]Σ(Yi − Ȳ)(Xi − X̄) / [1/(n−1)]Σ(Xi − X̄)²
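The covariance route can be sketched with NumPy's sample covariance matrix, which uses the same n − 1 denominator (a hypothetical example; the data and β = −1.3 are made up):

```python
import numpy as np

# Hypothetical data: Yi = 2.0 - 1.3*Xi + noise.
rng = np.random.default_rng(3)
x = rng.standard_normal(500)
y = 2.0 - 1.3 * x + 0.5 * rng.standard_normal(500)

# np.cov returns the 2x2 sample covariance matrix (ddof = 1 by default).
cov_matrix = np.cov(x, y)
b = cov_matrix[0, 1] / cov_matrix[0, 0]  # S_XY / S_X^2
print(b)  # close to the true beta = -1.3
```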

11
Q

What are the population and sample conditional expectation functions from OLS?

A

E(Y|X) = α + βX in the population, and the sample analogue is Ê(Y|X) = Ŷ = a + bX

12
Q

What relationship between Y and X does a linear model impose?

A

A constant marginal effect ∂Ŷi/∂Xi = b
This is not always appropriate

13
Q

What is the relationship between the slope estimator and the sample correlation coefficient?

A

b = r(SY/SX), where r is the sample correlation coefficient and SY, SX are the sample standard deviations
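This is an exact algebraic identity, since r = SXY/(SX SY) implies r(SY/SX) = SXY/SX². A quick hypothetical check (the data are made up):

```python
import numpy as np

# Hypothetical data: Yi = 0.4*Xi + noise.
rng = np.random.default_rng(4)
x = rng.uniform(-5, 5, 300)
y = 0.4 * x + rng.standard_normal(300)

# Slope via the correlation identity: b = r * S_Y / S_X.
r = np.corrcoef(x, y)[0, 1]
b_from_r = r * y.std(ddof=1) / x.std(ddof=1)

# Slope via sample covariance over sample variance.
b_ols = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
print(b_from_r, b_ols)  # identical up to rounding
```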
