Chapter 7 - Intro to Linear Regression Flashcards

1
Q

Simple Linear Regression

A

A regression model that explains the variation in a dependent variable in terms of the variation in a single independent variable.

2
Q

Variation in Y (linear regression)

A

Σ(Yi − Ȳ)². That is, variation in Y = the sum over all observations of (Y value − mean of Y)².

3
Q

Dependent variable

A

Y. Its variation is explained by the independent variable, X. Also known as the explained, endogenous, or predicted variable.

4
Q

Independent Variable

A

X. The variable that explains the variation in Y, the dependent variable. Also known as the explanatory, exogenous, or predicting variable.

5
Q

Linear Regression Model: Yi = b0 + b1Xi + εi, i = 1, …, n

A

Yi = ith observation of the dependent variable, Y; Xi = ith observation of the independent variable, X; b0 = regression intercept; b1 = regression slope coefficient; εi = residual (disturbance or error) for the ith observation.

6
Q

b0

A

regression intercept term

7
Q

b1

A

regression slope term

8
Q

εi

A

residual for the ith observation (disturbance / error).

9
Q

Ŷi = b̂0 + b̂1Xi, i = 1, 2, 3, …, n

A

Linear Equation/Regression line

10
Q

Regression line: Ŷi, b̂0, b̂1

A

Ŷi = estimated value of Yi given Xi; b̂0 = estimated intercept term; b̂1 = estimated slope term.

11
Q

Sum of Squared Errors SSE

A

Sum of the squared vertical distances between the actual Y values and the predicted Y values. The regression line is the line that minimizes this quantity.

12
Q

b̂1

A

Slope coefficient: the change in Y for a one-unit change in X. b̂1 = Cov(X, Y) / σ²X.

13
Q

b̂0

A

b̂0 = Ȳ − b̂1X̄. Intercept: the estimate of the dependent variable when the independent (X) variable is zero.

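The slope and intercept estimates from the two cards above can be sketched in plain Python. The data below are hypothetical toy values, not from the text; the point is only that b̂1 = Cov(X, Y) / Var(X) and b̂0 = Ȳ − b̂1X̄.

```python
def ols_fit(x, y):
    """Estimate b0-hat and b1-hat for simple linear regression."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Sample covariance of X and Y, and sample variance of X
    cov_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / (n - 1)
    var_x = sum((xi - x_bar) ** 2 for xi in x) / (n - 1)
    b1 = cov_xy / var_x       # slope: Cov(X, Y) / Var(X)
    b0 = y_bar - b1 * x_bar   # intercept: Y-bar minus slope times X-bar
    return b0, b1

# Toy data lying exactly on Y = 1 + 2X, so the fit recovers those values
b0, b1 = ols_fit([1, 2, 3, 4], [3, 5, 7, 9])  # b0 = 1.0, b1 = 2.0
```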
14
Q

Linear Regression Assumptions

A

1. A linear relationship exists between the dependent and independent variables.
2. The variance of the residual term (ε) is constant for all observations (homoskedasticity).
3. The residual term is independently distributed; that is, the residual for one observation is not correlated with that of another observation.
4. The residual term is normally distributed.
15
Q

Homoskedasticity

A

Prediction errors all have same variance.

16
Q

Heteroskedasticity

A

The assumption of homoskedasticity is violated: the residuals do not all have the same variance.

17
Q

ANOVA (analysis of variance)

A

Analyzes the total variability of the dependent variable, partitioning it into explained (SSR) and unexplained (SSE) components.

18
Q

Total sum of squares SST

A

SST = Σᵢ₌₁ⁿ (Yi − Ȳ)². Measures the total variation in the dependent variable. Equal to the sum of squared differences between each actual Y value and the mean of Y.

19
Q

Sum of squares regression (SSR)

A

Measures the variation in the dependent variable explained by the INDEPENDENT variable. The sum of squared differences between each PREDICTED Y and the mean of Y: SSR = Σ(Ŷi − Ȳ)².

20
Q

Sum of squared errors (SSE)

A

Measures the unexplained variation in the dependent variable. SSE = SST − SSR. The sum of squared vertical distances between each actual Y and its PREDICTED Y: SSE = Σ(Yi − Ŷi)².
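The three sums of squares from the last few cards can be checked numerically. This is a minimal sketch with assumed toy data: it fits the regression line first, then verifies that SST = SSR + SSE, which holds exactly for an OLS fit.

```python
# Hypothetical data (not from the text)
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.0]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n

# Fit Y-hat = b0 + b1*X by OLS
b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
      / sum((xi - x_bar) ** 2 for xi in x))
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]

sst = sum((yi - y_bar) ** 2 for yi in y)               # total variation
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)           # explained variation
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # unexplained variation
# For an OLS fit, sst == ssr + sse (up to floating-point rounding)
```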

21
Q

MSR (mean regression sum of squares)

A

MSR = SSR / k, where k is the number of slope coefficients. In simple linear regression, k = 1, so MSR = SSR.

22
Q

MSE (mean squared error)

A

MSE = SSE / (n − 2).

23
Q

SEE

A

Standard error of estimate: the square root of MSE, i.e., the standard deviation of the residuals. The lower the SEE, the better the model fit.

24
Q

R^2

A

Coefficient of determination: the percentage of the total variation in the dependent variable explained by the independent variable. R² = SSR / SST.

25
Q

F test

A

F = MSR / MSE. A one-tailed test of how well a set of independent variables explains the variation in the dependent variable. Reject H0 if F > Fc (the critical F value).
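The ANOVA quantities from the cards above chain together directly. A sketch using assumed sums of squares (the values n = 20, SST = 100, SSR = 80 are hypothetical):

```python
n = 20           # assumed number of observations
sst = 100.0      # assumed total sum of squares
ssr = 80.0       # assumed regression sum of squares
sse = sst - ssr  # unexplained variation: SSE = SST - SSR

r_squared = ssr / sst  # share of variation explained
msr = ssr / 1          # one slope coefficient, so MSR = SSR
mse = sse / (n - 2)    # mean squared error
see = mse ** 0.5       # standard error of estimate = sqrt(MSE)
f_stat = msr / mse     # compare with one-tailed F critical value, df = (1, n - 2)
```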

26
Q

Predicted Value ^Y

A

Ŷ = b̂0 + b̂1Xp, where Xp is the forecast value of the independent variable.

27
Q

Log - lin model

A

The dependent variable is logarithmic while the independent variable is linear: relative change in the dependent variable for an absolute change in the independent variable.

28
Q

Lin- log model

A

The dependent variable is linear while the independent variable is logarithmic: absolute change in the dependent variable for a relative change in the independent variable.

29
Q

Log-log

A

Both the dependent and the independent variable are logarithmic: relative change in the dependent variable for a relative change in the independent variable.
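A quick sketch of the log-log form from the last card, with assumed toy data generated from Y = 2·X^1.5: regressing ln(Y) on ln(X) recovers the exponent as the slope, which is why the log-log slope is read as a relative (percentage) change in Y per relative change in X.

```python
import math

# Hypothetical data generated from Y = 2 * X**1.5
x = [1.0, 2.0, 4.0, 8.0]
y = [2.0 * xi ** 1.5 for xi in x]

# Transform both variables to logs, then fit the slope by OLS
lx = [math.log(xi) for xi in x]
ly = [math.log(yi) for yi in y]
n = len(lx)
lx_bar, ly_bar = sum(lx) / n, sum(ly) / n
b1 = (sum((a - lx_bar) * (b - ly_bar) for a, b in zip(lx, ly))
      / sum((a - lx_bar) ** 2 for a in lx))
# b1 recovers the exponent 1.5: a 1% change in X goes with a ~1.5% change in Y
```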