Agresti chapter 3 Flashcards

1
Q

Regression line

A

Predicts the value for the response variable y as a straight-line function of the value x of the explanatory variable.

2
Q

Predicted value of y

A

y-hat

3
Q

Equation for the regression line (aka prediction equation)

A

y-hat = a + bx.
a denotes the y-intercept;
b denotes the slope.
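As a sketch, using the prediction equation is one line of Python arithmetic; the coefficients below are made-up values for illustration, not from the text:

```python
# Prediction equation: y-hat = a + b*x
a, b = 47.0, 5.0      # hypothetical y-intercept and slope
x = 4                 # value of the explanatory variable
y_hat = a + b * x     # predicted value of the response
print(y_hat)          # 67.0
```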

4
Q

Residuals

A

Prediction error: the actual value y minus the predicted value y-hat. Its absolute value is the vertical distance between the data point in the plot and the regression line.
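In code a residual is just a subtraction; the observation and prediction below are hypothetical:

```python
# Hypothetical observation y = 68, with the line predicting y-hat = 67
y, y_hat = 68, 67
residual = y - y_hat     # prediction error
print(residual)          # 1
print(abs(residual))     # 1: vertical distance from the point to the line
```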

5
Q

Residual sum of squares (RSS)

A

RSS = sum(residual)^2 = sum(y - y-hat)^2.

The better the line fits, the smaller the residuals and the smaller the RSS. The line that minimizes the RSS is the least squares line, and the method of finding it is the least squares method.
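The RSS can be sketched in a few lines of Python; the data below are made up for illustration:

```python
# Toy data (hypothetical): explanatory variable x, response variable y
x = [1, 2, 3, 4, 5]
y = [52, 60, 57, 68, 73]

def rss(a, b, x, y):
    """Residual sum of squares for the line y-hat = a + b*x."""
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

# For this data the least squares line works out to a = 47, b = 5;
# any other line gives a larger RSS.
print(rss(47, 5, x, y))    # 36
print(rss(40, 8, x, y))    # 146: a worse line, larger RSS
```

Minimizing this quantity over a and b is exactly what the least squares method does.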

6
Q

Slope formula

A

Slope = b = r(Sy / Sx), where r is the correlation between x and y, Sx is the standard deviation of x, and Sy is the standard deviation of y.

7
Q

Y-intercept formula

A

Y-intercept = a = y-bar - b(x-bar), where x-bar is the mean of x, y-bar is the mean of y, and b is the slope.

8
Q

Influential observation

A

An observation whose x value is relatively high or low compared to the rest of the data and that lies far from the trend of the other points; removing it substantially changes the regression results, such as the slope and correlation.

9
Q

Lurking variable

A

A third variable, not measured in the study, that may explain the observed association between two variables. Its possible existence is one reason correlation does not imply causation. It differs from a confounding variable, which is measured in the study and is associated with both the explanatory variable and the response variable.
