Week 8 Flashcards

1
Q

Correlations

A

A measure of whether two variables change together, or covary.
The value of the correlation coefficient can range from -1 to +1.

2
Q

Regression

A

Uses correlation to predict the values of one variable from another.
The prediction is made by finding a regression line that best represents the data.

3
Q

X axis

A

predictor

4
Q

Y axis

A

outcome

5
Q

regression equation

A

Y = b0 + b1X + e

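The equation on this card can be sketched numerically. A minimal example, using made-up data and NumPy's least-squares solver to estimate the intercept b0 and slope b1:

```python
import numpy as np

# Hypothetical data: X is the predictor, Y the outcome
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Design matrix with a column of ones so the model includes the intercept b0
A = np.column_stack([np.ones_like(X), X])
(b0, b1), *_ = np.linalg.lstsq(A, Y, rcond=None)

predicted = b0 + b1 * X   # the regression line
errors = Y - predicted    # e: the prediction errors (residuals)
```

For these numbers the fitted line is roughly Y = 0.09 + 1.99X, and each error term is the gap between an observed Y and the line.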
6
Q

Y

A

outcome

7
Q

b0

A

intercept

8
Q

b1

A

slope of the line

9
Q

X

A

predictor

10
Q

e

A

error

11
Q

Intercept

A

The point at which the regression line crosses the Y axis
The value of Y when X = 0

12
Q

Slope

A

A measure of how much Y changes as X changes
Regardless of its sign, the larger the value of b1, the steeper the slope
For one unit of change on the X axis, how much change is there on the Y axis?

13
Q

Residual or prediction error

A

the difference between the observed value of the outcome variable and what the model predicts

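As a sketch with made-up numbers, a residual is simply observed minus predicted:

```python
# Hypothetical observed outcomes and model predictions
observed = [3.0, 5.0, 7.0]
predicted = [2.5, 5.5, 7.0]

# Residual (prediction error) for each observation:
# positive where the model under-predicts, negative where it over-predicts
residuals = [obs - pred for obs, pred in zip(observed, predicted)]
```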
14
Q

Lines of best fit

A

a line that best represents the data
a line that minimises residuals

15
Q

residual sum of squares

A

Residuals can be positive or negative. If we simply added them, the positive ones would cancel out the negative ones, so we square them before adding them up. This total is called the sum of squared residuals, or residual sum of squares (SSr)
SSr is a gauge of how well the model fits the data

16
Q

Total sum squares

A

Uses the sample mean of the observed Y as a baseline model, assuming no relationship between Y and X
The total sum of squares (SSt) is the sum of squared differences between the observed Y values and the sample mean

17
Q

Model sum of squares

A

The sum of squared differences between the values predicted by the model and the sample mean. It represents the improvement from the baseline model to the regression model

18
Q

SSt

A

The total variance in the outcome variable; it can be partitioned into two parts (SSm and SSr)

19
Q

SSm

A

The variance explained by the model

20
Q

SSr

A

The variance not explained by the model; the residual or error variance

21
Q

R2

A

The proportion of variance in the outcome accounted for by the model: R2 = SSm / SSt
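The variance partition on these cards (SSt = SSm + SSr) and R2 can be sketched with made-up data. The identity holds exactly when the line is the least-squares fit, so the sketch fits the slope and intercept first:

```python
# Hypothetical data
X = [1.0, 2.0, 3.0, 4.0]
Y = [2.1, 3.9, 6.1, 7.9]
n = len(X)
mean_x, mean_y = sum(X) / n, sum(Y) / n

# Least-squares slope and intercept (so that SSt = SSm + SSr holds exactly)
b1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y))
      / sum((x - mean_x) ** 2 for x in X))
b0 = mean_y - b1 * mean_x
predicted = [b0 + b1 * x for x in X]

sst = sum((y - mean_y) ** 2 for y in Y)                    # total SS
ssm = sum((p - mean_y) ** 2 for p in predicted)            # model SS
ssr = sum((y - p) ** 2 for y, p in zip(Y, predicted))      # residual SS
r_squared = ssm / sst  # proportion of variance accounted for by the model
```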

22
Q

F ratio

A

F is the ratio of the explained variance to the unexplained variance
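A sketch with hypothetical sums of squares: each SS is divided by its degrees of freedom to give a mean square, and F compares the two mean squares:

```python
# Hypothetical values: SSm and SSr from a model with k predictors and n cases
ssm, ssr = 80.0, 20.0
n, k = 30, 1                 # 30 observations, one predictor

msm = ssm / k                # mean square for the model (explained variance)
msr = ssr / (n - k - 1)      # mean square for the residuals (unexplained variance)
f_ratio = msm / msr          # large F: the model explains much more than it leaves unexplained
```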

23
Q

Overall test

A

The hypothesis in regression can be phrased in various ways:
Can the scores on X predict the scores on Y via the regression line?
Does the model explain a significant amount of variance in the outcome variable?

24
Q

Unstandardised beta

A

The value of the slope, b1
For every one unit change in X, the change in the value of Y, in the outcome's units of measurement. It is important to look at whether b1 is positive or negative
If b1 is 0, there is no relationship between X and Y