Flashcards in Lecture 11 (Other) Deck (12):

1

## Be able to interpret the SPSS output from a simple linear regression (lookup...if you can figure a good way to put an answer for this, go for it!)

### Three tables matter. The Model Summary gives R, R Square, Adjusted R Square, and the standard error of the estimate. The ANOVA table gives the sums of squares for Regression, Residual, and Total, plus the F test and its Sig. value (whether the model predicts better than the mean). The Coefficients table gives the unstandardized B for the Constant (the intercept) and for the predictor (the slope), the standardized Beta, and a t test with Sig. for each.

2

## What is the difference between a simple linear regression and multiple linear regression?

### A simple linear regression has one predictor variable (IV); a multiple linear regression has two or more predictors (IVs). Both predict a single outcome (DV).

3

## Be able to interpret the regression coefficient in terms of the relationship between X and Y

### X predicts Y; the relationship is predictive, not necessarily causal. The regression coefficient (b) is the slope of the line: for every one-unit increase in X, the predicted value of Y (Y') changes by b units.
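A minimal sketch of this interpretation, using made-up data (the x and y values and all variable names here are illustrative, not from the deck): the least-squares slope b is exactly the change in Y' per one-unit increase in X.

```python
# Hypothetical data for illustration only.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Least-squares slope: b = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
    / sum((xi - mean_x) ** 2 for xi in x)
a = mean_y - b * mean_x  # intercept: predicted Y when X = 0

def predict(xi):
    return a + b * xi

# Moving from X = 2 to X = 3 (a one-unit increase) changes Y' by exactly b.
delta = predict(3) - predict(2)
```

Here b works out to 0.6, so each one-unit increase in X raises Y' by 0.6.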

4

## If the regression coefficient is zero, what does the resulting regression line look like? What does the predicted Y value (or Y') equal, or what is our best estimate of the predicted value of Y?

### The line would be horizontal (a slope of zero means X tells us nothing about Y). In that case the best estimate of the predicted value Y' is the mean of Y from the sample. The Y intercept is the estimated value of the DV when the IV is 0, and with a slope of 0 the intercept equals the mean of Y. In general, the stronger the relationship, the steeper the slope.
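A quick sketch of why the mean is the best guess when b = 0 (the y values here are made up): with the slope forced to zero, the prediction is just a constant a, and the constant that minimizes the sum of squared errors is the sample mean.

```python
# Hypothetical data for illustration only.
y = [2, 4, 5, 4, 5]
mean_y = sum(y) / len(y)  # the sample mean of Y

def sse(a):
    """Sum of squared errors if every prediction is the constant a."""
    return sum((yi - a) ** 2 for yi in y)

# The mean beats nearby alternative constants as a predictor.
err_at_mean = sse(mean_y)
err_lower = sse(mean_y - 0.5)
err_higher = sse(mean_y + 0.5)
```

For these values, the error at the mean (6.0) is smaller than at either alternative (7.25 each), consistent with Y' = mean of Y when the slope is 0.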

5

## In simple linear regression, how are the bivariate correlation coefficient (r) and the regression coefficient (b) similar? How are they different?

### Both are obtained/reported in a simple regression analysis and both estimate the degree of relationship between a predictor and an outcome. The correlation coefficient (r) is a standardized index of the strength and direction of the relationship; the regression coefficient (b) is unstandardized and is used to predict Y from X. The larger the r, the larger the b (for a given pair of standard deviations), and in simple regression the standardized regression coefficient equals r.

6

##
Be able to determine the proportion of variance explained by the predictor (or IV) and the proportion of variance due to error, if given the sum of squares due to regression (SSreg)

###
Sum of squares due to regression = the shared (explained) variance. The proportion explained by the predictor is SSreg / SStot, and the proportion due to error is 1 − (SSreg / SStot).

7

## Be able to determine the proportion of variance explained by the predictor (or IV) and the proportion of variance due to error, if given the sum of squares due to residual error (SSres)

###
Sum of squares due to residual error = the noise outside the overlap: the difference between each actual score and its predicted score, squared and summed. The proportion due to error is SSres / SStot, and the proportion explained is 1 − (SSres / SStot).

8

## Be able to determine the proportion of variance explained by the predictor (or IV) and the proportion of variance due to error, if given the sum of squares total (SStot)

###
Sum of squares total = the total variability of the DV.

The sum of squares can be partitioned: SStot = SSreg + SSres. You get each proportion by dividing SSreg (explained) and SSres (error) by SStot.
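The partition above can be sketched with made-up sums of squares (the numbers 3.6 and 2.4 are illustrative, not from the deck): divide each piece by SStot to get the proportions.

```python
# Hypothetical sums of squares for illustration only.
ss_reg = 3.6              # sum of squares due to regression (explained)
ss_res = 2.4              # sum of squares due to residual error
ss_tot = ss_reg + ss_res  # SStot = SSreg + SSres

prop_explained = ss_reg / ss_tot  # proportion of variance explained (this is R-squared)
prop_error = ss_res / ss_tot      # proportion due to error (equals 1 - R-squared)
```

Here 3.6 / 6.0 = 0.60 of the variance is explained and 2.4 / 6.0 = 0.40 is error, and the two proportions always sum to 1.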

9

## What do we get when we partition the sum of squares for the DV?

### We get the proportion of shared (explained) variance and the proportion due to error. From these we can calculate the coefficient of determination (R squared) and, by taking its square root, the correlation coefficient (r).

10

## Using the coefficient of determination, be able to identify how much of the explained variance is due to regression, and how much is attributed to error

### Squared r (R squared) is the proportion of shared variance: X% of the variance in Y is due to its relationship with X, and the rest is error. 1 − R squared is the proportion due to error.

11

## What is the relationship between the coefficient of determination and the sum of squares due to regression?

### R squared equals the sum of squares due to regression divided by the total sum of squares: R squared = SSreg / SStot.

12