214 (2) lecture 3 Flashcards
what is multiple regression?
Extends simple linear regression by allowing you to look at multiple predictors (IVs)
What does multiple regression show?
The relative importance of predictors, and whether an outcome (DV) is best predicted by a combination of variables
What does multiple regression allow you to do?
Control for some variables while testing the predictive power of others
What is the equation for multiple regression?
Y = A + B1X1 + B2X2 + …
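Worked example with hypothetical weights: if Y = 2 + 0.5X1 + 1.2X2, then for X1 = 10 and X2 = 5 the predicted outcome is Y = 2 + (0.5)(10) + (1.2)(5) = 13.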
What does multiple regression do?
- Predicts the outcome (Y) from multiple predictor variables (X1, X2, …)
- determines the degree of influence (i.e., weight) each predictor has in determining the outcome
- partial regression weight = the relation between Y and a given X after partialling out that X's relation with all other predictors (see the sketch below)
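A minimal sketch of what "partialling out" means, in Python with numpy and simulated data: the multiple-regression weight for X1 equals the simple slope of Y on the part of X1 left over after removing X2.

import numpy as np

rng = np.random.default_rng(0)
n = 200
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)          # correlated predictors
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

# Full multiple regression: design matrix [1, x1, x2]
X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Partial x2 out of x1, then regress y on what remains of x1
x1_resid = x1 - np.polyval(np.polyfit(x2, x1, 1), x2)
b1_partial = np.polyfit(x1_resid, y, 1)[0]

print(b[1], b1_partial)   # the two slopes match: the partial regression weight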
What is A (intercept) referred to as?
B0
What does multiple regression calculate?
The best regression weights with which to multiply the predictors to produce the best prediction of the outcome
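A minimal sketch of fitting those weights in Python with statsmodels (variable names and data invented here):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({"x1": rng.normal(size=100), "x2": rng.normal(size=100)})
df["y"] = 1.0 + 0.5 * df["x1"] + 2.0 * df["x2"] + rng.normal(size=100)

model = smf.ols("y ~ x1 + x2", data=df).fit()   # least-squares weights
print(model.params)                             # intercept (B0), B1, B2
print(model.rsquared, model.rsquared_adj)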
What data properties does MR accommodate?
Variables measured on different scales
- ratio (e.g. age, test scores)
- interval (e.g. Likert scale)
- nominal (e.g. biological sex); see the dummy-coding sketch below
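Nominal predictors must be dummy-coded before entering the model. A sketch with statsmodels, where C() handles the coding (data are invented):

import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "score": [70, 75, 80, 85, 90, 95, 72, 88],
    "age":   [20, 25, 30, 35, 40, 45, 22, 38],
    "sex":   ["F", "M", "F", "M", "F", "M", "F", "M"],
})
# C(sex) expands the nominal variable into a 0/1 dummy predictor
model = smf.ols("score ~ age + C(sex)", data=df).fit()
print(model.params)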
What assumptions does the data have to meet for MR?
- Normally distributed residuals (errors)
- Linear relationship between variables
- Homoscedasticity
What is homoscedasticity?
Variance of errors is the same across all levels of the IV. (i.e., criterion has an equal level of variability for each value of the predictor)
How do you check for homoscedasticity?
By visual examination of the standardized residuals (errors)
Looking for an even spread on either side of the line
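One way to do this check outside SPSS, sketched in Python (statsmodels + matplotlib, simulated data): a roughly even horizontal band of points suggests homoscedasticity.

import matplotlib.pyplot as plt
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(size=200)

model = sm.OLS(y, sm.add_constant(x)).fit()
std_resid = model.get_influence().resid_studentized_internal  # standardized residuals

plt.scatter(model.fittedvalues, std_resid)
plt.axhline(0)                      # the 'line' the spread should straddle evenly
plt.xlabel("Predicted values")
plt.ylabel("Standardized residuals")
plt.show()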
What is multicollinearity?
- when predictor variables correlate strongly with each other (e.g. r = .80+)
- only one variable is assigned 'predictive' value
- highly correlated predictors can cancel each other out
How do you test for multicollinearity?
Variance inflation factor (VIF)
What is variance inflation factor?
Shows how much the variance of a regression coefficient is inflated because of collinearity
- guidelines on acceptable VIF are mixed; a common maximum is ~8 to 10 (see the sketch below)
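A sketch of computing VIF in Python with statsmodels (variable names and data invented); each predictor gets its own VIF:

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
df = pd.DataFrame({"x1": rng.normal(size=100)})
df["x2"] = 0.9 * df["x1"] + 0.1 * rng.normal(size=100)  # strongly collinear

X = sm.add_constant(df)
for i in range(1, X.shape[1]):   # skip the constant column
    print(X.columns[i], variance_inflation_factor(X.values, i))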
What do R² and adjusted R² show in MR?
The amount of variance in the criterion explained by the predictors
What is R² (effect size) in MR?
- How well the predicted regression line approximates the actual data points (e.g., R² = 1.00 suggests the regression line perfectly fits the data)
- Amount of variance in the criterion explained by the regression equation within the current sample.
What is AR² (adjusted R²) in MR?
- Estimates the amount of variance in the criterion that the equation explains in the population (note: R² is calculated from the sample and is generally higher)
- Adjusts R² down with every IV added to the model
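For reference, the standard adjustment (with n cases and k predictors): AR² = 1 - (1 - R²)(n - 1)/(n - k - 1). E.g., R² = .30 with n = 50 and k = 3 gives AR² = 1 - (.70)(49/46) ≈ .25.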
What is standardized beta?
Beta scores derived from z-scores (i.e., all variables standardized before the regression)
What do standardized beta scores produce?
A regression equation with no intercept (standardizing all variables sets the intercept to 0)
Is the standardized beta equivalent to correlation with multiple predictors?
No
It is equivalent with one predictor, but not with multiple predictors, because each beta is adjusted for (partials out) the other predictors
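A quick numerical check of the single-predictor case, sketched in Python with simulated data:

import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=100)
y = 0.5 * x + rng.normal(size=100)

zx = (x - x.mean()) / x.std()
zy = (y - y.mean()) / y.std()

beta = np.polyfit(zx, zy, 1)[0]   # standardized beta
r = np.corrcoef(x, y)[0, 1]       # Pearson correlation
print(beta, r)                    # identical with one predictor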
How do you conduct multiple regression in SPSS?
Analyze → Regression → Linear; place the outcome in the Dependent box and the predictors in the Independent(s) box
What are the different approaches in MR?
Enter
Hierarchical
Stepwise
Backwards
What is the enter approach?
All predictors entered together
What is the hierarchical approach?
Enter variables in 'blocks' on a theoretical or practical basis
The amount of variance in the DV accounted for by each block is shown in the R²/AR² values
e.g. age might be entered into the equation first to control for its effects on the DV before entering other predictor variables (see the sketch below)
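A sketch of the hierarchical idea in Python with statsmodels (invented data): fit block 1 (the control variable only), then block 2, and compare R²:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 150
df = pd.DataFrame({"age": rng.uniform(18, 65, size=n), "x1": rng.normal(size=n)})
df["outcome"] = 0.05 * df["age"] + 0.8 * df["x1"] + rng.normal(size=n)

block1 = smf.ols("outcome ~ age", data=df).fit()        # block 1: control variable
block2 = smf.ols("outcome ~ age + x1", data=df).fit()   # block 2: add predictor of interest
print(block1.rsquared, block2.rsquared)
print("R2 change:", block2.rsquared - block1.rsquared)  # variance added by block 2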