214 (2) lecture 3 Flashcards

(30 cards)

1
Q

what is multiple regression?

A

Extends simple linear regression by allowing you to look at multiple predictors (IVs)

2
Q

What does multiple regression show?

A

The relative importance of predictors and if an outcome (DV) is best predicted by a combination of variables

3
Q

What does multiple regression allow you to do?

A

Control for some variables while testing the predictive power of others

4
Q

What is the equation for multiple regression?

A

Y = A + B1X1 + B2X2 + …
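
As an illustration of fitting this equation (not from the lecture; the data and variable names below are simulated), ordinary least squares recovers the weights:

```python
import numpy as np

# Simulated data: outcome Y built from two predictors X1, X2 (made-up values)
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.5 * x2 + rng.normal(scale=0.1, size=n)

# Design matrix with a leading column of 1s for the intercept A (i.e. B0)
X = np.column_stack([np.ones(n), x1, x2])

# Ordinary least squares solves for [A, B1, B2] at once
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b)  # approximately [2.0, 1.5, -0.5]
```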

5
Q

What does multiple regression do?

A
  • Predicts the outcome (Y) from multiple predictor variables (X1, X2 …)
  • determines the degree of influence (i.e. weight) each predictor has in determining the outcome
  • partial regression weight = the relation between Y and X after partialling out the relation of X with all other predictors
6
Q

What is A (intercept) referred to as?

A

B0

7
Q

What does multiple regression calculate?

A

The best regression weights with which to multiply the predictors to produce the best prediction of the outcome

8
Q

What data properties does MR accommodate?

A

Variables measured on different scales

  • ratio (e.g. age, test scores)
  • interval (e.g. Likert scale)
  • nominal (e.g. biological sex)
9
Q

What assumptions does the data have to meet for MR?

A
  • Normal distribution
  • Linear relationship between variables
  • Homoscedasticity
10
Q

What is homoscedasticity?

A

Variance of errors is the same across all levels of the IV. (i.e., criterion has an equal level of variability for each value of the predictor)

11
Q

How do you check for homoscedasticity?

A

By visual examination of the standardized residuals (errors)
Looking for an even spread on either side of the line
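
A minimal sketch of that check (simulated, made-up data): compute the standardized residuals and look at their spread across fitted values; plotting them (e.g. with matplotlib) should show an even band around zero if homoscedasticity holds:

```python
import numpy as np

# Simulated data with constant error variance (homoscedastic by construction)
rng = np.random.default_rng(1)
n = 150
x = rng.uniform(0, 10, size=n)
y = 3.0 + 0.8 * x + rng.normal(scale=1.0, size=n)

# Fit the regression and compute the residuals (errors)
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ b
residuals = y - fitted

# Simple standardization: divide by the residual standard deviation
std_resid = residuals / residuals.std(ddof=2)

# For the visual check: scatter std_resid against fitted values and look
# for an even spread on either side of zero, e.g.
#   plt.scatter(fitted, std_resid)
print(std_resid.mean())  # essentially 0: residuals centre on the line
```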

12
Q

What is multicollinearity?

A
  • when predictor variables correlate strongly with each other (e.g. r ≥ .80)
  • only one variable is assigned ‘predictive’ value
  • predictors can cancel each other out
13
Q

How do you test for multicollinearity?

A

Variance inflation factor (VIF)

14
Q

What is variance inflation factor?

A

Shows how much the variance of a regression coefficient is inflated because of collinearity
- guidelines on acceptable VIF are mixed; generally ~8 to 10
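
VIF can be computed directly from that definition: regress each predictor on the remaining predictors and take 1 / (1 − R²). A sketch with simulated, made-up data:

```python
import numpy as np

def vif(X, j):
    """VIF for predictor j: regress X[:, j] on the other predictors
    and return 1 / (1 - R^2) of that auxiliary regression."""
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(X)), others])
    b, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ b
    r2 = 1.0 - resid.var() / X[:, j].var()
    return 1.0 / (1.0 - r2)

# Simulated predictors: x2 is almost a copy of x1, x3 is independent
rng = np.random.default_rng(2)
n = 300
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.3, size=n)  # correlates ~.95 with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

print([round(vif(X, j), 2) for j in range(3)])
# x1 and x2 get inflated VIFs; x3 stays near 1
```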

15
Q

What do R² and adjusted R² show in MR?

A

The amount of variance in the criterion explained by the predictors

16
Q

What is R² (effect size) in MR?

A
  • How well the predicted regression line approximates the actual data points (e.g. R² = 1.00 suggests the regression line perfectly fits the data)
  • Amount of variance in the criterion explained by the regression equation within the current sample
17
Q

What is AR² (adjusted R²) in MR?

A
  • Amount of variance in the criterion that the equation explains in the population (note: R² is calculated from the sample and is generally higher)
  • Adjusts R² down with every IV added to the model
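
Both values follow from the residuals. A sketch with simulated data; the adjustment formula 1 − (1 − R²)(n − 1)/(n − k − 1), with k predictors, is the standard one rather than something given in the lecture:

```python
import numpy as np

# Simulated data: two predictors plus noise
rng = np.random.default_rng(3)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 0.6, -0.4]) + rng.normal(scale=1.0, size=n)

b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b

# R^2: proportion of criterion variance explained in this sample
ss_res = resid @ resid
ss_tot = (y - y.mean()) @ (y - y.mean())
r2 = 1 - ss_res / ss_tot

# Adjusted R^2: shrinks R^2 for every IV added (k = number of predictors)
k = X.shape[1] - 1
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)

print(round(r2, 3), round(adj_r2, 3))  # adj_r2 comes out slightly lower
```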
18
Q

What is standardized beta?

A

Beta scores derived from z-scores

19
Q

What do standardized beta scores produce?

A

Beta values with no intercept (z-scoring the variables sets the intercept to 0)

20
Q

Is the standardized beta equivalent to correlation with multiple predictors?

A

No

It is equivalent with 1 predictor but not with multiple
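
A sketch of why (simulated data): z-score the variables, fit without an intercept, and compare the beta to Pearson's r. With one predictor they coincide; with correlated predictors they diverge:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(scale=0.7, size=n)  # correlated with x1
y = 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)

def zscore(v):
    return (v - v.mean()) / v.std()

# One predictor: the standardized beta IS the Pearson correlation
zx, zy = zscore(x1), zscore(y)
beta_simple = (zx @ zy) / (zx @ zx)  # slope through the origin
r = np.corrcoef(x1, y)[0, 1]
print(np.isclose(beta_simple, r))  # True

# Two correlated predictors: betas no longer equal the raw correlations
Z = np.column_stack([zscore(x1), zscore(x2)])
betas, *_ = np.linalg.lstsq(Z, zy, rcond=None)  # no intercept: z-scores centre at 0
print(betas[0], r)  # these now differ
```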

21
Q

How do you conduct multiple regression in SPSS?

A
Analyze → Regression → Linear
Outcome into the DV box
Predictors into the IV box
22
Q

What are the different approaches in MR?

A

Enter
Hierarchical
Stepwise
Backwards

23
Q

What is the enter approach?

A

All predictors entered together

24
Q

What is the hierarchical approach?

A

Enter variables in ‘blocks’ on a theoretical or practical basis

The amount of variance in the DV accounted for by each block is shown in the R²/AR² values

e.g. age might be entered into the equation first to control for its effects on the DV before entering the other predictor variables

25
Q

What is the stepwise approach?

A

The best predictor is entered first, followed by the second best, and so on

26
Q

What is the backwards approach?

A

All predictors are entered and then removed until the prediction gets worse

27
Q

When is hierarchical regression useful?

A

When you want to control for the effects of some variables (e.g. demographic) on the criterion

28
Q

How do you run hierarchical regression in SPSS?

A

Analyze → Regression → Linear
Enter the variable of control interest (e.g. age) as the first block of IVs
Click Next and enter the remaining IVs as the second block
OK

29
Q

What does the SPSS output tell you in hierarchical regression?

A

Model 1 shows the variance in the DV explained by only the variable you controlled for (e.g. age)

Model 2 shows the variance in the DV explained by all the other IVs

30
Q

What is the preferred approach for MR?

A

The enter method