Week 4 Flashcards

Multiple regression (14 cards)

1
Q

What allows us to explore the impact of a number of predictor variables on one outcome variable?

A

A multiple regression

2
Q

Can a multiple regression include as many predictor variables as you want?

A

Yes, as long as you have a theoretical reason for doing so

3
Q

Which of these is not a theory-led type of regression analysis?
Forced entry
Simple linear regression
Hierarchical multiple regression
Stepwise multiple regression

A

Stepwise multiple regression (we don’t cover this on the course)

4
Q

What is the name of this type of multiple regression?
All variables are forced into the model at the same time; the researcher does not state a particular order in which the variables are entered

A

Forced entry, or, the Enter method in SPSS

5
Q

What is the name of this type of multiple regression?
The researcher decides the order in which the predictors are entered into the model: known predictors (based on previous research) are entered first, followed by new predictors (exploratory hypotheses)

A

Hierarchical regression
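A minimal sketch of hierarchical entry in Python with statsmodels. The data file and variable names (wellbeing, age, stress, sleep) are made up for illustration: known predictors go in as Block 1, the exploratory predictor is added in Block 2, and the change in R² is tested by comparing the nested models.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical data: one outcome plus known and new predictors
df = pd.read_csv("wellbeing.csv")  # assumed columns: wellbeing, age, stress, sleep

# Block 1: known predictors (entered together, i.e. forced entry within the block)
block1 = smf.ols("wellbeing ~ age + stress", data=df).fit()

# Block 2: add the new, exploratory predictor
block2 = smf.ols("wellbeing ~ age + stress + sleep", data=df).fit()

# Change in R-squared, and an F-test comparing the nested models
print("R2 change:", block2.rsquared - block1.rsquared)
print(anova_lm(block1, block2))
```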

6
Q

What is the name of this type of multiple regression?
Includes both forward and backward methods; the most controversial approach. The computer program selects the predictor that best predicts the outcome and enters it into the model first.

A

Stepwise methods

7
Q

Parts of a regression

A
  • The regression line (the model, the line of best fit)
  • How well the model represents the data (is it significant? Assessed using an ANOVA)
  • How much variance is accounted for by the model (effect size): the R² value
  • The relationship between each predictor and the outcome (the intercept and the Betas, standardised and unstandardised: how does Y change in relation to a change in X?)
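Each of these parts maps onto a specific piece of a fitted model's output. A minimal sketch with statsmodels (the data and the variable names x1, x2, y are made up for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Made-up data: two predictors (x1, x2) and one outcome (y)
rng = np.random.default_rng(1)
df = pd.DataFrame({"x1": rng.normal(size=100), "x2": rng.normal(size=100)})
df["y"] = 2 + 0.5 * df["x1"] - 0.3 * df["x2"] + rng.normal(size=100)

model = smf.ols("y ~ x1 + x2", data=df).fit()   # the model (line of best fit)

print(model.fvalue, model.f_pvalue)  # is the model significant? (the ANOVA)
print(model.rsquared)                # variance accounted for (R squared)
print(model.params)                  # intercept and unstandardised Bs
print(model.summary())               # full table, including t and p per predictor
```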
8
Q

(Lack of) multicollinearity - a strong correlation between predictor variables.

A

Check the VIF (Variance Inflation Factor): if the average VIF is substantially greater than 1, the regression may be biased. A VIF greater than 10, or a tolerance (1/VIF) below 0.1, indicates a problem.
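A sketch of how VIF and tolerance might be computed in Python with statsmodels, assuming the predictors sit in a data frame (the file name predictors.csv is hypothetical):

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Predictors only, one column per variable (file name is hypothetical)
X = sm.add_constant(pd.read_csv("predictors.csv"))  # include the intercept column

vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
).drop("const")

print(vif)                         # worry if any value is greater than 10
print("average VIF:", vif.mean())  # substantially above 1 suggests possible bias
print("tolerance:", 1 / vif)       # below 0.1 is a problem
```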

9
Q

Independent errors - for any two observations (data points) the residuals should not correlate; they should be independent.

A

Test using the Durbin-Watson test. The statistic ranges from 0 to 4; a value of 2 means the residuals are unrelated (a value below 2 indicates positive autocorrelation, a value above 2 negative autocorrelation). A value above 3 or below 1 is a definite problem; a value close to 2 suggests no issue.
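If the regression has been fitted with statsmodels (as in the sketches above), the statistic can be read straight off the residuals:

```python
from statsmodels.stats.stattools import durbin_watson

# model is a fitted statsmodels OLS result (see the earlier sketches)
dw = durbin_watson(model.resid)
print(round(dw, 2))  # about 2 = independent errors; below 1 or above 3 is a clear problem
```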

10
Q

Normally distributed errors - there is an even chance of points lying above and below the best-fit line

A

Visually examine histograms or Q-Q plots (looking for a bell curve or points close to the line); the Shapiro-Wilk or Kolmogorov-Smirnov tests can also be used, though both are sensitive to sample size and outliers.
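A sketch of those checks applied to the residuals of a statsmodels fit (model as in the earlier sketches; matplotlib and scipy assumed available):

```python
import matplotlib.pyplot as plt
import statsmodels.api as sm
from scipy import stats

resid = model.resid  # residuals from the fitted regression (see earlier sketches)

# Visual checks: histogram (roughly bell-shaped?) and Q-Q plot (points on the line?)
plt.hist(resid, bins=20)
sm.qqplot(resid, line="s")
plt.show()

# Formal tests (remember both are sensitive to sample size and outliers)
print(stats.shapiro(resid))                       # Shapiro-Wilk
print(stats.kstest(stats.zscore(resid), "norm"))  # Kolmogorov-Smirnov
```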

11
Q

Regarding the assumptions of a multiple regression, what do we mean by ‘non-zero variance’?
a. Predictor variables should have a variance (SD) of zero.
b. Outcome variables should have a variance (SD) of zero.
c. Predictor variables should have a variance (SD) of greater than zero.
d. Outcome variables should have a variance (SD) of greater than zero.

A

c. Predictor variables should have a variance (SD) of greater than zero.

12
Q

If a researcher wanted to conduct a multiple regression with 4 predictors, aiming to detect a large effect, what would be the minimum sample required?
a. 39
b. 40
c. 36
d. 31

A

a. 39 (determined by an a priori power analysis)
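One way to arrive at a figure like this is an a priori power calculation from the noncentral F distribution. A sketch with scipy, assuming Cohen's f² = 0.35 for a large effect, α = .05 and 80% power; the minimum it returns may differ by a point or two from tabled answers depending on the exact convention used:

```python
from scipy import stats

k = 4            # number of predictors
f2 = 0.35        # Cohen's f-squared for a "large" effect (assumed convention)
alpha, target = 0.05, 0.80

n = k + 2
while True:
    df_denom = n - k - 1
    f_crit = stats.f.ppf(1 - alpha, k, df_denom)       # critical F for the overall model test
    power = stats.ncf.sf(f_crit, k, df_denom, f2 * n)  # noncentral F with lambda = f2 * N
    if power >= target:
        break
    n += 1

print(n, round(power, 2))  # smallest N that reaches the target power
```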

13
Q

The Durbin-Watson test assesses which of the following?
a. For multicollinearity
b. For independent errors
c. For the significance of the model
d. For non-linearity

A

b. For independent errors

14
Q

Which of the following tells us whether an individual predictor is significantly associated with the outcome variable?
a. The ANOVA result
b. The R² value
c. The unstandardised Beta
d. The t-test results

A

d. The t-test results
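In a statsmodels fit (as in the earlier sketches) these per-predictor t-tests are the t and P>|t| columns of the coefficients table, and are also available directly:

```python
# From a fitted statsmodels OLS result: per-predictor t statistics and p-values
print(model.tvalues)  # t for each coefficient (including the intercept)
print(model.pvalues)  # the corresponding p-values
```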
