Regression Flashcards

1
Q

What is regression analysis?

A

Statistical analysis that models the relationship between an outcome and one or more predictor variables, allowing for adjustment of confounders

2
Q

Types of regression to know

A

linear - continuous numeric outcome; despite the name, it can also be used for curved relationships (e.g. with polynomial terms)
logistic - binary outcome; a transformation of the outcome (the log-odds) is modelled
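
Not part of the card: a minimal sketch of the two types in Python with statsmodels, using simulated data and made-up variable names.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    age = rng.uniform(20, 80, n)          # predictor
    X = sm.add_constant(age)              # adds the intercept term

    # Linear regression: continuous numeric outcome (e.g. blood pressure)
    bp = 100 + 0.5 * age + rng.normal(0, 10, n)
    linear_model = sm.OLS(bp, X).fit()

    # Logistic regression: binary outcome (e.g. disease yes/no)
    disease = rng.binomial(1, 1 / (1 + np.exp(-(-6 + 0.1 * age))))
    logistic_model = sm.Logit(disease, X).fit()

    print(linear_model.params)    # intercept and slope on the original scale
    print(logistic_model.params)  # coefficients on the log-odds scale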

3
Q

Simple vs multivariable regression

A

Simple regression has one predictor, so there is no accounting for confounders; multivariable regression includes several predictors and so adjusts for the other factors
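
Not from the card: a rough sketch (simulated data, invented variable names) of how an apparent association can shrink once a confounder is added to the model.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    smoking = rng.binomial(1, 0.3, n)               # confounder
    coffee = rng.normal(2, 1, n) + smoking          # exposure linked to smoking
    risk = 5 + 3 * smoking + rng.normal(0, 1, n)    # outcome driven by smoking, not coffee

    simple = sm.OLS(risk, sm.add_constant(coffee)).fit()
    adjusted = sm.OLS(risk, sm.add_constant(np.column_stack([coffee, smoking]))).fit()

    print(simple.params[1])    # coffee looks associated with risk (confounding)
    print(adjusted.params[1])  # effect shrinks towards 0 once smoking is adjusted for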

4
Q

Linear regression coefficients

A

Model: y = a + b1x1 + b2x2 + …
y is the outcome
a is the intercept: it shows the position of the line (it doesn’t always have a practical meaning) and appears as (Constant) in SPSS
b is the regression coefficient, measuring the association between the predictor and the outcome
b1, b2 etc. are partial regression coefficients, one for each variable in multiple regression, each interpreted assuming all other confounders remain constant
P-values are reported for all coefficients
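
Not in the card: a minimal sketch of where these coefficients appear when fitting a multiple linear regression in Python (statsmodels); the data and variable names are made up.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 300
    df = pd.DataFrame({"age": rng.uniform(20, 80, n), "bmi": rng.normal(27, 4, n)})
    df["sbp"] = 90 + 0.4 * df["age"] + 0.8 * df["bmi"] + rng.normal(0, 8, n)

    # y = a + b1*age + b2*bmi; the Intercept row is what SPSS labels (Constant)
    model = smf.ols("sbp ~ age + bmi", data=df).fit()
    print(model.params)    # a (Intercept), b1, b2
    print(model.pvalues)   # p-value for each coefficient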

5
Q

Use of dummy variables

A

Used for categorical confounders
One fewer dummy variable than the number of categories
Set one category as the reference category
Each dummy variable’s ‘b’ value is a comparison with the reference category
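
Beyond the card: a minimal sketch of creating dummy variables with pandas, dropping one level as the reference category (the "smoking" column is invented for illustration).

    import pandas as pd

    df = pd.DataFrame({"smoking": ["never", "former", "current", "never", "current"]})

    # 3 categories -> 2 dummy variables; the dropped first level acts as the reference
    dummies = pd.get_dummies(df, columns=["smoking"], drop_first=True)
    print(dummies)
    # In a regression, each dummy's b coefficient compares its category with the reference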

6
Q

Akaike/Bayesian information criterion

A

AIC/BIC: the lower, the better

Measure how well the model fits the data, penalising the number of parameters
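
Not in the card: an illustrative sketch (simulated data) of comparing two candidate models by AIC/BIC in statsmodels.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 300
    df = pd.DataFrame({"age": rng.uniform(20, 80, n), "bmi": rng.normal(27, 4, n)})
    df["sbp"] = 90 + 0.4 * df["age"] + rng.normal(0, 8, n)   # bmi has no real effect here

    m1 = smf.ols("sbp ~ age", data=df).fit()
    m2 = smf.ols("sbp ~ age + bmi", data=df).fit()

    # Lower AIC/BIC = better trade-off between fit and complexity
    print(m1.aic, m1.bic)
    print(m2.aic, m2.bic)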

7
Q

Residuals

A

Differences between the observed values and the values predicted by the model

Minimising the residuals overall (least squares minimises the sum of squared residuals) finds the line of best fit
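
As an illustration (not from the card): residuals and the sum of squares that ordinary least squares minimises, using statsmodels on simulated data.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    x = rng.uniform(0, 10, 100)
    y = 2 + 3 * x + rng.normal(0, 1, 100)

    model = sm.OLS(y, sm.add_constant(x)).fit()

    residuals = y - model.fittedvalues           # observed minus predicted
    print(np.allclose(residuals, model.resid))   # matches what statsmodels reports
    print(np.sum(residuals ** 2))                # the quantity least squares minimises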

8
Q

Linear regression assumptions

A

Continuous outcome
Predictor variables not dependent on each other (no multicollinearity)
Residuals normally distributed around 0
Residuals have constant variance
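
A minimal sketch (not part of the card, simulated data) of checking the residual assumptions after an OLS fit; constant variance is covered under homoscedasticity below.

    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(5)
    x = rng.uniform(0, 10, 200)
    y = 2 + 3 * x + rng.normal(0, 1, 200)
    model = sm.OLS(y, sm.add_constant(x)).fit()

    # Residuals should sit around 0 and look roughly normal
    print(model.resid.mean())                 # close to 0 by construction of OLS
    print(stats.shapiro(model.resid).pvalue)  # large p-value: no evidence against normality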

9
Q

Multicollinearity measure

A

Variance inflation factor (VIF): VIF = 1/(1 − R²), e.g. a VIF of 10 means 90% of that predictor’s variability is explained by the other predictors
Lower is better; values >5 should be investigated
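
Not from the card: a short sketch of computing VIF per predictor with statsmodels, using simulated, deliberately correlated predictors.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(6)
    n = 300
    age = rng.uniform(20, 80, n)
    weight = 0.8 * age + rng.normal(0, 5, n)     # strongly related to age
    bmi = rng.normal(27, 4, n)

    X = sm.add_constant(pd.DataFrame({"age": age, "weight": weight, "bmi": bmi}))
    for i, name in enumerate(X.columns):
        if name != "const":
            print(name, variance_inflation_factor(X.values, i))
    # age and weight show inflated VIFs; bmi should sit near 1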

10
Q

Homoscedasticity

A

Constant variance of the residuals across all values of the predictors and of the fitted outcome; one of the necessary assumptions for linear regression
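
An illustrative check (not from the card): a residuals-vs-fitted plot is the usual visual check, and the Breusch-Pagan test in statsmodels is one formal option.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_breuschpagan

    rng = np.random.default_rng(7)
    x = rng.uniform(0, 10, 200)
    y = 2 + 3 * x + rng.normal(0, 1, 200)

    X = sm.add_constant(x)
    model = sm.OLS(y, X).fit()

    lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(model.resid, X)
    print(lm_pvalue)   # a small p-value would suggest non-constant variance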

11
Q

Transformation for logistic regression

A

Binary outcomes are expressed as odds, and log(odds) is modelled because it is linear in the predictors; SPSS fits the log(odds) in the same way as a linear regression
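
A hedged sketch (not from the card) of the transformation: probability -> odds -> log(odds), and a logistic fit whose coefficients sit on the log-odds scale.

    import numpy as np
    import statsmodels.api as sm

    p = 0.2                    # probability of the outcome
    odds = p / (1 - p)         # 0.25
    log_odds = np.log(odds)    # the quantity logistic regression models linearly

    rng = np.random.default_rng(8)
    age = rng.uniform(20, 80, 300)
    disease = rng.binomial(1, 1 / (1 + np.exp(-(-6 + 0.1 * age))))

    model = sm.Logit(disease, sm.add_constant(age)).fit()
    print(model.params)   # intercept and slope, both on the log-odds scale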

12
Q

Interpreting logistic regression outcome

A
Back-transform by taking the exponential: exp(b coefficient × category value) gives the odds of the outcome (for binary variables, the category value is 0 or 1)
For numeric predictors, the odds ratio exp(b) tells you how the odds of the outcome change for every one-unit increase in the predictor
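
Not in the card: a minimal sketch of back-transforming logistic regression coefficients into odds ratios (simulated data, invented variable names).

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(9)
    n = 500
    age = rng.uniform(20, 80, n)
    smoker = rng.binomial(1, 0.3, n)
    disease = rng.binomial(1, 1 / (1 + np.exp(-(-6 + 0.05 * age + 0.7 * smoker))))

    X = sm.add_constant(np.column_stack([age, smoker]))
    model = sm.Logit(disease, X).fit()

    odds_ratios = np.exp(model.params)
    print(odds_ratios[1])   # odds ratio per one-year increase in age
    print(odds_ratios[2])   # odds ratio for smokers vs non-smokers (the reference)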
13
Q

Logistic regression assumptions

A
No multicollinearity (VIF can be used to check)
Binary outcome variable
No need to check the residuals; they do not have to be normally distributed