Topic 13 Flashcards

(9 cards)

1
Q

Multiple Regression

A

Linear regression with more than one explanatory variable (e.g. predicting a book's weight from its volume and whether it is paperback or hardback)
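As a sketch, the book example could be written as a model equation (the 0/1 indicator coding for cover type is an assumption, not stated on the card):

```latex
\text{weight} = \beta_0 + \beta_1 \cdot \text{volume} + \beta_2 \cdot \text{cover}_{\text{hardback}} + \varepsilon
```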

2
Q

Factor

A

A categorical predictor variable

3
Q

Reference Level

A

Levels are the different categorical values a factor can take. One level is chosen as the reference level; each other level gets a 0/1 indicator variable, so its coefficient is a comparison against the reference.
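A minimal sketch of this indicator (dummy) coding in plain Python; the function name and return shape are illustrative, not from the card:

```python
def dummy_code(values, reference):
    """Indicator coding for a factor: the reference level maps to all
    zeros, and every other level gets its own 0/1 column."""
    levels = [lvl for lvl in sorted(set(values)) if lvl != reference]
    rows = [[1 if v == lvl else 0 for lvl in levels] for v in values]
    return levels, rows
```

For the book example, `dummy_code(["paperback", "hardback", "paperback"], reference="paperback")` yields one `hardback` column with rows `[0], [1], [0]`.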

4
Q

Collinear Explanatory Variables

A

Two predictor variables that are correlated with each other; collinearity complicates model estimation and makes individual coefficients hard to interpret

5
Q

Adjusted R^2

A

Accounts for the number of explanatory variables by applying a penalty for each additional predictor
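The standard formula (not on the card, but implied by the definition), where n is the number of observations and p the number of predictors:

```latex
R^2_{\text{adj}} = 1 - (1 - R^2)\,\frac{n-1}{n-p-1}
```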

6
Q

Feature Selection: Backwards Elimination - Adjusted R^2 Approach

A
  • Start with the full model
  • Drop one variable at a time and record the adjusted R^2 of each smaller model
  • Pick the smaller model with the highest adjusted R^2
  • Repeat until none of the smaller models yields an increase in adjusted R^2
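The steps above can be sketched in numpy; `adjusted_r2` and `backward_eliminate` are illustrative names, and the greedy loop assumes exactly the stopping rule on the card:

```python
import numpy as np

def adjusted_r2(X, y):
    """Fit OLS with an intercept and return adjusted R^2."""
    n = len(y)
    A = np.column_stack([np.ones(n), X])          # add intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = ((y - A @ beta) ** 2).sum()
    tss = ((y - y.mean()) ** 2).sum()
    p = X.shape[1]                                 # number of predictors
    return 1 - (rss / (n - p - 1)) / (tss / (n - 1))

def backward_eliminate(X, y, names):
    """Greedy backwards elimination on adjusted R^2."""
    keep = list(range(X.shape[1]))
    best = adjusted_r2(X[:, keep], y)
    while len(keep) > 1:
        # adjusted R^2 of every model with one variable dropped
        scores = {j: adjusted_r2(X[:, [k for k in keep if k != j]], y)
                  for j in keep}
        j = max(scores, key=scores.get)
        if scores[j] <= best:                      # no smaller model improves
            break
        best = scores[j]
        keep.remove(j)
    return [names[k] for k in keep], best
```

On data where only some predictors matter, the noise predictors tend to be dropped because removing them raises adjusted R^2.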
7
Q

Feature Selection: Backwards Elimination - p-value Approach

A
  • Start with the full model
  • Drop the variable with the highest p-value and refit a smaller model
  • Repeat until all variables left in the model are significant
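A sketch of this loop in numpy. To stay dependency-free it uses |t| instead of the p-value itself: at a fixed degrees of freedom, the highest p-value is the smallest |t|, and |t| >= 2 roughly corresponds to significance at the 5% level. The function names and that cutoff are assumptions, not from the card:

```python
import numpy as np

def t_stats(X, y):
    """OLS t-statistics for each predictor (intercept excluded)."""
    n = len(y)
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    sigma2 = resid @ resid / (n - A.shape[1])      # residual variance
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(A.T @ A)))
    return (beta / se)[1:]                          # drop the intercept's stat

def backward_by_pvalue(X, y, names, t_crit=2.0):
    """Drop the least significant predictor until all pass |t| >= t_crit
    (smallest |t| = largest p-value, so this mirrors the p-value rule)."""
    keep = list(range(X.shape[1]))
    while len(keep) > 1:
        t = np.abs(t_stats(X[:, keep], y))
        worst = int(np.argmin(t))
        if t[worst] >= t_crit:                      # everything significant
            break
        keep.pop(worst)
    return [names[k] for k in keep]
```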
8
Q

Feature Selection: AIC Approach

A

  • Similar to the adjusted R^2 approach: penalises for adding more variables to the model
  • AIC quantifies the amount of information lost by simplifying the model (less information loss is better)
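For a Gaussian linear model, AIC reduces (up to an additive constant) to n*ln(RSS/n) + 2k, where k counts the fitted coefficients. A sketch, with the function name as an illustrative assumption:

```python
import numpy as np

def aic_gaussian(X, y):
    """AIC for an OLS fit with Gaussian errors, up to an additive constant:
    n*ln(RSS/n) + 2k, where k includes the intercept. Lower is better."""
    n = len(y)
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = ((y - A @ beta) ** 2).sum()
    return n * np.log(rss / n) + 2 * A.shape[1]
```

Dropping a genuinely useful predictor inflates RSS far more than the 2-per-coefficient penalty saves, so the full model keeps the lower AIC.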

9
Q
A