Model Complexity and Trade Offs Flashcards

(21 cards)

1
Q

How is model complexity increased?

A

By increasing the highest polynomial power of the model
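
As an illustration (a minimal numpy sketch; the helper name `polynomial_features` is hypothetical): each extra power adds a column, and therefore a parameter, to the design matrix.

```python
import numpy as np

# Hypothetical sketch: expand a single feature x into polynomial terms.
# Raising the highest power adds columns (parameters), increasing complexity.
def polynomial_features(x, degree):
    """Return a design matrix [1, x, x^2, ..., x^degree]."""
    return np.column_stack([x ** p for p in range(degree + 1)])

x = np.array([1.0, 2.0, 3.0])
print(polynomial_features(x, 2).shape)  # (3, 3): intercept, x, x^2
print(polynomial_features(x, 5).shape)  # (3, 6): three more parameters to fit
```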

2
Q

What is the main characteristic of underfitting?

A

Training and cross validation error are both high.

3
Q

What are the main characteristics of overfitting?

A

Training error is low but cross validation error is high.

4
Q

What is Bias?

A

Predictions are consistent, but poor model choices lead to systematically wrong predictions: the model does not capture the relationship between the features and the outcomes.

5
Q

What is Variance?

A

The model identifies the relationship between features and outcomes, but fits random noise in addition to the underlying function.

6
Q

What are the three sources of model error?

A

Bias, Variance, and Irreducible error

7
Q

What is irreducible error?

A

The error that remains because real-world data always contains some randomness.

8
Q

What are the causes of model bias?

A

The model misrepresents the data because information is missing, or because the model is overly simple. Associated with underfitting.

9
Q

What are the causes of model variance?

A

The output being highly sensitive to changes in input data. This is often due to overly complex or poorly fitted models.

10
Q

What is the bias-variance trade off?

A

Model adjustments that decrease bias often increase variance, so finding the best model means choosing the level of complexity that minimises the combined bias and variance error.
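
The trade-off can be seen in a small demo (a hypothetical numpy sketch, with made-up data): fit polynomials of rising degree to noisy samples of sin(2πx) and compare training error with error on held-out validation data.

```python
import numpy as np

# Hypothetical demo: low-degree fits underfit (both errors high), while a
# very high-degree fit overfits (training error low, validation error high).
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 20)
x_val = np.linspace(0.025, 0.975, 20)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(scale=0.2, size=20)
y_val = np.sin(2 * np.pi * x_val) + rng.normal(scale=0.2, size=20)

train_err, val_err = {}, {}
for degree in (1, 4, 15):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err[degree] = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    val_err[degree] = np.mean((np.polyval(coeffs, x_val) - y_val) ** 2)
    print(degree, train_err[degree], val_err[degree])
```

Training error always falls as the degree rises, but validation error is lowest at an intermediate degree.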

11
Q

What is Linear Model Regularisation?

A

A method of shrinkage that adds an adjustable regularisation strength parameter directly into the cost function.
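
A minimal sketch of such a cost function (hypothetical names, assuming a squared-error loss): the regularisation strength `lam` adds a penalty term directly onto the cost.

```python
import numpy as np

# Hypothetical regularised cost: mean squared error plus a penalty whose
# size is controlled by the adjustable strength parameter lam.
def regularised_cost(X, y, w, lam, penalty="ridge"):
    residuals = X @ w - y
    mse = np.mean(residuals ** 2)
    if penalty == "ridge":              # L2: sum of squared coefficients
        reg = lam * np.sum(w ** 2)
    else:                               # lasso, L1: sum of absolute values
        reg = lam * np.sum(np.abs(w))
    return mse + reg

X = np.array([[1.0], [2.0]])
y = np.array([1.0, 2.0])
w = np.array([1.0])
print(regularised_cost(X, y, w, 0.0))  # lam = 0 recovers the plain MSE
print(regularised_cost(X, y, w, 0.5))  # nonzero lam penalises the weights
```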

12
Q

Why is Linear Model Regularisation used?

A

It allows us to manage the complexity trade-off: more regularisation produces a simpler, more biased model.

13
Q

What are two approaches to regularisation?

A

Ridge Regression and LASSO

14
Q

How does Ridge Regression work?

A

The penalty (lambda) is applied proportionally to squared coefficient values.
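
A minimal numpy sketch of this (using ridge regression's closed-form solution w = (XᵀX + λI)⁻¹Xᵀy, intercept omitted for brevity): increasing lambda shrinks the fitted coefficients.

```python
import numpy as np

# Hypothetical sketch: ridge regression via the closed-form solution.
# Larger lambda penalises squared coefficients more, shrinking them.
def ridge_fit(X, y, lam):
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=50)

w_small = ridge_fit(X, y, lam=0.01)   # near the unregularised solution
w_large = ridge_fit(X, y, lam=100.0)  # heavily shrunk coefficients
print(np.linalg.norm(w_small), np.linalg.norm(w_large))
```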

15
Q

How does LASSO work?

A

The penalty (lambda) is applied proportionally to absolute coefficient values.

16
Q

What does LASSO stand for?

A

Least Absolute Shrinkage and Selection Operator

17
Q

What is L1 Regularisation?

A

LASSO

18
Q

What is L2 Regularisation?

A

Ridge Regression

19
Q

How is the L1 Norm Calculated?

A

The sum of the absolute vector values

20
Q

How is the L2 norm calculated?

A

The square root of the sum of squared vector values
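
These two definitions can be checked directly in numpy (a minimal sketch with an example vector):

```python
import numpy as np

# L1 norm: sum of absolute values; L2 norm: square root of sum of squares.
w = np.array([3.0, -4.0])
l1 = np.sum(np.abs(w))        # 3 + 4 = 7
l2 = np.sqrt(np.sum(w ** 2))  # sqrt(9 + 16) = 5
print(l1, l2)  # 7.0 5.0
```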

21
Q

How does regularisation perform feature selection?

A

It performs feature selection by shrinking the contribution of the features; LASSO in particular can shrink coefficients all the way to zero, removing those features entirely.
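
For example, LASSO's coordinate-wise update applies a soft-thresholding operator (a hypothetical minimal sketch): coefficients whose magnitude falls below the penalty are set exactly to zero, dropping those features.

```python
import numpy as np

# Hypothetical sketch of soft-thresholding, the core of LASSO updates:
# shrink each coefficient toward zero by lam, clamping small ones to zero.
def soft_threshold(w, lam):
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([3.0, -0.2, 0.5])
print(soft_threshold(w, 0.4))  # the -0.2 coefficient is zeroed out
```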