Bias and Variance Flashcards

1
Q

What is variance?

A

This is a measure of how spread out the values of a data set are.

2
Q

What is the bias?

A

This is how close the predicted values are, on average, to the target value, i.e. how accurate they are.

3
Q

What happens to the bias and the variance when we overfit?

A

When we start overfitting, the more flexible model can represent the function better, which lowers the bias, but it also becomes more dependent on the particular data set, so the variance becomes higher.
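A minimal simulation sketch of this trade-off (not from the card; the sine target true_f, the noise level, the sample size, and the polynomial degrees are illustrative assumptions): fit polynomials of increasing degree to many independently drawn training sets and estimate the bias² and variance of the predictions at fixed test points.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # The underlying function f the models try to learn (an assumption).
    return np.sin(2 * np.pi * x)

def sample_training_set(n=30, noise=0.3):
    # Draw one training data set D with additive Gaussian noise.
    x = rng.uniform(0.0, 1.0, n)
    y = true_f(x) + rng.normal(0.0, noise, n)
    return x, y

x_test = np.linspace(0.0, 1.0, 50)
f_test = true_f(x_test)

for degree in (1, 3, 9):
    # Predictions at the test points of the model learned from each data set D.
    preds = np.array([
        np.polyval(np.polyfit(*sample_training_set(), degree), x_test)
        for _ in range(200)
    ])
    mean_pred = preds.mean(axis=0)                 # E_D[y] at each test point
    bias2 = np.mean((mean_pred - f_test) ** 2)     # (E_D[y] - f)^2, averaged over x
    variance = np.mean(preds.var(axis=0))          # E_D[(y - E_D[y])^2], averaged over x
    print(f"degree {degree}: bias^2 = {bias2:.3f}, variance = {variance:.3f}")
```

Higher-degree fits track the training data more closely (bias² shrinks) but swing more from one data set to the next (variance grows).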

4
Q

What is the equation for the bias-variance decomposition?

A

$\mathbb{E}_D[(y - f)^2] = \mathbb{E}_D[(y - \mathbb{E}_D[y])^2] + \mathbb{E}_D[(\mathbb{E}_D[y] - f)^2]$

where $y$ is the model's prediction, $f$ is the true target value, and $\mathbb{E}_D$ is the expectation over training data sets $D$.
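A sketch of the standard derivation (assuming the true value $f$ is fixed and $\mathbb{E}_D$ is taken over training sets $D$): insert $\pm\,\mathbb{E}_D[y]$ and expand; the cross term vanishes because $\mathbb{E}_D[y - \mathbb{E}_D[y]] = 0$, and since $\mathbb{E}_D[y] - f$ does not depend on $D$, the last term can be written with or without the outer expectation.

```latex
\begin{aligned}
\mathbb{E}_D\big[(y - f)^2\big]
  &= \mathbb{E}_D\Big[\big((y - \mathbb{E}_D[y]) + (\mathbb{E}_D[y] - f)\big)^2\Big] \\
  &= \mathbb{E}_D\big[(y - \mathbb{E}_D[y])^2\big]
     + 2\,(\mathbb{E}_D[y] - f)\,\mathbb{E}_D\big[y - \mathbb{E}_D[y]\big]
     + (\mathbb{E}_D[y] - f)^2 \\
  &= \underbrace{\mathbb{E}_D\big[(y - \mathbb{E}_D[y])^2\big]}_{\text{variance}}
     \;+\; \underbrace{(\mathbb{E}_D[y] - f)^2}_{\text{bias}^2}
\end{aligned}
```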

5
Q

Which part of the decomposition is the variance?

A

Var $= \mathbb{E}_D[(y - \mathbb{E}_D[y])^2]$. This is the expected squared difference between the model's prediction and the mean prediction over all data sets.

6
Q

Which part of the decomposition is the bias²?

A

Bias² $= \mathbb{E}_D[(\mathbb{E}_D[y] - f)^2] = (\mathbb{E}_D[y] - f)^2$. This is the squared distance between the mean prediction and the actual target value.
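A quick numeric sanity check that the two terms add up to the expected squared error (illustrative only; the fake prediction matrix preds, the 0.5 offset, and the 0.3 spread are assumptions, not from the cards):

```python
# Check that E_D[(y - f)^2] = variance + bias^2 for a matrix of predictions
# over many training sets D (rows) at fixed test points (columns).
import numpy as np

rng = np.random.default_rng(1)
f = np.linspace(-1.0, 1.0, 20)                        # stand-in true values f
# Fake "models": biased upward by 0.5, varying across data sets with std 0.3.
preds = f + 0.5 + 0.3 * rng.standard_normal((1000, 20))

expected_sq_error = np.mean((preds - f) ** 2)              # E_D[(y - f)^2]
variance = np.mean((preds - preds.mean(axis=0)) ** 2)      # E_D[(y - E_D[y])^2]
bias2 = np.mean((preds.mean(axis=0) - f) ** 2)             # (E_D[y] - f)^2

print(f"{expected_sq_error:.6f} vs {variance + bias2:.6f}")
```

The two printed numbers agree, because the identity also holds exactly when E_D is replaced by the empirical average over the sampled data sets.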

7
Q

What happens to the bias and the variance during training (and overfitting)?

A

We are shifting the error from the bias to the variance

8
Q

What happens to the expected error of a model during training?

A

The expected error of the model does not change; it is shared between the bias and the variance, and during training it shifts from one to the other.

9
Q

How can the bias and variance be seen on this graph? [Picture 8]

A

> The bias is the distance between the average of all the possible models that can be learned (from different data sets) and the original function.

> If we take all possible models, the difference between each model and that average is the variance.
