Basics Flashcards

1
Q

Give examples of self-supervised learning

A

Images: predicting the arrangement of shuffled patches, inpainting (reconstructing masked regions). NLP: auto-regressive LLMs (next-token prediction).
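The auto-regressive case can be made concrete: the supervision signal is manufactured from the raw sequence itself, with no human labels. A minimal sketch (the helper name is hypothetical):

```python
# Self-supervised labels for an auto-regressive LM: every prefix of the
# sequence becomes a training input, the next token becomes its target.
def make_autoregressive_pairs(tokens):
    """Turn a raw token sequence into (context, target) training pairs."""
    return [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

tokens = ["the", "cat", "sat", "down"]
pairs = make_autoregressive_pairs(tokens)
# The model sees each context and must predict the target:
# [(['the'], 'cat'), (['the', 'cat'], 'sat'), (['the', 'cat', 'sat'], 'down')]
```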

2
Q

Explain the bias-variance trade-off.

A

In the case of MSE, the expected generalisation error decomposes as irreducible noise + bias^2 + variance. Bias is E[f(x)] - f*(x); variance is E[(f(x) - E[f(x)])^2]. The expectation is taken over training datasets of finite size.

3
Q

What is the empirical risk minimisation principle?

A

Ideal goal: find the hypothesis which minimises the expected (over the data distribution) error. In practice we can only measure the error on a finite training dataset. The principle says we should pick the hypothesis with the lowest error (empirical risk) on the training set.
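A toy illustration of the principle, assuming a finite hypothesis class of 1-D threshold classifiers (the data and class are hypothetical): ERM simply selects the hypothesis with the lowest training error.

```python
# ERM over a finite hypothesis class: threshold classifiers
# h_t(x) = 1 if x >= t else 0.
data = [(0.1, 0), (0.4, 0), (0.5, 1), (0.9, 1), (0.35, 0)]

def empirical_risk(t):
    """Fraction of training points misclassified by threshold t."""
    return sum((x >= t) != bool(y) for x, y in data) / len(data)

thresholds = [i / 10 for i in range(11)]
best = min(thresholds, key=empirical_risk)   # the ERM choice
print(best, empirical_risk(best))            # -> 0.5 0.0
```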

4
Q

What is the VC dimension? What is the VC dimension of logistic regression?

A

It’s the largest dataset size for which every possible labelling can be achieved by some classifier in the class (the points are shattered). Logistic regression is a linear classifier, so in d dimensions its VC dimension is d + 1. In 2D it’s three: every labelling of a “triangle” of three points in general position can be achieved, whereas the “square” with zeros on one diagonal and ones on the other (XOR) is not separable by a straight line.
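A small sketch of the 2D case (illustrative; uses random search over linear classifiers sign(w·x + b)): each of the 2^3 labellings of a triangle of points admits a separator, while the XOR labelling of the square provably does not.

```python
import itertools
import random

# Verify that 3 non-collinear points in 2D are shattered by linear
# classifiers, via random search over (w, b). Illustrative sketch only.
points = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
rng = random.Random(0)

def separates(w, b, labels):
    """True if sign(w.x + b) reproduces the given labelling of `points`."""
    return all((w[0] * x + w[1] * y + b > 0) == bool(l)
               for (x, y), l in zip(points, labels))

def shattered():
    """True if every one of the 2^3 labellings has a linear separator."""
    for labels in itertools.product([0, 1], repeat=3):
        found = any(
            separates((rng.uniform(-3, 3), rng.uniform(-3, 3)),
                      rng.uniform(-3, 3), labels)
            for _ in range(20000))
        if not found:
            return False
    return True

print(shattered())
# The XOR labelling of the 4 square corners has no separator:
# w.x + b > 0 at (0,0) and (1,1) implies w1 + w2 + 2b > 0, while
# w.x + b < 0 at (1,0) and (0,1) implies w1 + w2 + 2b < 0 -- contradiction.
```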
