Exercise 2 - ML part 2 Flashcards

1
Q

what does the model M do?

A

encodes, stores, and retrieves the outcomes of a learning process

2
Q

What does H do?

A

The hypothesis space determines which aspects of the data are captured and how they are represented

3
Q

What is it called when you combine hypotheses?

A

Ensemble method

4
Q

What is boosting?

A

computes a strong learner by incrementally constructing an ensemble of hypotheses
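The incremental construction can be sketched as follows. AdaBoost-style sample re-weighting over 1-D decision stumps is an assumption for illustration; the card does not name a specific boosting algorithm.

```python
import numpy as np

def boost(X, y, rounds=5):
    """Build an ensemble incrementally: each round adds the decision stump
    with the lowest weighted error, then re-weights the training samples."""
    n = len(X)
    w = np.full(n, 1.0 / n)                      # sample weights
    ensemble = []                                # (alpha, threshold, sign)
    for _ in range(rounds):
        best = None
        for t in X:                              # candidate thresholds
            for s in (+1.0, -1.0):               # stump orientation
                pred = s * np.sign(X - t + 1e-12)
                err = float(w[pred != y].sum())
                if best is None or err < best[0]:
                    best = (err, t, s)
        err, t, s = best
        err = max(err, 1e-12)                    # avoid division by zero
        alpha = 0.5 * np.log((1.0 - err) / err)  # weight of this hypothesis
        pred = s * np.sign(X - t + 1e-12)
        w *= np.exp(-alpha * y * pred)           # emphasize misclassified samples
        w /= w.sum()
        ensemble.append((alpha, t, s))
    return ensemble

def ensemble_predict(ensemble, X):
    """Weighted vote of all weak hypotheses."""
    total = sum(a * s * np.sign(X - t + 1e-12) for a, t, s in ensemble)
    return np.sign(total)
```

Each round adds one weak hypothesis and shifts weight onto the samples it got wrong, so the next stump focuses on the hard cases.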

5
Q

The law of parsimony as stated in the principle of Occam's razor

A

Of two competing theories, the simpler explanation of an entity is to be preferred

6
Q

What does it mean to discriminate data?

A

compute a prediction y for an input x

7
Q

What are discriminative models based on?

A

posterior probabilities P(y|x)

8
Q

What are generative models based on?

A

class-conditional likelihoods P(x|y)

  • -> How likely is it to observe data x given a certain label y?
  • -> The posterior P(y|x) can then be computed via Bayes’ theorem
  • -> Generative models are compact representations that have considerably fewer parameters
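A minimal sketch of the Bayes' theorem step: the generative model supplies P(x|y) and the class prior P(y), and the classifier picks the label maximizing P(x|y)·P(y), since the evidence P(x) cancels in the comparison. One-dimensional Gaussians per class are an assumption for illustration.

```python
import math

def fit(xs, ys):
    """Estimate a per-class Gaussian likelihood (mu, var) and prior P(y)."""
    model = {}
    for label in set(ys):
        pts = [x for x, y in zip(xs, ys) if y == label]
        mu = sum(pts) / len(pts)
        var = sum((p - mu) ** 2 for p in pts) / len(pts) or 1e-9
        model[label] = (mu, var, len(pts) / len(xs))   # (mu, var, prior)
    return model

def classify(model, x):
    """argmax_y P(x|y) * P(y) -- the evidence P(x) cancels."""
    def score(params):
        mu, var, prior = params
        lik = math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
        return lik * prior
    return max(model, key=lambda y: score(model[y]))
```

The compactness claim shows up here: each class is summarized by just a mean, a variance, and a prior, regardless of how many training points it has.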
9
Q

How can overfitting be detected?

A

By applying h to unseen data samples
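A minimal sketch of this check, assuming a polynomial hypothesis class and synthetic 1-D data: fit h on training samples, then evaluate it on samples it has never seen. A large gap between training and test error signals overfitting.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 30)
y = x ** 2 + rng.normal(0.0, 0.05, 30)        # noisy quadratic (assumed data)

x_train, y_train = x[:20], y[:20]             # samples seen while fitting h
x_test, y_test = x[20:], y[20:]               # unseen samples

def mse(h, xs, ys):
    """Mean squared error of polynomial h on (xs, ys)."""
    return float(np.mean((np.polyval(h, xs) - ys) ** 2))

h_simple = np.polyfit(x_train, y_train, 2)    # matches the true complexity
h_complex = np.polyfit(x_train, y_train, 9)   # enough capacity to overfit

# An overfit h has a low training error but a much higher error on unseen data.
gap_simple = mse(h_simple, x_test, y_test) - mse(h_simple, x_train, y_train)
gap_complex = mse(h_complex, x_test, y_test) - mse(h_complex, x_train, y_train)
```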

10
Q

Avoidance of overfitting through

A
  • regularization
  • more training data (increasing the complexity of the dataset)
  • dataset augmentation (e.g. adding noise)
11
Q

value domain of learning methods

A
  • discrete (classification)
  • continuous (regression)

12
Q

different models

A

deterministic - stochastic
parametric - nonparametric
generative - discriminative

13
Q

nonparametric

A

An easy-to-understand nonparametric model is the k-nearest neighbors algorithm, which makes predictions for a new data instance based on its k most similar training patterns. The method assumes nothing about the form of the mapping function other than that patterns which are close are likely to have a similar output variable.
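A minimal 1-D sketch of the idea; k=3 and the toy training set are assumptions. Note that the "model" is simply the stored training data, and prediction is a majority vote among the closest patterns.

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Predict the majority label among the k training patterns
    closest to `query` (1-D distance for simplicity)."""
    neighbors = sorted(train, key=lambda p: abs(p[0] - query))[:k]
    labels = [label for _, label in neighbors]
    return Counter(labels).most_common(1)[0][0]

# toy training patterns: (value, label)
train = [(0.5, 'a'), (1.0, 'a'), (1.5, 'a'),
         (8.0, 'b'), (9.0, 'b'), (9.5, 'b')]
```

There is no fitting step at all, which is what makes the method nonparametric: its capacity grows with the training set instead of being fixed in advance.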

14
Q

parametric

A

e.g. a linear function

- you make assumptions about the form of the mapping function

15
Q

Types of reasoning

A
  • inductive
  • deductive
  • transductive
16
Q

perceptron

A

Linear classifier based on a single neuron with a hard threshold function. It outputs +1 if w·x >= 0 and -1 if w·x < 0.
- the perceptron criterion penalizes only incorrectly classified samples

17
Q

Can the perceptron learning rule find a solution?

A

If a solution exists, i.e. if the data set is linearly separable, then the perceptron learning algorithm finds a solution within a finite number of steps
- the solution found depends on the initialization of the parameters and on the order of presentation of the training samples
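The learning rule can be sketched as follows: update w only on misclassified samples, w ← w + η·yᵢ·xᵢ. The toy data, learning rate, and zero initialization below are assumptions for illustration.

```python
import numpy as np

def train_perceptron(X, y, eta=1.0, max_epochs=100):
    """Perceptron learning rule; labels y must be +1/-1."""
    X = np.hstack([X, np.ones((len(X), 1))])  # fold the bias in as a constant input
    w = np.zeros(X.shape[1])                  # initialization affects which solution is found
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):              # presentation order also affects the solution
            if yi * (w @ xi) <= 0:            # misclassified (or on the boundary)
                w += eta * yi * xi            # perceptron update
                errors += 1
        if errors == 0:                       # converged: data was linearly separable
            return w
    return w                                  # may never converge, e.g. on XOR

def predict(w, X):
    X = np.hstack([X, np.ones((len(X), 1))])
    return np.where(X @ w >= 0, 1, -1)
```

On a linearly separable set such as the AND function the loop terminates with zero errors; on XOR it runs until max_epochs, which is exactly the limitation discussed in the next card.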

18
Q

What led to the AI winter?

A
  • The perceptron learning rule only converges for linearly separable data sets. It therefore cannot classify the XOR dataset correctly. This led to the abandonment of connectionism for almost two decades.
19
Q

Examples of basis functions

A
  • linear functions
  • sigmoid functions, etc.
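A short sketch of how such basis functions enter a generalized linear model, y(x) = Σⱼ wⱼ·φⱼ(x): the model stays linear in the weights even when the basis functions are nonlinear. The particular target function and the constant/linear/sigmoid bases below are assumptions for illustration.

```python
import numpy as np

def design_matrix(x):
    """Columns are basis function values: constant, linear, and sigmoid."""
    sigmoid = 1.0 / (1.0 + np.exp(-x))
    return np.column_stack([np.ones_like(x), x, sigmoid])

# Targets generated from the same bases with weights (2.0, 0.5, 3.0):
x = np.linspace(-3, 3, 50)
t = 2.0 + 0.5 * x + 3.0 / (1.0 + np.exp(-x))

# Because the model is linear in w, an ordinary least-squares fit suffices.
Phi = design_matrix(x)
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)
```

This is why an analog neuron with a fixed nonlinearity can still be treated as a generalized linear model: the nonlinearity sits in the basis, not in the weights.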

20
Q

What is interpolation?

A

a function that passes exactly through every training data instance

21
Q

Is an analog neuron model a generalized linear model?

A

yes

22
Q

Issues of Least Squares Linear Classification

A

sensitive to outliers

23
Q

What solves the problem of Least Squares Linear Classification?

A

Support Vector Machines (SVMs): they minimize the generalization error by maximizing the margin between the classes.