C2 Flashcards

1
Q

perceptron training for classification task

A

try to find values for the weights such that the training examples are correctly classified

case of two classes: try to find a hyperplane that separates the examples of the two classes (the data set is linearly separable if such a hyperplane exists)
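Stated formally (a sketch in notation that is not from the card, with targets d_i in {+1, -1}):

\text{linearly separable} \iff \exists\, \mathbf{w}, b \ :\ d_i\,(\mathbf{w}\cdot\mathbf{x}_i + b) > 0 \ \ \forall i,
\qquad \text{perceptron output: } y(\mathbf{x}) = \operatorname{sign}(\mathbf{w}\cdot\mathbf{x} + b)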

2
Q

perceptron learning algorithm

A

initialize weights w randomly
while (there are misclassified training examples):
    select a misclassified example (x, d)
    w_new = w_old + eta * d * x

d = the desired (target) output of the selected example, with d in {+1, -1}

if x is misclassified and d = 1, w·x should become bigger
if x is misclassified and d = -1, w·x should become smaller
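A runnable sketch of this rule in Python/NumPy (the function name, the learning-rate value, and the sweep over all examples instead of a random pick are my own illustration):

import numpy as np

def perceptron_train(X, d, eta=1.0, max_epochs=100):
    """X: (n_examples, n_features); d: targets in {+1, -1}.
    A bias can be handled by appending a constant 1 feature to X."""
    w = np.random.randn(X.shape[1])          # initialize weights w randomly
    for _ in range(max_epochs):
        updated = False
        for x_i, d_i in zip(X, d):
            if d_i * np.dot(w, x_i) <= 0:    # (x_i, d_i) is misclassified
                w = w + eta * d_i * x_i      # w_new = w_old + eta * d * x
                updated = True
        if not updated:                      # no misclassified examples left
            return w
    return w

For linearly separable data this loop eventually stops updating and returns a separating weight vector (perceptron convergence theorem).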

3
Q

Cover’s theorem

A

in a high-dimensional space, if the number of points is relatively small compared to the dimensionality and you paint the points randomly in two colors, the data set will almost always be linearly separable

  1. if the number of points in a d-dimensional space is smaller than 2*d, they are almost always linearly separable
  2. if the number of points in a d-dimensional space is bigger than 2*d, they are almost always NOT linearly separable
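A quick empirical check (my own sketch, assuming NumPy and SciPy; the LP-based separability test and the choice of 50 dimensions are illustrative, not from the card): label random points with random ±1 "colors" and test whether any hyperplane separates them.

import numpy as np
from scipy.optimize import linprog

def is_linearly_separable(X, labels):
    # Feasibility LP: does some (w, b) satisfy labels_i * (w . x_i + b) >= 1 for all i?
    n, dim = X.shape
    A_ub = -labels[:, None] * np.hstack([X, np.ones((n, 1))])
    res = linprog(c=np.zeros(dim + 1), A_ub=A_ub, b_ub=-np.ones(n),
                  bounds=(None, None), method="highs")
    return res.success

rng = np.random.default_rng(0)
dim = 50
for n_points in (75, 125):                           # below vs. above 2*d = 100
    X = rng.standard_normal((n_points, dim))
    labels = rng.choice([-1.0, 1.0], size=n_points)  # paint the points randomly
    print(n_points, "points in", dim, "dims -> separable:", is_linearly_separable(X, labels))

With 2*d = 100 here, the 75-point run will almost always come out separable and the 125-point run almost always not, matching the two cases above.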
4
Q

support vector machines

A

the decision boundary should be as far away from the data of both classes as possible (maximize the margin m)

performs very well on high-dimensional data

computationally very expensive
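A minimal sketch with scikit-learn's linear-kernel SVC, assuming that library is available; the toy high-dimensional dataset and the C value are illustrative choices, not from the card:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Toy high-dimensional data: 500 examples, 200 features.
X, y = make_classification(n_samples=500, n_features=200, n_informative=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# kernel="linear": find the separating hyperplane that maximizes the margin m;
# C trades margin width against training points that violate the margin.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
print("support vectors per class:", clf.n_support_)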

5
Q

perceptron

A

a neural network without hidden layers: the inputs are connected directly to the output unit(s)

6
Q

momentum, Nesterov accelerated gradient??

A