Support Vector Machines Flashcards

1
Q

The basic idea

A

SVMs work in a feature space, where each axis represents one feature of an object. Let's say we want to classify images as dog vs. cat.

Our features are weight, ear size, and nose type - these become our axes.

We then map all of our data points into this space and see if we can draw a hyperplane (a line in 2-D) that successfully separates dogs from cats.
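As a sketch of the idea: once a separating hyperplane has been found, classifying a new point is just checking which side of it the point falls on. The weights and example points below are made up for illustration:

```python
# A separating hyperplane is defined by weights w and bias b:
# predict "dog" if w . x + b > 0, else "cat".
# The weights and feature values below are invented for illustration.

def classify(w, b, x):
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "dog" if score > 0 else "cat"

# Features: (weight_kg, ear_size_cm) -- hypothetical values
w = [0.5, 1.0]   # learned weights (made up here)
b = -10.0        # learned bias (made up here)

print(classify(w, b, [20.0, 8.0]))  # heavy, big ears -> dog
print(classify(w, b, [4.0, 3.0]))   # light, small ears -> cat
```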

2
Q

Margin Classifier

A

This is where you draw the line / hyperplane that classifies your points.

You can use a:
maximal margin classifier - places the boundary so the gap (margin) to the closest point on each side is as large as possible; it is sensitive to outliers, which can render this margin classifier pretty useless

soft margin - allows a certain amount of misclassification inside the margin in exchange for a more robust boundary
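The trade-off a soft margin makes can be written as the hinge-loss objective, where a parameter C controls how harshly margin violations are punished. This sketch just evaluates that objective on made-up numbers:

```python
# Soft-margin SVM objective (primal form):
#   0.5 * ||w||^2 + C * sum(max(0, 1 - y_i * (w . x_i + b)))
# Larger C punishes margin violations more (harder margin);
# smaller C tolerates them (softer margin). Data below is made up.

def hinge_objective(w, b, X, y, C):
    margin_term = 0.5 * sum(wi * wi for wi in w)
    violations = sum(
        max(0.0, 1.0 - yi * (sum(wi * xi for wi, xi in zip(w, x)) + b))
        for x, yi in zip(X, y)
    )
    return margin_term + C * violations

X = [[2.0, 1.0], [-1.0, -1.5], [0.2, 0.0]]  # last point violates the margin
y = [1, -1, 1]                               # labels must be +1 / -1
w, b = [1.0, 0.0], 0.0

print(hinge_objective(w, b, X, y, C=1.0))   # small penalty for the violation
print(hinge_objective(w, b, X, y, C=10.0))  # same violation, much larger cost
```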

3
Q

Soft margin classifiers

A

AKA support vector classifier

How do we find the optimal one?

We use cross-validation to determine how many misclassifications and observations to allow inside the soft margin.

Observations on the edge of and within the soft margin are called support vectors.
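In practice, "use cross-validation" usually means trying several soft-margin strengths C and keeping the one with the best held-out score. A sketch assuming scikit-learn is available (the toy data is made up):

```python
# Picking the soft-margin strength C by cross-validation.
# Assumes scikit-learn is installed; the toy data is made up.
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X = [[0, 0], [1, 1], [0, 1], [1, 0], [3, 3], [4, 4], [3, 4], [4, 3]]
y = [0, 0, 0, 0, 1, 1, 1, 1]

search = GridSearchCV(
    SVC(kernel="linear"),       # a support vector classifier
    {"C": [0.01, 0.1, 1, 10]},  # candidate soft-margin strengths
    cv=2,                       # 2 folds, since the toy set is tiny
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```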

4
Q

Unique attributes for SVMs

A

They can handle outliers by using support vector classifiers / soft margins.

They allow for misclassifications, so they can handle overlapping classes.

5
Q

What to do if your soft margin has a ton of overlap?

A

SUPPORT VECTOR MACHINES!!

Support vector machines apply a function to map your same data into a higher-dimensional space. From there they check whether a hyperplane in that higher space can successfully separate your points.
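A classic illustration of this, with made-up 1-D data: points that no single threshold on the line can separate become linearly separable once mapped into 2-D with x → (x, x²):

```python
# 1-D data: class A sits at the extremes, class B in the middle,
# so no single cut point on the line separates them (made-up values).
a = [-3.0, -2.5, 2.5, 3.0]   # class A
b = [-0.5, 0.0, 0.5]         # class B

# Map each point into 2-D with f(x) = (x, x^2).
def lift(x):
    return (x, x * x)

# In the lifted space the horizontal line x2 = 3 separates the classes:
# every class-A point lands above it, every class-B point below it.
print(all(lift(x)[1] > 3 for x in a))   # True
print(all(lift(x)[1] < 3 for x in b))   # True
```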

6
Q

Kernel Trick

A

The kernel trick reduces the amount of computation required for support vector machines: instead of explicitly transforming the data from low to high dimensions, a kernel function computes the dot products between points as if they had already been transformed - without ever performing the transformation.
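For example, the degree-2 polynomial kernel (x·y + 1)² gives the same number (up to float rounding) as explicitly lifting 2-D points into the corresponding 6-D feature space and taking the dot product there. A sketch:

```python
import math

def kernel(x, y):
    # Polynomial kernel of degree 2: works directly on the 2-D points.
    dot = x[0] * y[0] + x[1] * y[1]
    return (dot + 1) ** 2

def lift(x):
    # Explicit degree-2 feature map for 2-D input (6-D output).
    r2 = math.sqrt(2)
    return [x[0]**2, x[1]**2, r2 * x[0] * x[1], r2 * x[0], r2 * x[1], 1.0]

x, y = [1.0, 2.0], [3.0, 0.5]

explicit = sum(a * b for a, b in zip(lift(x), lift(y)))
print(kernel(x, y), explicit)  # same value, but kernel() skipped the 6-D step
```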
