K-NN Flashcards

1
Q

How do we classify a point using K-NN?

A

Look at the K nearest training points and assign the most common class among them (a majority vote).
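
A minimal sketch of this majority-vote rule, assuming NumPy and Euclidean distance (the function and variable names are illustrative, not from the cards):

import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x, k=3):
    # Euclidean distance from the query point to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k nearest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote over their class labels
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy usage: two classes in 2-D
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_classify(X_train, y_train, np.array([0.95, 1.0]), k=3))  # -> 1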

2
Q

How do we use K-NN for regression?

A

Look at the K nearest neighbours and take the average of their target values.
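
A sketch of the same idea for regression, again assuming NumPy and illustrative names:

import numpy as np

def knn_regress(X_train, y_train, x, k=3):
    # Distances from the query point to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Average the target values of the k nearest neighbours
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

# Toy usage: predict y for x = 2.5 from 1-D training data
X_train = np.array([[1.0], [2.0], [3.0], [10.0]])
y_train = np.array([1.0, 2.0, 3.0, 10.0])
print(knn_regress(X_train, y_train, np.array([2.5]), k=3))  # -> 2.0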

3
Q

What is the L2 Norm?

A

The Euclidean distance: ||a − b||₂ = √(Σᵢ (aᵢ − bᵢ)²)
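
A quick check of this, using NumPy (whose norm function computes the L2 norm by default):

import numpy as np

a, b = np.array([1.0, 2.0, 3.0]), np.array([4.0, 6.0, 3.0])
# L2 norm of the difference = Euclidean distance
print(np.linalg.norm(a - b))             # 5.0
print(np.sqrt(np.sum((a - b) ** 2)))     # same thing, written out: 5.0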

4
Q

What can we use instead of Euclidean Distance for K-NN?

A

Similarity Measures

  • Inner Product
  • Cosine Similarity
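
A sketch of both similarity measures, assuming NumPy vectors; with similarities, the neighbours are the points with the largest score rather than the smallest distance:

import numpy as np

def inner_product(a, b):
    # Plain dot product: larger means more similar
    return np.dot(a, b)

def cosine_similarity(a, b):
    # Dot product after normalising the vectors' lengths
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

a, b = np.array([1.0, 0.0]), np.array([1.0, 1.0])
print(inner_product(a, b))      # 1.0
print(cosine_similarity(a, b))  # ~0.707
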
5
Q

What is the Minkowski distance?

A

A generalisation of the Euclidean distance: instead of squaring and taking the square root, we raise to a power t, so

d(a, b) = (Σᵢ |aᵢ − bᵢ|^t)^(1/t)

where the sum over i extends the formula to higher dimensions. Setting t = 2 recovers the Euclidean distance.
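
A minimal sketch of this formula (names are illustrative):

import numpy as np

def minkowski(a, b, t=2):
    # (sum_i |a_i - b_i|^t)^(1/t); t = 2 is Euclidean, t = 1 is Manhattan
    return np.sum(np.abs(a - b) ** t) ** (1.0 / t)

a, b = np.array([0.0, 0.0]), np.array([3.0, 4.0])
print(minkowski(a, b, t=2))  # 5.0 (Euclidean)
print(minkowski(a, b, t=1))  # 7.0 (Manhattan)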

6
Q

What is the Hyper-parameter in K-NN?

A

The number of neighbours, K.
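
One common way to pick this hyper-parameter (not covered on these cards) is cross-validation; a sketch assuming scikit-learn is available:

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Try several values of K and keep the one with the best cross-validated accuracy
scores = {k: cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
          for k in (1, 3, 5, 7, 9)}
best_k = max(scores, key=scores.get)
print(best_k, scores[best_k])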

7
Q

For a classification using N classes, what choices of K should we avoid when using K-NN?

A

We should never set K to a multiple of N (K = aN for some integer a).

Setting K to be a multiple of N may result in ties, making it impossible to come to a decision.
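
A tiny worked example of such a tie, with N = 2 classes and K = 4 (hypothetical labels):

from collections import Counter

# Labels of the 4 nearest neighbours: two votes per class, so no majority
votes = Counter(["cat", "cat", "dog", "dog"])
print(votes.most_common())  # [('cat', 2), ('dog', 2)] -- a tie, no clear decision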

8
Q

What may be the consequence of choosing a small K for K-NN?

A

We may overfit and model noise in the training data.

9
Q

What may be the consequence of choosing a large K for K-NN?

A

We may include too many samples from other classes, which can negatively affect the prediction and make it inaccurate.

10
Q

How many parameters are used in K-NN?

A

None; no parameters are learned.

K-NN is a non-parametric model: it stores the training data and uses it directly at prediction time.
