8.5 Evaluating Predictive Models – Accuracy & Beyond Flashcards

1
Q

Why can accuracy be misleading in machine learning?

A

Because it doesn’t account for imbalanced data or for the types of errors being made.

2
Q

When might a model with 99.9% accuracy be considered bad?

A

When predicting rare events like fraud: a model that predicts “not fraud” every time can still achieve very high accuracy while missing the actual fraud cases.

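A minimal sketch of this effect, using made-up numbers (1 fraud case in 1,000 transactions):

```python
# Hypothetical data: 1,000 transactions, exactly one of which is fraud (1 = fraud).
actual = [1] + [0] * 999
# A useless "model" that labels every transaction as "not fraud".
predicted = [0] * 1000

correct = sum(a == p for a, p in zip(actual, predicted))
print(f"Accuracy: {correct / len(actual):.1%}")  # Accuracy: 99.9% -- zero fraud caught
```
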
3
Q

In critical applications like medical diagnosis, what matters more than overall accuracy?

A

The type of error. For example, missing a sick patient (a false negative) can be more dangerous than treating a healthy person (a false positive).

4
Q

What is a True Positive (TP) in binary classification?

A

A positive case correctly predicted as positive.

5
Q

What is a True Negative (TN)?

A

A negative case correctly predicted as negative.

6
Q

What is a False Positive (FP)?

A

A negative case incorrectly predicted as positive.

7
Q

What is a False Negative (FN)?

A

A positive case incorrectly predicted as negative.

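A minimal sketch of counting all four outcomes from the definitions above, using illustrative labels (1 = positive, 0 = negative):

```python
# Illustrative labels only: 1 = positive, 0 = negative.
actual    = [1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 1, 0]

pairs = list(zip(actual, predicted))
tp = sum(a == 1 and p == 1 for a, p in pairs)  # positive, predicted positive
tn = sum(a == 0 and p == 0 for a, p in pairs)  # negative, predicted negative
fp = sum(a == 0 and p == 1 for a, p in pairs)  # negative, predicted positive
fn = sum(a == 1 and p == 0 for a, p in pairs)  # positive, predicted negative
print(tp, tn, fp, fn)  # 2 2 1 1
```
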
8
Q

What does a confusion matrix show?

A

It shows how predicted classes relate to actual classes, helping to identify false positives and false negatives.

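One common way to build one, assuming scikit-learn is available (same illustrative labels as in the counting sketch above):

```python
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0]

# For binary 0/1 labels, scikit-learn lays the matrix out as:
# [[TN, FP],
#  [FN, TP]]
print(confusion_matrix(y_true, y_pred))
# [[2 1]
#  [1 2]]
```
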
9
Q

Why is the confusion matrix useful?

A

It helps us inspect the types of errors the model is making, not just how many.

10
Q

What is Accuracy as a metric?

A

The percentage of all predictions that were correct.

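In confusion-matrix terms: Accuracy = (TP + TN) / (TP + TN + FP + FN). A one-line sketch with the hypothetical counts from earlier:

```python
tp, tn, fp, fn = 2, 2, 1, 1  # hypothetical counts from the sketch above
print(f"Accuracy: {(tp + tn) / (tp + tn + fp + fn):.2%}")  # Accuracy: 66.67%
```
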
11
Q

What is Precision?

A

The percentage of positive predictions that were actually positive.
E.g., “Of the customers we said would buy, how many actually did?”

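In confusion-matrix terms: Precision = TP / (TP + FP). With the same hypothetical counts:

```python
tp, fp = 2, 1  # hypothetical counts from the sketch above
print(f"Precision: {tp / (tp + fp):.2%}")  # Precision: 66.67%
```
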
12
Q

What is Recall?

A

The percentage of actual positives that were correctly predicted.
E.g., “Of all customers who actually bought, how many did we correctly identify?”

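In confusion-matrix terms: Recall = TP / (TP + FN). With the same hypothetical counts:

```python
tp, fn = 2, 1  # hypothetical counts from the sketch above
print(f"Recall: {tp / (tp + fn):.2%}")  # Recall: 66.67%
```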