Evaluating Models Flashcards

1
Q

What do we need to know before we rely on models?

A

we need to know how good they will be at their job before relying on them

2
Q

Why must one evaluate classifier performance?

A

- multiple methods are available to classify or predict
- for each method, multiple options are available
- to choose the best model, one needs to assess each model's performance relative to the others

3
Q

What are the misclassification error measures?

A

-error
-error rate
-accuracy

4
Q

What is an error under misclassification error?

A

classifying a record as belonging to one class when it belongs to another class

5
Q

What is the error rate under misclassification error?

A

the percent of misclassified records out of the total records in the validation data
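
A worked example in Python with hypothetical validation counts (the numbers are made up for illustration):

misclassified, total = 15, 200         # hypothetical validation-set counts
error_rate = misclassified / total     # 15 / 200 = 0.075, i.e. 7.5%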

6
Q

What is accuracy under misclassification error?

A

accuracy = 1 - error rate

7
Q

What is a confusion matrix?

A

it summarizes binary classification error in a table of true positives, false negatives, false positives, and true negatives
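
To make the four cells concrete, here is a minimal sketch in Python; the labels are made-up illustrative data, not from the slides:

# Count the four confusion-matrix cells from actual vs. predicted class labels.
actual    = [1, 1, 0, 0, 1, 0, 0, 1]   # hypothetical ground truth
predicted = [1, 0, 0, 1, 1, 0, 0, 0]   # hypothetical model output

tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)  # true positives
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)  # false negatives
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)  # false positives
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)  # true negatives

print(tp, fn, fp, tn)  # 2 2 1 3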

8
Q

What is accuracy?

A

the overall probability the model will predict actual outcomes correctly: accuracy = (TP + TN) / (P + N) (slide 15 of the evaluating models slides)
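
A worked example in Python with hypothetical counts (TP = 40, TN = 45, out of P = 50 actual positives and N = 50 actual negatives; the numbers are made up):

tp, tn, p, n = 40, 45, 50, 50          # hypothetical counts
accuracy = (tp + tn) / (p + n)         # (40 + 45) / 100 = 0.85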

9
Q

What is the true positive rate?

A

it is the sensitivity
- the proportion of events correctly predicted

  • of all the times the event occurred, what is the proportion of times it was predicted to occur (the model's hit rate)?
10
Q

What is the formula for the True Positive Rate?

A

TP/(TP+FN)
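
A worked example in Python with hypothetical counts (the numbers are made up):

tp, fn = 40, 10                        # hypothetical counts
tpr = tp / (tp + fn)                   # 40 / 50 = 0.8 sensitivity (hit rate)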

11
Q

What is the inverse of sensitivity?

A

miss rate
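
The miss rate is the false negative rate; the standard formula (not written on this card) is FN / (TP + FN) = 1 - sensitivity. A sketch with the same hypothetical counts as above:

tp, fn = 40, 10                        # hypothetical counts
miss_rate = fn / (tp + fn)             # 10 / 50 = 0.2 = 1 - 0.8 sensitivity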

12
Q

What is a true negative rate?

A
  • the proportion of non-events correctly predicted
  • of all the times the event did not occur, what was the proportion of times it was predicted to not occur?
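
The standard formula (mirroring the TPR card; it is not written on this card) is TN / (TN + FP). A sketch with hypothetical counts:

tn, fp = 45, 5                         # hypothetical counts
tnr = tn / (tn + fp)                   # 45 / 50 = 0.9 specificity
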
13
Q

What is the positive predictive value?

A

the proportion of positive event predictions that were correct
- more false positives erode precision; many false claims that the event will occur mean the model is less discerning when predicting the event
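
The usual formula for positive predictive value (precision) is TP / (TP + FP); it is not written on this card, so treat this as a standard-definition sketch with hypothetical counts:

tp, fp = 40, 5                         # hypothetical counts
ppv = tp / (tp + fp)                   # 40 / 45 ≈ 0.89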

14
Q

What is the relationship between sensitivity and precision?

A

if an event occurs in 10 out of 1,000 cases, then it is rare
- if the model predicts 9 of the 10, it is highly sensitive to the phenomenon
- yet if the model also predicts the event 900 more times when it does not occur, the model is not precise
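
Working through the numbers on this card in Python (9 of the 10 actual events are caught, but 900 extra predictions are false alarms):

tp, fn, fp = 9, 1, 900                 # counts from the rare-event example above
sensitivity = tp / (tp + fn)           # 9 / 10 = 0.90  -> highly sensitive
precision = tp / (tp + fp)             # 9 / 909 ≈ 0.0099 -> not precise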

15
Q

Not all errors are equal

A

the doctor example: an unnecessary meningitis exam (a false positive) versus dying that night because the doctor missed the illness (a false negative)

16
Q

What is the accounting profit formula?

A

interest revenue - bad loan expense
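
A minimal sketch with made-up dollar amounts, purely for illustration:

interest_revenue = 120_000             # hypothetical interest revenue
bad_loan_expense = 35_000              # hypothetical bad loan expense
accounting_profit = interest_revenue - bad_loan_expense  # 85,000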

17
Q

What is the false positive rate?

A

an error rate: the proportion of times the event was predicted to occur when it did not actually occur (false alarms)

18
Q

What is FPR the inverse of?

A

TNR

19
Q

TNR + FPR = 1.0, so...

A

FPR = 1 - TNR
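
A quick numeric check in Python with the same hypothetical counts (TN = 45, FP = 5):

tn, fp = 45, 5                         # hypothetical counts
fpr = fp / (fp + tn)                   # 5 / 50 = 0.1
tnr = tn / (tn + fp)                   # 45 / 50 = 0.9
# fpr == 1 - tnr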

20
Q

as the proportion of non-events correctly predicted _____, the proportion of false alarms will decrease by the same degree

A

increases