Topic 20 Flashcards
(9 cards)
Confusion Matrix
A table that counts the model's predictions for each class, broken down by the actual class labels.
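A minimal sketch of building such a table from paired label lists; the data values here are hypothetical examples, not from the cards:

```python
from collections import Counter

def confusion_matrix(actual, predicted, labels):
    # Count (actual, predicted) pairs into a nested dict:
    # rows index the actual class, columns the predicted class.
    counts = Counter(zip(actual, predicted))
    return {a: {p: counts[(a, p)] for p in labels} for a in labels}

actual    = ["yes", "yes", "no", "no", "yes", "no"]
predicted = ["yes", "no",  "no", "yes", "yes", "no"]
cm = confusion_matrix(actual, predicted, ["yes", "no"])
# cm["yes"]["yes"] counts true positives, cm["no"]["yes"] false positives
```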
Precision
number of true positives /
(number of true positives + number of false positives) (how accurate the model is when it predicts yes)
Recall / Sensitivity
number of true positives /
(number of true positives + number of false negatives) (proportion of actual yes cases that were predicted correctly)
F1 Score
(2 × precision × recall) /
(precision + recall)
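The three cards above can be sketched as small functions; the TP/FP/FN counts in the comment are made-up numbers for illustration:

```python
def precision(tp, fp):
    # Of everything predicted yes, how much was actually yes
    return tp / (tp + fp)

def recall(tp, fn):
    # Of everything actually yes, how much was predicted yes
    return tp / (tp + fn)

def f1_score(tp, fp, fn):
    # Harmonic mean of precision and recall
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r)

# Hypothetical counts: 8 TP, 2 FP, 4 FN
# precision = 8/10 = 0.8, recall = 8/12
```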
Cohen’s Kappa
κ = 1 − (1 − p0) / (1 − pe), where p0 is the proportion of times our model is correct and pe is the proportion of times a random model would be correct
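A sketch of κ from binary confusion-matrix counts, assuming pe is the standard chance-agreement estimate (product of marginal proportions for each class); the counts passed in below are hypothetical:

```python
def cohens_kappa(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    p0 = (tp + tn) / n                        # observed accuracy
    # Chance agreement pe: probability predictions and actuals
    # agree on "yes" by chance, plus agreeing on "no" by chance
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)
    p_no  = ((fn + tn) / n) * ((fp + tn) / n)
    pe = p_yes + p_no
    return 1 - (1 - p0) / (1 - pe)
```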
Specificity
number of true negatives /
(number of true negatives + number of false positives) (how accurate the model is when it predicts no)
Accuracy
(number of true positives + number of true negatives) /
(number of true positives + number of true negatives + number of false positives + number of false negatives) (the overall proportion of correct predictions across both classes)
ROC Curve
A plot of the true positive rate against the false positive rate for every possible decision threshold
Area Under the Curve (AUC) for ROC Curve
The proportion of the 1×1 unit square that falls below the ROC curve (higher is better)
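One way to sketch the AUC is the rank interpretation: it equals the probability that a randomly chosen positive example is scored above a randomly chosen negative one (ties count as half). The scores and labels below are hypothetical:

```python
def roc_auc(scores, labels):
    # Split scores by true class (labels are 0/1)
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    # Fraction of positive/negative pairs where the positive
    # outranks the negative; ties contribute 0.5
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical model scores and true labels
auc = roc_auc([0.9, 0.8, 0.4, 0.3], [1, 0, 1, 0])
```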