Confusion Matrix
A table comparing predicted labels against actual labels, used for assessing classification model performance. Its cells count true positives (TPs), true negatives (TNs), false positives (FPs) and false negatives (FNs).
Sensitivity
The proportion of actual positives that were correctly identified. Also known as true positive rate, or recall:
- TPs / (TPs + FNs)
Specificity
The proportion of actual negatives that were correctly identified. Also known as true negative rate:
- TNs / (TNs + FPs)
When is sensitivity more important?
When False Positives are acceptable but False Negatives are not. E.g. Detecting fraudulent transactions, medical diagnosis
When is specificity more important?
When False Negatives are acceptable but False Positives are not. E.g. a model that ensures images are appropriate for children.
Accuracy
The proportion of all predictions that were correct. I.e. how often is the model right overall?
- (TPs + TNs) / Total
Precision
The proportion of positive predictions that were correct. I.e. when the model says positive, how often is it right?
- TPs / (TPs + FPs)
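The four metrics above can be sketched from the cell counts of a confusion matrix. The counts below are made up for illustration:

```python
# Hypothetical confusion-matrix counts (illustrative only)
tp, tn, fp, fn = 80, 90, 10, 20

sensitivity = tp / (tp + fn)                # true positive rate / recall
specificity = tn / (tn + fp)                # true negative rate
accuracy = (tp + tn) / (tp + tn + fp + fn)  # overall correctness
precision = tp / (tp + fp)                  # correctness of positive predictions

print(sensitivity, specificity, accuracy, precision)
```

With these counts: sensitivity 0.8, specificity 0.9, accuracy 0.85, precision about 0.89.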
ROC / AUC
The ROC curve plots the true positive rate against the false positive rate as the classification threshold varies; AUC is the area under that curve (1.0 = perfect ranking, 0.5 = random guessing).
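A ROC curve can be built by sweeping the decision threshold from high to low and recording the true/false positive rates, with AUC as the trapezoidal area underneath. A minimal sketch with made-up labels and scores (it ignores tied scores, which a production implementation would group):

```python
# Made-up labels and model scores, already sorted by score descending
labels = [1, 1, 0, 1, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.1]

pairs = sorted(zip(scores, labels), reverse=True)
pos = sum(labels)
neg = len(labels) - pos

# Lowering the threshold past each score admits one more prediction as positive
tpr, fpr = [0.0], [0.0]
tp = fp = 0
for score, label in pairs:
    if label == 1:
        tp += 1
    else:
        fp += 1
    tpr.append(tp / pos)
    fpr.append(fp / neg)

# Trapezoidal area under the (fpr, tpr) curve
auc = sum((fpr[i] - fpr[i - 1]) * (tpr[i] + tpr[i - 1]) / 2
          for i in range(1, len(fpr)))
print(auc)  # 0.75 for this data
```

AUC also equals the probability that a randomly chosen positive is scored above a randomly chosen negative, which is why 0.5 corresponds to random guessing.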
Gini Impurity
A measure of how mixed the classes are at a decision-tree node; 0 means the node is pure.
Gini Impurity (calculation)
1 - (probability of class 1)^2 - (probability of class 2)^2 (with more classes, subtract each class's squared probability)
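The calculation above as a small function (class counts here are made up):

```python
def gini_impurity(class_counts):
    """1 minus the sum of squared class probabilities; 0 means a pure node."""
    total = sum(class_counts)
    return 1 - sum((c / total) ** 2 for c in class_counts)

# A node with 5 of class 1 and 5 of class 2: maximally mixed for two classes
print(gini_impurity([5, 5]))   # 0.5
# A pure node
print(gini_impurity([10, 0]))  # 0.0
```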
F1 Score
The harmonic mean of precision and recall; useful when classes are imbalanced:
- 2 × (Precision × Recall) / (Precision + Recall)
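F1 is the harmonic mean of precision and recall, which punishes a low value in either one. A sketch with made-up inputs:

```python
def f1_score(precision, recall):
    # Harmonic mean: drops sharply if either input is low
    return 2 * precision * recall / (precision + recall)

print(f1_score(0.8, 0.8))  # about 0.8 (equal inputs give that value back)
print(f1_score(1.0, 0.5))  # about 0.67, well below the arithmetic mean of 0.75
```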
Linear regression metrics: SSE, R-squared, Adjusted R-squared
- SSE: sum of squared errors between predictions and actual values
- R-squared: proportion of the variance in the response explained by the model
- Adjusted R-squared: R-squared penalised for the number of predictors, so adding useless predictors doesn't inflate it
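These three regression metrics can be sketched from actual values and model predictions. The data below is made up; the predictions stand in for some fitted model's output:

```python
# Made-up actual values and predictions from a hypothetical fitted model
y = [3.0, 4.5, 6.1, 7.9, 10.2]
y_pred = [3.2, 4.4, 6.0, 8.1, 9.8]
n, p = len(y), 1  # n observations, p predictors

# SSE: sum of squared errors
sse = sum((yi - fi) ** 2 for yi, fi in zip(y, y_pred))

# R-squared: 1 minus SSE over total sum of squares about the mean
mean_y = sum(y) / n
sst = sum((yi - mean_y) ** 2 for yi in y)
r2 = 1 - sse / sst

# Adjusted R-squared: penalise for the number of predictors
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)

print(sse, r2, adj_r2)
```

Note that adjusted R-squared is always at most R-squared, and the gap widens as predictors are added.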
Linear regression metrics: Confidence intervals
A range that, at a chosen confidence level (e.g. 95%), is expected to contain the true value of a coefficient; computed as the estimate ± a t critical value times its standard error.