What is accuracy and what is the formula?
Accuracy measures how frequently the classifier is correct, with respect to a fixed test set with known labels.
Accuracy = (number of correctly classified test instances, i.e. true positives + true negatives) / (total number of test instances)
What is the formula for error rate?
The error rate is 1 - Accuracy; it shows how frequently the classifier is wrong on the test set.
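A minimal sketch of both definitions in Python, using hypothetical confusion-matrix counts (TP=25, TN=20, FP=15, FN=10, chosen only for illustration):

```python
# Hypothetical counts for illustration only.
tp, tn, fp, fn = 25, 20, 15, 10

total = tp + tn + fp + fn            # all test instances
accuracy = (tp + tn) / total         # correctly classified / total
error_rate = 1 - accuracy            # misclassified / total

print(round(accuracy, 4))            # 0.6429
print(round(error_rate, 4))          # 0.3571
```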
What is precision and what is its formula?
Precision is how often we are correct when we predict that an instance is positive.
Precision = TP / (TP + FP)
What is recall and what is the formula for it?
Recall is the proportion of the truly positive instances that we have correctly identified as positive.
Recall = TP / (TP + FN)
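Sketched in Python with hypothetical counts (TP=25, FN=10, chosen only for illustration):

```python
# Hypothetical counts for illustration only.
tp, fn = 25, 10

# Of everything that really was positive, how much did we find?
recall = tp / (tp + fn)

print(round(recall, 4))   # 0.7143
```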
What is the F-Score and what is the formula used to calculate it?
The F-Score is the harmonic mean of precision and recall, combining the two statistics into a single measure.
F-Score = F1 = 2PR / (P + R)
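Combining the two earlier hypothetical values (precision = 25/40, recall = 25/35; illustrative only), the harmonic-mean formula can be sketched as:

```python
# Hypothetical precision and recall values for illustration only.
p = 25 / 40   # precision
r = 25 / 35   # recall

# Harmonic mean of precision and recall.
f1 = 2 * p * r / (p + r)

print(round(f1, 4))   # 0.6667
```

Note that the harmonic mean is pulled toward the smaller of the two values, so a classifier cannot score well on F1 by excelling at only one of precision or recall.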
What is the Kappa Statistic? What is the formula to calculate it?
Used to measure the level of agreement between two raters or classifiers.
K = (po - pe) / (1 - pe)
where po is the relative observed agreement among the raters,
and pe is the hypothetical probability of chance agreement (computed from the raters' marginal frequencies).
Calculate the Kappa coefficient for the following agreement table:

                Rater 2
                YES   NO
Rater 1   YES    25   10
          NO     15   20
k = (po – pe) / (1 – pe)
po = (25 + 20) / 70 = 0.6429
pe = (35/70)(40/70) + (35/70)(30/70) = 0.2857 + 0.2143 = 0.5
k = (0.6429 – 0.5) / (1 – 0.5)
k = 0.2857
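The worked example above can be sketched in Python, reading the 2x2 agreement table directly (row = rater 1, column = rater 2):

```python
# Agreement table from the question: rows = rater 1, columns = rater 2.
table = [[25, 10],   # rater 1 said YES
         [15, 20]]   # rater 1 said NO

total = sum(sum(row) for row in table)              # 70 instances

# Observed agreement: both raters said YES or both said NO.
po = (table[0][0] + table[1][1]) / total            # 45/70

# Chance agreement from the marginal YES probabilities of each rater.
p1_yes = (table[0][0] + table[0][1]) / total        # 35/70
p2_yes = (table[0][0] + table[1][0]) / total        # 40/70
pe = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)  # 0.5

kappa = (po - pe) / (1 - pe)
print(round(kappa, 4))   # 0.2857
```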