
Confusion Matrix

 

A confusion matrix, typically represented as a table, is a popular evaluation tool used to describe the performance of a classification model (or “classifier”). The table compares predicted values against actual values. Its basic components are as follows (a short code sketch after the list shows how these counts are tallied):

    • True positives (TP): The prediction was yes, and the true value is yes

    • True negatives (TN): The prediction was no, and the true value is no

    • False positives (FP): The prediction was yes, but the true value is no

    • False negatives (FN): The prediction was no, but the true value is yes
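
The sketch below is a minimal illustration in plain Python (not taken from this article): it tallies the four cells for a hypothetical set of binary labels, where 1 stands for "yes" and 0 for "no". The label values are invented purely for illustration.

```python
# Minimal sketch (plain Python, hypothetical binary labels where 1 = "yes"
# and 0 = "no"): tally the four confusion-matrix cells by comparing each
# prediction against the corresponding true value.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # actual labels (invented for illustration)
y_pred = [1, 1, 0, 1, 0, 1, 1, 0]  # model predictions (invented for illustration)

tp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 1)  # predicted yes, truly yes
tn = sum(1 for t, p in zip(y_true, y_pred) if p == 0 and t == 0)  # predicted no, truly no
fp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 0)  # predicted yes, truly no
fn = sum(1 for t, p in zip(y_true, y_pred) if p == 0 and t == 1)  # predicted no, truly yes

print(f"TP={tp} TN={tn} FP={fp} FN={fn}")  # TP=3 TN=2 FP=2 FN=1
```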

 

Related Metrics

The confusion matrix is closely related to other metrics such as Accuracy, Precision, Recall/Sensitivity, Specificity, and F1 Score. Their definitions are as follows:

| Metric | Formula | Definition |
| --- | --- | --- |
| Accuracy | (TP+TN)/(TP+TN+FP+FN) | Percentage of total items classified correctly |
| Precision | TP/(TP+FP) | How accurate the positive predictions are |
| Recall/Sensitivity | TP/(TP+FN) | True positive rate: the share of actual positives correctly identified |
| Specificity | TN/(TN+FP) | True negative rate: the share of actual negatives correctly identified |
| F1 score | 2TP/(2TP+FP+FN) | The harmonic mean of precision and recall |
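
Continuing the earlier sketch, the snippet below computes each of these metrics directly from the four cell counts; the counts shown are the hypothetical ones tallied above, not results from any real model.

```python
# Minimal sketch: the related metrics computed from the four confusion-matrix
# cells. The counts below are the hypothetical ones from the earlier example.
tp, tn, fp, fn = 3, 2, 2, 1

accuracy    = (tp + tn) / (tp + tn + fp + fn)  # (3+2)/8 = 0.625
precision   = tp / (tp + fp)                   # 3/5 = 0.6
recall      = tp / (tp + fn)                   # 3/4 = 0.75  (sensitivity, true positive rate)
specificity = tn / (tn + fp)                   # 2/4 = 0.5   (true negative rate)
f1          = 2 * tp / (2 * tp + fp + fn)      # 6/9 ≈ 0.667 (harmonic mean of precision and recall)

print(f"accuracy={accuracy:.3f} precision={precision:.3f} "
      f"recall={recall:.3f} specificity={specificity:.3f} f1={f1:.3f}")
```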