Artificial Intelligence & Machine Learning

Confusion Matrix

Definition

A confusion matrix is a table used to describe the performance of a classification model on a set of test data for which the true values are known. It shows the number of true positives, true negatives, false positives, and false negatives.
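
As a minimal sketch of how the four counts are tallied, the example below uses hypothetical binary labels (1 = positive class, 0 = negative class); it is an illustration of the idea, not the only way to lay the table out.

```python
# A minimal sketch: tallying the four confusion-matrix cells from
# hypothetical binary labels (1 = positive class, 0 = negative class).
from collections import Counter

y_true = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]   # hypothetical ground-truth labels
y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0]   # hypothetical model predictions

counts = Counter(zip(y_true, y_pred))
tp = counts[(1, 1)]  # true positives:  actually positive, predicted positive
tn = counts[(0, 0)]  # true negatives:  actually negative, predicted negative
fp = counts[(0, 1)]  # false positives: actually negative, predicted positive
fn = counts[(1, 0)]  # false negatives: actually positive, predicted negative

print(f"TP={tp} FP={fp}")   # TP=3 FP=1
print(f"FN={fn} TN={tn}")   # FN=2 TN=4
```

Laid out as a table, the rows are usually taken to be the actual classes and the columns the predicted classes, although some references use the opposite convention.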

Why It Matters

A confusion matrix provides a much more detailed view of a classification model's performance than accuracy alone. It allows you to see exactly where the model is getting confused and what types of errors it is making.
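
As a hedged illustration of that point, the toy data below is hypothetical: on an imbalanced test set, a model that always predicts the majority class looks accurate, while the confusion-matrix counts reveal it never finds a positive case.

```python
# Hypothetical illustration: on an imbalanced test set (95 negatives,
# 5 positives), a model that always predicts "negative" reaches 95%
# accuracy, yet the confusion matrix shows it misses every positive.
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100                      # always predicts the negative class

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

print(f"accuracy={accuracy:.2f}  TP={tp}  FN={fn}")
# accuracy=0.95  TP=0  FN=5  -> every positive case was missed
```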

Contextual Example

A confusion matrix for a spam filter (treating spam as the positive class) would show how many legitimate emails were correctly delivered, how many were incorrectly marked as spam (false positives), how many spam emails were correctly caught, and how many were missed (false negatives).
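
A small sketch of that scenario, assuming scikit-learn is available; the email labels below are hypothetical.

```python
# Sketch of the spam-filter example, assuming scikit-learn is installed;
# the email labels below are hypothetical, with spam as the positive class.
from sklearn.metrics import confusion_matrix

y_true = ["spam", "legit", "legit", "spam", "legit", "spam", "legit"]
y_pred = ["spam", "legit", "spam",  "spam", "legit", "legit", "legit"]

# Rows follow the true labels, columns the predicted labels,
# both in the order given by `labels`.
cm = confusion_matrix(y_true, y_pred, labels=["legit", "spam"])
print(cm)
# [[3 1]   <- legit emails: 3 delivered correctly, 1 wrongly flagged (false positive)
#  [1 2]]  <- spam emails:  1 missed (false negative), 2 caught correctly
```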

Common Misunderstandings

  • The confusion matrix is sometimes mistaken for a metric in its own right; in fact, it is the table of raw counts from which metrics such as precision, recall, and F1-score are calculated (see the sketch after this list).
  • It is sometimes assumed that a single accuracy figure makes it redundant; in practice, it is an essential tool for evaluating any classification model.
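
The sketch below shows how those metrics fall out of the four counts; the numbers reuse the hypothetical TP/FP/FN/TN values from the definition example above.

```python
# Deriving common metrics from the four counts; the values reuse the
# hypothetical TP/FP/FN/TN from the definition sketch above.
tp, fp, fn, tn = 3, 1, 2, 4

accuracy  = (tp + tn) / (tp + tn + fp + fn)   # fraction of all predictions that are correct
precision = tp / (tp + fp)                    # of everything predicted positive, how much truly is
recall    = tp / (tp + fn)                    # of everything actually positive, how much was found
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall

print(f"accuracy={accuracy:.2f} precision={precision:.2f} "
      f"recall={recall:.2f} f1={f1:.2f}")
# accuracy=0.70 precision=0.75 recall=0.60 f1=0.67
```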

Related Terms

Accuracy, Precision, Recall, F1-Score

Last Updated: December 17, 2025