Artificial Intelligence & Machine Learning

Accuracy

Definition

In classification tasks, accuracy is the fraction of predictions a model gets right: the number of correct predictions divided by the total number of predictions, typically reported as a percentage.
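
As a minimal sketch, accuracy can be computed directly from true and predicted labels. The function name and the labels below are illustrative, not taken from any particular library:

  def accuracy(y_true, y_pred):
      # Count positions where the prediction matches the true label.
      correct = sum(t == p for t, p in zip(y_true, y_pred))
      return correct / len(y_true)

  # Illustrative labels: 95 of 100 predictions match.
  y_true = [1] * 100
  y_pred = [1] * 95 + [0] * 5
  print(accuracy(y_true, y_pred))  # 0.95, i.e. 95%

In practice, scikit-learn's sklearn.metrics.accuracy_score computes the same quantity.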

Why It Matters

Accuracy is the most intuitive and commonly used metric for evaluating a classification model. It provides a single, overall score of how well the model performs.

Contextual Example

If a model correctly classifies 95 out of 100 images, its accuracy is 95/100 = 95%.

Common Misunderstandings

  • Accuracy can be misleading on imbalanced datasets. If a disease occurs in only 1% of the population, a model that always predicts "no disease" achieves 99% accuracy while being completely useless.
  • For this reason, metrics such as precision, recall, and F1-score are often more informative than accuracy alone; the sketch after this list illustrates the problem.
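
As a rough illustration of the pitfall above, the following sketch (assuming scikit-learn is installed; the labels are synthetic, constructed to match the 1% prevalence) compares accuracy against precision, recall, and F1-score for a model that always predicts "no disease":

  from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

  # Synthetic imbalanced dataset: 10 positives (disease) out of 1,000 cases (1% prevalence).
  y_true = [1] * 10 + [0] * 990
  # A degenerate model that always predicts "no disease" (class 0).
  y_pred = [0] * 1000

  print(accuracy_score(y_true, y_pred))                    # 0.99
  print(precision_score(y_true, y_pred, zero_division=0))  # 0.0 (no positive predictions)
  print(recall_score(y_true, y_pred))                      # 0.0 (no true positives found)
  print(f1_score(y_true, y_pred, zero_division=0))         # 0.0

Despite 99% accuracy, precision, recall, and F1-score are all zero, because the model never identifies a single positive case.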

Related Terms

Precision, Recall, F1-Score

Last Updated: December 17, 2025