Artificial Intelligence & Machine Learning

Regularization

Definition

Regularization is a set of techniques used to prevent overfitting in machine learning models. Most commonly, it works by adding a penalty term to the loss function that discourages the model from becoming overly complex.
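In a minimal sketch (the names here are illustrative, not from any particular library), an L2 penalty added to a mean-squared-error loss looks like this:

    import numpy as np

    def regularized_loss(w, X, y, lam=0.1):
        # Data term: how well the linear model X @ w fits the targets y.
        data_loss = np.mean((X @ w - y) ** 2)
        # Penalty term: lam (an assumed hyperparameter controlling the
        # regularization strength) times the squared size of the weights.
        penalty = lam * np.sum(w ** 2)
        return data_loss + penalty

The penalty grows with the magnitude of the weights, so minimizing the combined loss pulls the model toward smaller weights unless larger ones clearly improve the fit.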

Why It Matters

Regularization is crucial for improving a model's generalization: it helps the model perform well on new, unseen data instead of merely memorizing the training set.

Contextual Example

L1 and L2 regularization are the most common penalty-based techniques. L1 adds a penalty proportional to the sum of the absolute values of the weights, while L2 penalizes the sum of their squares. Both encourage the model to use smaller weights, which yields a simpler model that is less prone to overfitting.
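As a concrete illustration, scikit-learn exposes these penalties as Ridge (L2) and Lasso (L1); the synthetic data below exists only for demonstration:

    import numpy as np
    from sklearn.linear_model import Lasso, Ridge

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10))
    y = 3.0 * X[:, 0] + rng.normal(size=100)  # only the first feature matters

    ridge = Ridge(alpha=1.0).fit(X, y)  # L2 penalty: shrinks all weights
    lasso = Lasso(alpha=0.1).fit(X, y)  # L1 penalty: can zero out weights

    print(ridge.coef_)  # every coefficient small but nonzero
    print(lasso.coef_)  # irrelevant coefficients driven to exactly 0.0

The printed coefficients show the practical difference: the L1 penalty tends to drive irrelevant weights to exactly zero (a form of feature selection), while the L2 penalty shrinks all weights smoothly.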

Common Misunderstandings

  • "Regularization always means adding a penalty to the loss function." Not quite: dropout is another popular regularization technique used in neural networks, and it regularizes by randomly disabling units during training rather than by penalizing weights (see the sketch after this list).
  • "Regularization improves any struggling model." Regularization is a way to combat high variance (overfitting); applied to a high-bias model that is already underfitting, it generally makes performance worse.
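A minimal sketch of dropout in PyTorch (the layer sizes and dropout probability are arbitrary illustrative choices):

    import torch.nn as nn

    # During training, Dropout zeroes each activation with probability p,
    # preventing units from co-adapting; calling model.eval() disables it.
    model = nn.Sequential(
        nn.Linear(64, 128),
        nn.ReLU(),
        nn.Dropout(p=0.5),
        nn.Linear(128, 1),
    )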

Related Terms

  • Overfitting
  • Loss Function
  • L1 Regularization
  • L2 Regularization
  • Dropout
  • Variance

Last Updated: December 19, 2025