Artificial Intelligence & Machine Learning

Model

Definition

In machine learning, a model is the artifact produced by the training process. It is a mathematical function that takes an input and produces a prediction or decision as output, representing the patterns "learned" from the training data.
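The idea that a model is "a function with learned parameters" can be sketched in a few lines. The parameter values below are hypothetical stand-ins for what training would produce:

```python
def make_model(slope, intercept):
    """Return a model: a function mapping an input to a prediction.

    In a real system, slope and intercept would come from training;
    here they are hypothetical placeholder values.
    """
    def model(x):
        return slope * x + intercept
    return model

# Suppose training found slope=2.0 and intercept=1.0.
model = make_model(2.0, 1.0)
prediction = model(3.0)  # a prediction for the input 3.0
```

The key point: once the parameters are fixed, the model is just a callable artifact that can be saved, shipped, and invoked on new inputs.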

Why It Matters

The model is the end product of machine learning. It is the thing you deploy into an application to make it "smart." The quality of the model determines the quality of the AI's predictions.

Contextual Example

After training on thousands of house photos paired with their sale prices, the resulting model can take a new photo of a house and output an estimated price. That trained model can then be integrated into a real estate website.
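In deployment, the website only needs to call the trained model; it never re-runs training. The sketch below is deliberately simplified, using hypothetical tabular features and made-up weights rather than a real photo-based neural network:

```python
def estimate_price(features):
    """Hypothetical trained model for a real estate site.

    A real photo-based model would be a neural network; here we
    pretend these weights were learned from thousands of past sales.
    """
    weights = {"square_feet": 150.0, "bedrooms": 10000.0}  # assumed values
    base_price = 50000.0
    return base_price + sum(weights[k] * v for k, v in features.items())

# The website calls the model on a new listing's features.
listing = {"square_feet": 2000, "bedrooms": 3}
print(estimate_price(listing))
```

The design point is the separation of concerns: training happens offline and produces the artifact; the application just feeds it inputs and displays outputs.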

Common Misunderstandings

  • The model is not the algorithm. The algorithm (e.g., linear regression) is the process used to learn from the data; the model is the specific function with the learned parameters that results from this process.
  • A deployed model is not accurate forever. As the real world changes, its predictions can degrade ("model drift"), so models need to be regularly retrained or updated as new data becomes available.
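The algorithm-versus-model distinction above can be made concrete. Here the algorithm is ordinary least squares for a line of best fit; the model is the pair of parameters it outputs. The toy dataset is invented for illustration:

```python
def train(xs, ys):
    """The algorithm: ordinary least squares for a 1-D linear fit."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept  # the model: the learned parameters

def predict(model, x):
    """Apply the model (the learned parameters) to a new input."""
    slope, intercept = model
    return slope * x + intercept

# Toy data that follows y = 2x + 1 exactly.
model = train([1, 2, 3, 4], [3, 5, 7, 9])
print(predict(model, 5))
```

Running `train` again on fresh data would produce a new model from the same algorithm, which is exactly what retraining against drift means.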

Related Terms

Last Updated: December 17, 2025