Artificial Intelligence & Machine Learning

Regression

Definition

Regression is a supervised machine learning task in which the goal is to predict a continuous output value. The model learns a function that maps input features to that continuous target.

Why It Matters

Regression is used for forecasting and prediction problems, such as predicting prices, temperatures, or sales figures.

Contextual Example

Predicting the price of a house based on its features (square footage, number of bedrooms, location) is a regression problem. The output (price) is a continuous value.
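
The house-price example above can be sketched with ordinary least squares. The feature values, prices, and the query house below are made-up illustrative numbers, not real market data:

```python
import numpy as np

# Hypothetical training data: [square footage, bedrooms] -> sale price
X = np.array([[1400, 3], [1600, 3], [1700, 4], [1875, 4], [1100, 2]], dtype=float)
y = np.array([245000, 312000, 279000, 308000, 199000], dtype=float)

# Prepend a column of ones so the model also learns an intercept (bias) term
X_b = np.hstack([np.ones((X.shape[0], 1)), X])

# Ordinary least squares: weights that minimise the squared prediction error
w, *_ = np.linalg.lstsq(X_b, y, rcond=None)

# Predict the price of an unseen house: 1500 sq ft, 3 bedrooms
pred = float(np.array([1.0, 1500, 3]) @ w)
print(round(pred, 2))
```

Because the output is a dollar amount on a continuous scale rather than a class label, this is a regression problem by the definition above.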

Common Misunderstandings

  • Regression is often confused with classification, the other main type of supervised learning: classification predicts discrete labels, while regression predicts continuous values.
  • Regression is not limited to linear models; linear regression is simply the most basic regression algorithm.
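
To make the second point concrete, here is a minimal sketch of linear regression fit by gradient descent on a tiny made-up dataset (the values are assumed for illustration only):

```python
# Toy data that roughly follows y = 2x
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.0, 6.2, 7.9]

# Fit y = w*x + b by gradient descent on the mean squared error
w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    # Partial derivatives of the mean squared error w.r.t. w and b
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))
```

The learned slope `w` approaches 2, matching the trend in the data; more expressive regression models (polynomial, tree-based, neural) generalise this same fit-by-minimising-error idea.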

Related Terms

Classification, Supervised Learning, Linear Regression

Last Updated: December 17, 2025