Artificial Intelligence & Machine Learning

Linear Regression

Definition

Linear regression is a basic and commonly used type of predictive analysis. It is a statistical approach for modeling the relationship between a dependent variable and a given set of independent variables by fitting a linear equation to the observed data.
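
In standard notation, the model for p independent variables takes the form

    y = β₀ + β₁x₁ + β₂x₂ + … + βₚxₚ + ε

where β₀ is the intercept, β₁ through βₚ are coefficients estimated from the observed data (typically by ordinary least squares), and ε is an error term. With a single independent variable this reduces to the familiar straight line y = β₀ + β₁x + ε.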

Why It Matters

Linear regression is often the first algorithm that machine learning practitioners learn. It is a simple, interpretable model that serves as a good baseline for regression problems.

Contextual Example

You could use linear regression to model the relationship between years of experience and salary. The model would find the best-fit straight line that predicts salary based on experience.
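
A minimal sketch of this example in Python, assuming scikit-learn is available; the experience and salary figures are invented for illustration:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical data: years of experience vs. salary in thousands.
    experience = np.array([[1], [2], [3], [5], [7], [10]])
    salary = np.array([45, 50, 62, 80, 98, 130])

    # Fit the best-fit straight line by ordinary least squares.
    model = LinearRegression().fit(experience, salary)

    print(model.intercept_, model.coef_[0])  # intercept and slope of the line
    print(model.predict([[4]]))              # predicted salary at 4 years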

Common Misunderstandings

  • It cannot capture non-linear relationships on its own: the model assumes a linear relationship between the input and output variables (see the sketch after this list).
  • It is not unsupervised: linear regression is a supervised learning algorithm, trained on examples with known output values.
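
One way to see the linearity assumption in action is to fit a straight line to clearly non-linear data. This hypothetical sketch (again assuming scikit-learn) shows such a fit explaining almost none of the variance:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    # Synthetic data with a purely quadratic (non-linear) relationship.
    x = np.linspace(-3, 3, 50).reshape(-1, 1)
    y = x.ravel() ** 2

    model = LinearRegression().fit(x, y)
    print(r2_score(y, model.predict(x)))  # close to 0: a line cannot capture y = x^2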

Related Terms

Last Updated: December 17, 2025