Artificial Intelligence & Machine Learning

Backpropagation

Definition

Backpropagation is the primary algorithm for training artificial neural networks. It works by calculating the "error" or "loss" of the model's prediction compared to the correct output, and then propagating that error signal backward through the network, layer by layer, to calculate how much each weight and bias contributed to it.
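
To make this concrete, here is a minimal sketch in Python (the network shape, input values, and weights are illustrative assumptions, not from this entry) showing a forward pass followed by the backward pass that assigns credit to each weight:

    # Minimal backpropagation sketch for a tiny network:
    # prediction y_hat = w2 * tanh(w1 * x), squared-error loss.
    import math

    x, y = 1.5, 0.8          # one training example (illustrative values)
    w1, w2 = 0.4, -0.3       # initial weights (illustrative values)

    # --- Forward pass: compute the prediction and the loss ---
    h = math.tanh(w1 * x)    # hidden activation
    y_hat = w2 * h           # network output
    loss = 0.5 * (y_hat - y) ** 2

    # --- Backward pass: propagate the error backward (chain rule) ---
    dloss_dyhat = y_hat - y              # dL/dy_hat
    dloss_dw2 = dloss_dyhat * h          # dL/dw2
    dloss_dh = dloss_dyhat * w2          # error flowing back into h
    dh_dpre = 1.0 - h ** 2               # derivative of tanh at w1*x
    dloss_dw1 = dloss_dh * dh_dpre * x   # dL/dw1

    print(f"loss={loss:.4f}  dL/dw1={dloss_dw1:.4f}  dL/dw2={dloss_dw2:.4f}")

Note how the gradient for w1 is built by multiplying local derivatives along the path from the loss back to w1; that backward accumulation is the whole algorithm.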

Why It Matters

Backpropagation is what makes training deep neural networks computationally feasible. It computes the gradients (derivatives) of the loss function with respect to every weight in the network in a single backward pass whose cost is comparable to the forward pass itself, rather than re-evaluating the loss once per weight as naive numerical differentiation would require. An optimization algorithm such as gradient descent then uses these gradients to update the weights.
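
The division of labor looks like this in code. Below is a minimal sketch (same illustrative network as above; the learning rate of 0.1 is our choice) where backpropagation supplies the gradients and plain gradient descent applies the updates:

    # Gradient descent using the gradients that backpropagation supplies.
    import math

    def forward_backward(w1, w2, x, y):
        h = math.tanh(w1 * x)
        y_hat = w2 * h
        loss = 0.5 * (y_hat - y) ** 2
        d = y_hat - y                                 # dL/dy_hat
        return loss, d * h, d * w2 * (1 - h**2) * x   # loss, dL/dw2, dL/dw1

    w1, w2, lr = 0.4, -0.3, 0.1
    for step in range(5):
        loss, g2, g1 = forward_backward(w1, w2, x=1.5, y=0.8)
        w1 -= lr * g1            # step against the gradient
        w2 -= lr * g2
        print(f"step {step}: loss={loss:.4f}")

Running this prints a steadily decreasing loss: backpropagation answers "which way and how much for each weight," and gradient descent does the actual moving.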

Contextual Example

After a neural network makes a prediction, backpropagation computes a gradient for each weight; the sign of that gradient tells the optimizer whether the weight should be adjusted slightly up or slightly down to make the next prediction more accurate.
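
In code terms, the sign of each gradient is what encodes "up or down." A small sketch (the weight and gradient values here are illustrative, roughly matching the example above):

    # The sign of dL/dw determines the direction of the nudge:
    # positive gradient -> decrease w, negative gradient -> increase w.
    lr = 0.1
    for name, w, grad in [("w1", 0.4, 0.31), ("w2", -0.3, -0.52)]:
        direction = "down" if grad > 0 else "up"
        print(f"{name}: gradient {grad:+.2f} -> nudge {direction} to {w - lr * grad:.3f}")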

Common Misunderstandings

  • Backpropagation is sometimes treated as a learning algorithm in its own right; in fact it is an application of the "chain rule" from calculus, used to compute gradients efficiently (the sketch after this list checks a chain-rule gradient numerically).
  • It does not update any weights by itself; it is used in conjunction with an optimization algorithm like gradient descent, which performs the actual updates.
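
As a sanity check on the chain-rule point, this sketch compares the backpropagated (chain-rule) gradient with a numerical finite-difference estimate; the two should agree to several decimal places (all values illustrative, as above):

    # Verify the chain-rule gradient against a numerical estimate.
    import math

    def loss_fn(w1, w2, x=1.5, y=0.8):
        return 0.5 * (w2 * math.tanh(w1 * x) - y) ** 2

    # Chain-rule (backpropagation) gradient for w1:
    w1, w2, x, y = 0.4, -0.3, 1.5, 0.8
    h = math.tanh(w1 * x)
    grad_bp = (w2 * h - y) * w2 * (1 - h**2) * x

    # Finite-difference estimate: perturb w1 slightly and watch the loss.
    eps = 1e-6
    grad_fd = (loss_fn(w1 + eps, w2) - loss_fn(w1 - eps, w2)) / (2 * eps)

    print(f"backprop: {grad_bp:.6f}  numerical: {grad_fd:.6f}")  # they agree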

Related Terms

Gradient Descent, Loss Function, Neural Network, Chain Rule

Last Updated: December 17, 2025