Artificial Intelligence & Machine Learning
Epoch
Definition
In machine learning, an epoch is one complete pass through the entire training dataset. Models are typically trained for multiple epochs.
Why It Matters
Training a model for just one epoch is rarely enough. Because gradient-based learning updates the weights incrementally, the model usually needs to see the data multiple times to learn the underlying patterns effectively. The number of epochs is therefore a key hyperparameter to tune.
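A minimal sketch of this idea, using plain Python gradient descent on a tiny hypothetical dataset (fitting y = 2x with a single weight), shows how the estimate improves only across repeated passes:

```python
# Minimal sketch: multi-epoch gradient descent on a toy dataset (y = 2x).
# The dataset, learning rate, and epoch count are illustrative assumptions.
data = [(x, 2.0 * x) for x in range(1, 6)]  # five (input, target) pairs
w, lr = 0.0, 0.01                           # single weight, small step size

for epoch in range(50):       # each epoch = one full pass over the dataset
    for x, y in data:         # one iteration per example (batch size 1)
        grad = 2 * (w * x - y) * x  # gradient of squared error w.r.t. w
        w -= lr * grad

print(round(w, 2))  # w converges toward the true slope 2.0
```

After a single epoch the weight is still far from the true value; only repeated passes drive it close to 2.0.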
Contextual Example
If a dataset has 1000 images and the batch size is 100, one epoch consists of 10 iterations (10 batches of 100 images, with one weight update per batch). If you train the model for 50 epochs, it will have seen the entire dataset 50 times.
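The arithmetic above can be written as a small helper; the function name is illustrative, and the ceiling handles datasets that do not divide evenly into batches (the last batch is smaller):

```python
import math

def iterations_per_epoch(dataset_size, batch_size):
    # One iteration per batch; ceil covers a smaller final batch.
    return math.ceil(dataset_size / batch_size)

print(iterations_per_epoch(1000, 100))  # 10 iterations per epoch
print(iterations_per_epoch(1050, 100))  # 11 (last batch has 50 images)
```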
Common Misunderstandings
- "More epochs is always better" — no: training for too many epochs can lead to overfitting, where the model memorizes the training data instead of generalizing.
- Too few epochs has the opposite risk: the model underfits because it has not seen the data enough times to learn the patterns.
- An epoch is not the same as an iteration: an iteration processes one batch (one weight update), while an epoch is one full pass over the entire dataset.
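A common way to choose the number of epochs in practice is early stopping: monitor validation loss each epoch and halt once it stops improving. A sketch, using a hypothetical list of per-epoch validation losses and an assumed patience of 3 epochs:

```python
def early_stopping_epoch(val_losses, patience=3):
    # Halt after `patience` consecutive epochs without a new best loss.
    best, best_epoch = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch  # epoch at which training would stop
    return len(val_losses) - 1

# Hypothetical validation losses: improving at first, then overfitting.
losses = [0.9, 0.6, 0.45, 0.40, 0.42, 0.43, 0.44, 0.45]
print(early_stopping_epoch(losses))  # stops at epoch 6, 3 epochs past the best
```

This is why the epoch count is usually tuned against a validation set rather than fixed in advance.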