When training a deep learning model, you want it to learn patterns from the training data so it can make accurate predictions on new, unseen data. However, sometimes models learn too little or too much. This leads to underfitting or overfitting. Let’s break them down in simple terms, backed by examples, visuals, and some light math.
1. What Is the Goal of Training a Model?
Imagine you're trying to teach a model to predict house prices based on features like size, location, and number of rooms.
Your goal is to find a function f that maps your input features x (like size and rooms) to a prediction ŷ (the house price) that lands close to the actual price y:

ŷ = f(x; θ)

where θ denotes the model's learnable parameters. Training searches for the θ that minimizes a loss, commonly the mean squared error (MSE) over the n training examples:

MSE = (1/n) ∑ᵢ (yᵢ - ŷᵢ)²
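To make the loss concrete, here is the MSE computed directly with NumPy; the prices below are made-up example values:

```python
import numpy as np

# Actual prices y and model predictions y_hat (made-up example values)
y = np.array([250_000, 310_000, 180_000, 420_000])
y_hat = np.array([260_000, 300_000, 200_000, 400_000])

# MSE = (1/n) * sum((y_i - y_hat_i)^2)
mse = np.mean((y - y_hat) ** 2)
print(f"MSE: {mse:,.0f}")  # average squared error, in (price units)^2
```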
2. Underfitting
Underfitting happens when your model is too simple to capture the patterns in the data. It doesn’t learn enough from the training data and performs poorly on both training and testing sets.
Example:
Let’s say you’re using a straight line (a linear model) to fit data that actually follows a curve. The left chart shows the result, which is underfitting: a straight line trying to trace a curve (see the sketch after this list). Such a model:
- Misses the curve in the data
- Has high bias (too simplistic assumptions)
- Makes large prediction errors
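A minimal sketch of underfitting, assuming data generated from a quadratic curve (an arbitrary choice for this demo): a straight line leaves the training error high no matter how long you train.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 30)
y = x**2 + rng.normal(0, 0.5, size=x.shape)  # curved data with a little noise

# A straight line (degree-1 polynomial) is too simple for a quadratic trend
line = np.polynomial.Polynomial.fit(x, y, deg=1)
train_mse = np.mean((y - line(x)) ** 2)
print(f"Train MSE with a straight line: {train_mse:.2f}")  # stays high: underfitting
```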
3. Overfitting
Overfitting happens when your model is too complex and learns not only the underlying pattern but also the noise in the training data. It performs very well on training data but poorly on unseen data.
The right chart shows overfitting: a very wiggly line threading through every single point (see the sketch after this list). Such a model:
- Tries too hard to memorize data points
- Fails to generalize to new data
- Has high variance
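And a minimal sketch of overfitting under the same assumed quadratic data: a very high-degree polynomial drives the training error to near zero, while the error on held-out points in between is typically much larger.

```python
import numpy as np

rng = np.random.default_rng(0)

def truth(x):
    return x**2  # the underlying trend (assumed for this demo)

x_train = np.linspace(-3, 3, 15)
x_test = np.linspace(-2.9, 2.9, 15)  # points in between the training ones
y_train = truth(x_train) + rng.normal(0, 0.5, size=x_train.shape)
y_test = truth(x_test) + rng.normal(0, 0.5, size=x_test.shape)

# A degree-14 polynomial can pass through all 15 training points, noise included
wiggly = np.polynomial.Polynomial.fit(x_train, y_train, deg=14)
print(f"Train MSE: {np.mean((y_train - wiggly(x_train)) ** 2):.4f}")  # near zero
print(f"Test  MSE: {np.mean((y_test - wiggly(x_test)) ** 2):.4f}")    # typically much larger
```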
4. The Sweet Spot: Good Fit
A well-trained model should lie in the middle ground:
The middle chart shows this good fit: the model follows the general trend without memorizing the noise (the sketch after this list compares all three regimes side by side). Such a model:
- Captures the underlying trend
- Ignores noise
- Generalizes well to new data
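To see all three regimes in one place, here is a small NumPy sketch that fits polynomials of increasing degree to the same noisy quadratic data; the degrees and noise level are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def truth(x):
    return x**2  # the underlying trend (assumed for this demo)

x_train = rng.uniform(-3, 3, 40)
x_test = rng.uniform(-3, 3, 40)
y_train = truth(x_train) + rng.normal(0, 0.5, size=x_train.shape)
y_test = truth(x_test) + rng.normal(0, 0.5, size=x_test.shape)

for deg in (1, 2, 15):  # underfit, good fit, overfit
    model = np.polynomial.Polynomial.fit(x_train, y_train, deg=deg)
    tr = np.mean((y_train - model(x_train)) ** 2)
    te = np.mean((y_test - model(x_test)) ** 2)
    print(f"degree {deg:2d}: train MSE {tr:.3f}, test MSE {te:.3f}")

# Typical pattern: degree 1 has high train AND test error (underfit),
# degree 2 keeps both low (good fit), degree 15 has the lowest train
# error but a noticeably higher test error (overfit).
```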
5. Intuition with Bias-Variance Tradeoff
Definitions:
- Bias: Error from incorrect assumptions (a model that is too simple)
- Variance: Error from the model's sensitivity to small changes in the training data
Total Error = Bias² + Variance + Irreducible Error
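This decomposition can be estimated empirically. Below is a simulation sketch: it repeatedly resamples training sets from an assumed true function (here sin x, an arbitrary choice), fits a simple and a flexible polynomial, and measures bias² and variance of the predictions at one query point. All constants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials, n_points, noise = 500, 20, 0.3
x0 = 1.0  # fixed query point where we measure the error

def truth(x):
    return np.sin(x)  # the "true" function (an arbitrary choice for this demo)

def predictions_at_x0(degree):
    """Fit a polynomial of the given degree to many freshly sampled
    training sets and collect its predictions at x0."""
    preds = np.empty(n_trials)
    for t in range(n_trials):
        x = rng.uniform(-3, 3, n_points)
        y = truth(x) + rng.normal(0, noise, n_points)
        model = np.polynomial.Polynomial.fit(x, y, deg=degree)
        preds[t] = model(x0)
    return preds

for degree in (1, 12):
    p = predictions_at_x0(degree)
    bias_sq = (p.mean() - truth(x0)) ** 2  # (average prediction - truth)^2
    variance = p.var()                     # spread of predictions across datasets
    print(f"degree {degree:2d}: bias^2 = {bias_sq:.4f}, variance = {variance:.4f}")

# Typical outcome: degree 1 -> high bias, low variance (the underfitting side);
# degree 12 -> low bias, high variance (the overfitting side).
```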
| Situation | Bias | Variance | Error Type |
|---|---|---|---|
| Underfitting | High | Low | High training + test error |
| Overfitting | Low | High | Low training error, high test error |
| Just Right | Low | Low | Low generalization error |
6. Real-World Deep Learning Example
If you're training an image classifier using a convolutional neural network (CNN):
- Underfitting: Your CNN has only 1 or 2 layers → too shallow to build up the hierarchy of features (edges, then textures, then object parts) needed to tell classes apart.
- Overfitting: Your CNN has 100+ layers but relatively few training images → it memorizes incidental details such as noise or shadows in the training set.
| Metric | Underfitting | Overfitting |
|---|---|---|
| Train Accuracy | Low | High |
| Test Accuracy | Low | Low |
| Gap (Train - Test) | Small | Large |
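Turning that table into code, here is a tiny helper that flags the two failure modes from the train/test accuracy pattern; the 0.80 accuracy and 0.10 gap thresholds are arbitrary placeholders, not standard values:

```python
def diagnose(train_acc: float, test_acc: float,
             low_acc: float = 0.80, big_gap: float = 0.10) -> str:
    """Rough diagnosis from the train/test accuracy pattern in the table above.
    The thresholds are illustrative and should be tuned per problem."""
    gap = train_acc - test_acc
    if train_acc < low_acc and gap < big_gap:
        return "likely underfitting: both accuracies low, small gap"
    if gap >= big_gap:
        return "likely overfitting: large train-test gap"
    return "looks reasonable: high accuracy, small gap"

print(diagnose(0.62, 0.60))  # likely underfitting
print(diagnose(0.99, 0.71))  # likely overfitting
print(diagnose(0.91, 0.89))  # looks reasonable
```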
7. It’s Central to Generalization – the Whole Point of ML
In ML, we care about generalization — how well the model performs on data it hasn’t seen.
- Underfitting and overfitting are both failures to generalize, just in opposite directions.
- Your job is to find the right balance where the model learns enough but not too much.
Understanding this tradeoff helps:
- Pick the right model complexity
- Choose good architectures
- Know when to regularize, simplify, or train longer
8. How to Detect and Fix Them
1) Fix Underfitting:
- Use more complex models
- Add more features
- Train longer
- Reduce regularization
2) Fix Overfitting (a runnable sketch combining several of these follows this list):
- Add more data
- Use data augmentation
- Apply regularization (L1, L2, Dropout)
- Use early stopping
- Simplify model (fewer layers, fewer neurons)
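Here is a minimal PyTorch sketch that combines three of the fixes above: Dropout, L2 regularization via the optimizer's weight_decay, and early stopping on a validation set. The random tensors are stand-in data just to make the loop runnable, and the sizes and hyperparameters are illustrative:

```python
import torch
from torch import nn

# Stand-in data: 256 train / 64 validation samples, 20 features, 2 classes
torch.manual_seed(0)
X_tr, y_tr = torch.randn(256, 20), torch.randint(0, 2, (256,))
X_va, y_va = torch.randn(64, 20), torch.randint(0, 2, (64,))

# A small model with Dropout; weight_decay below adds L2 regularization
model = nn.Sequential(
    nn.Linear(20, 32), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(32, 2)
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)  # L2
loss_fn = nn.CrossEntropyLoss()

best_val, patience, bad_epochs = float("inf"), 5, 0
for epoch in range(100):
    model.train()  # enables Dropout
    opt.zero_grad()
    loss_fn(model(X_tr), y_tr).backward()
    opt.step()

    model.eval()   # disables Dropout for evaluation
    with torch.no_grad():
        val_loss = loss_fn(model(X_va), y_va).item()

    # Early stopping: quit when validation loss hasn't improved for `patience` epochs
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            print(f"early stop at epoch {epoch}, best val loss {best_val:.3f}")
            break
```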