Overfitting and Underfitting in Deep Learning
Introduction
Before starting with overfitting and underfitting in deep learning, you should go through the previous blog in this series: https://ainewgeneration.com/activation-function-in-deep-learning/. In this post we will focus on what overfitting and underfitting are, and how we can overcome overfitting in a network. In brief, overfitting arises when your model gives excellent accuracy on the training data but poor accuracy on the testing data. Underfitting is quite different: the model is unable to capture the patterns in the training dataset, so it does not really learn from the training data, and both training and testing accuracy end up low. Now let's get started in detail.
Table of Contents
- Overfitting in Network
- Accuracy & Loss Curve vs Epochs while overfitting
- Underfitting in Network
- Overfitting vs Underfitting Model
- How Can we Reduce Overfitting ?
Overfitting in Network
Overfitting is a situation where your network has learned only the training data and cannot cope with the test data. In other words, overfitting arises when the model gives good accuracy on the training data but fails to give comparable accuracy on the testing data, because it performs well only on data it has already seen.
In the figure above, the green line indicates the over-fitted model and the black line indicates the regularized model. While the green line follows the training data exactly, it is highly dependent on that data and is likely to have a higher error rate on new, unseen data compared with the black line.
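The green-line-versus-black-line picture can be reproduced with a small NumPy sketch (illustrative toy data, not from the blog): a high-degree polynomial passes through nearly every noisy training point but generalizes poorly, while a low-degree fit does not.

```python
import numpy as np

# Hypothetical toy data: a noisy sine wave split into train and test sets.
rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 10)
x_test = np.linspace(0, 1, 50)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, 50)

# A degree-9 polynomial (the "green line") can interpolate all 10 training
# points, while a degree-3 polynomial (the "black line") generalizes better.
results = {}
for degree in (3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    results[degree] = (train_mse, test_mse)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
```

The degree-9 model drives its training error to nearly zero, yet its test error is far larger than its training error, which is exactly the overfitting gap described above.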
Accuracy & Loss Curves vs Epochs while overfitting
Accuracy vs Epochs Curves

From the accuracy vs epochs graph above, it is clear that the training accuracy keeps increasing after every epoch, while the testing/validation accuracy increases only up to a certain epoch, then saturates and starts decreasing. As discussed above, high training accuracy combined with low testing/validation accuracy is exactly the signature of overfitting.
Loss vs Epochs Curves

From the loss vs epochs graph above, it is clear that the training loss keeps decreasing after every epoch, while the testing/validation loss decreases only up to a certain epoch, then saturates and starts increasing. Low training loss combined with high testing/validation loss again indicates overfitting in the network.
Underfitting in Network

Underfitting occurs when the model gives poor accuracy on both the training and testing datasets. The reason is that the model is unable to find the patterns in the dataset. This usually happens when we have too little data to build an accurate model, or when we try to fit a model that is too simple for the data. In such cases the rules the model learns are not flexible enough to capture the structure of the data, so it makes many incorrect predictions. Underfitting can be avoided by using more data, using a more flexible model, or adding relevant features.
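A minimal sketch of underfitting (again on illustrative toy data): a straight line is too simple for a sine wave, so its error stays high on the training set and the test set alike, unlike a more flexible cubic fit.

```python
import numpy as np

# Hypothetical toy data: noisy sine wave, separate train and test draws.
rng = np.random.default_rng(1)
x_train = np.linspace(0, 1, 30)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.1, 30)
x_test = np.linspace(0, 1, 30)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.1, 30)

line = np.polyfit(x_train, y_train, 1)   # too-simple model: underfits
cubic = np.polyfit(x_train, y_train, 3)  # flexible enough for one sine period

def mse(coeffs, x, y):
    return np.mean((np.polyval(coeffs, x) - y) ** 2)

print("line : train", mse(line, x_train, y_train), "test", mse(line, x_test, y_test))
print("cubic: train", mse(cubic, x_train, y_train), "test", mse(cubic, x_test, y_test))
```

Note how the line's error is high on both splits: that is the "both training and testing accuracy are low" situation described above, as opposed to overfitting where only the test error is high.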
Overfitting vs Underfitting Model

Overfitting (High Variance) – The network has learned only the training data and cannot cope with the test data: it gives good accuracy on the training data but fails to give comparable accuracy on the testing data.
Underfitting (High Bias) – The model gives poor accuracy on both the training and testing datasets because it is unable to find the patterns in the data, usually due to too little data or a model that is too simple for the data.
Best Fit: The best fit lies between the overfitting and underfitting models. When you train a model, you should be aware of the epoch after which it starts overfitting: as we saw, validation accuracy starts decreasing after a certain number of epochs or iterations. Select that epoch as the best epoch, since training beyond it makes the model more prone to overfitting. Stopping training at the point where validation accuracy starts decreasing (or validation loss starts increasing) is called Early Stopping.
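The early-stopping rule can be sketched in a few lines of plain Python (the loss values below are made up for illustration): track the best validation loss seen so far, and stop once it has not improved for a chosen number of epochs, called the patience.

```python
def early_stopping_epoch(val_losses, patience=2):
    """Return the index of the best epoch, stopping the scan once the
    validation loss has not improved for `patience` consecutive epochs."""
    best_loss = float("inf")
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_loss = loss
            best_epoch = epoch
        elif epoch - best_epoch >= patience:
            break  # stop training: no improvement for `patience` epochs
    return best_epoch

# Hypothetical validation-loss curve: falls, saturates, then rises (overfitting).
val_losses = [0.90, 0.70, 0.55, 0.50, 0.52, 0.58, 0.65]
print("best epoch:", early_stopping_epoch(val_losses))  # prints: best epoch: 3
```

In a real training loop you would also save the model weights at the best epoch and restore them after stopping; deep-learning frameworks ship this as a built-in callback.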
How Can we Reduce Overfitting ?
Here are some techniques by which we can reduce overfitting in a network.
Without adding any regularization technique :
- Train the model on more data
- Use of Data Augmentation Technique
- Use Early Stopping (stop when the validation error starts increasing, or stops decreasing)
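Of the techniques above, data augmentation is easy to sketch: generate extra training samples by applying label-preserving transforms such as a horizontal flip (a minimal NumPy sketch with tiny made-up "images"):

```python
import numpy as np

# Hypothetical sketch of data augmentation: flip images horizontally to
# double the training set without collecting any new labeled data.
images = np.arange(2 * 4 * 4).reshape(2, 4, 4)  # 2 tiny 4x4 "images"
labels = np.array([0, 1])

flipped = images[:, :, ::-1]                    # horizontal flip of each image
aug_images = np.concatenate([images, flipped])
aug_labels = np.concatenate([labels, labels])   # labels are unchanged by a flip

print(aug_images.shape)  # twice as many samples as before
```

Real pipelines add random crops, rotations, and color jitter as well, but the principle is the same: more varied training data makes it harder for the network to memorize individual examples.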

With adding regularization technique :
- L1 & L2 regularization
- Dropout
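Both techniques can be sketched briefly (an illustrative NumPy sketch, not a full training loop): L2 regularization adds a penalty proportional to the squared weights to the loss, and inverted dropout randomly zeroes activations during training while rescaling the survivors so the expected output is unchanged.

```python
import numpy as np

def l2_penalty(weights, lam=1e-3):
    """L2 regularization term added to the loss: lam * sum of squared weights."""
    return lam * np.sum(weights ** 2)

def dropout(activations, p=0.5, seed=0):
    """Inverted dropout: zero each activation with probability p, and scale
    the kept ones by 1/(1-p) so the expected activation is unchanged."""
    rng = np.random.default_rng(seed)
    mask = rng.random(activations.shape) >= p  # keep with probability 1 - p
    return activations * mask / (1.0 - p)

a = np.ones(1000)
out = dropout(a, p=0.5)
print("fraction kept:", np.mean(out > 0))      # roughly 0.5
print("mean activation:", out.mean())          # roughly 1.0
```

At test time dropout is simply turned off; because of the 1/(1-p) scaling during training, no rescaling is needed at inference.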
We will go through each regularization technique in more detail in the coming blogs.
End Notes :
I hope this blog on Overfitting and Underfitting helped you understand each step in detail. In the coming blogs we will go deeper into each regularization technique in deep learning.
Tag:deep learning, overfitting