Transfer Learning in Deep Learning AI …PM Modi Vs President Trump Classification

Akshay Bhor, a Data Scientist
3 min read · Mar 1, 2020
PM Modi and President Trump

Introduction:

For a deep learning model to work well, we need sufficient data to train it.

But it is not always possible to have that much data. From my personal experience as a deep learning engineer, even in industry we often get far less data than we would like.

Also, even when we do have a large dataset, deep convolutional neural network models may take days or even weeks to train on it.

So, we sometimes face the following questions:

  1. I have very little data, my model is not performing well, and I am getting very low accuracy.
  2. Even though I have enough data, my model takes hours to train and I am not sure when it will be done.
  3. I have enough data to train on and I get good test accuracy, but the model largely fails to predict on unseen future data.

Why Transfer Learning?

There are many neural networks that have been trained on very large datasets consisting of millions of training images. These are called pre-trained neural networks, with optimized weights at each layer and around 1000 class labels.

Such a model has already learned an enormous number of features from its training data, so there is little point in training our own model on a small dataset, because it will never learn as much as the pretrained model.

And that is where the concept of transfer learning comes into the picture…

We take a pretrained model, make certain changes to its top layers, and thereby get higher test accuracy on future unseen data.

Models For Transfer Learning

There are many pretrained models we can use for transfer learning, but some of the most commonly used are:

  1. VGG16 or VGG19
  2. Inception
  3. ResNet50

How Does Transfer Learning Work?

Let's see the code for transfer learning…

We have a dataset for 2-class classification with the labels PM_Modi and President_Trump.

However, our dataset is limited to only around 180 images per class.

If we trained a custom deep learning model on such a small amount of data, we would almost certainly fail to get even 50% test accuracy on future unseen data.

So here we used the VGG16 pretrained model. We import that model with include_top=False, as we have to build our own 2-class classification layers on top of it.

Adding Layers on Pretrained VGG16 model
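A minimal sketch of what this step might look like in Keras (the exact size of the added dense layer is my assumption, not necessarily what the notebook uses):

    from tensorflow.keras.applications import VGG16
    from tensorflow.keras.layers import Flatten, Dense
    from tensorflow.keras.models import Model

    # Load VGG16 pretrained on ImageNet, without its top classification layers
    base_model = VGG16(weights='imagenet', include_top=False, input_shape=(224, 224, 3))

    # Add our own top layers for the 2-class (PM_Modi vs President_Trump) problem
    x = Flatten()(base_model.output)
    x = Dense(256, activation='relu')(x)        # layer size is an assumption
    output = Dense(2, activation='softmax')(x)  # 2 classes
    model = Model(inputs=base_model.input, outputs=output)

    model.summary()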

So, our model summary is:

Then, we set our model's layers (except the last few) to trainable=False, as we want to keep the pretrained weights fixed so they do not take part in further training.
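In Keras, freezing the pretrained layers is just a loop over the base model's layers (here I freeze the whole VGG16 base, which is a common choice; the post keeps the last few layers trainable, so adjust the slice as needed):

    # Freeze the pretrained VGG16 layers so their weights are not updated during training
    for layer in base_model.layers:
        layer.trainable = False

    # Optionally keep the last few layers trainable for fine-tuning (assumption)
    # for layer in base_model.layers[-4:]:
    #     layer.trainable = True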

In the next part, I used an ImageDataGenerator, which rotates, zooms, flips, etc. the training data, so that the amount of data fed to the model is increased.
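A sketch of that augmentation step with Keras' ImageDataGenerator (the directory paths and augmentation ranges here are placeholders, not the exact values from the notebook):

    from tensorflow.keras.preprocessing.image import ImageDataGenerator

    # Augment the small training set with random rotations, zooms and flips
    train_datagen = ImageDataGenerator(
        rescale=1./255,
        rotation_range=20,
        zoom_range=0.2,
        horizontal_flip=True)
    test_datagen = ImageDataGenerator(rescale=1./255)

    train_generator = train_datagen.flow_from_directory(
        'data/train',               # placeholder path
        target_size=(224, 224),
        batch_size=32,
        class_mode='categorical')
    test_generator = test_datagen.flow_from_directory(
        'data/test',                # placeholder path
        target_size=(224, 224),
        batch_size=32,
        class_mode='categorical')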

Call the callbacks, compile the model, and run it
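Roughly, the compile-and-train step could look like this (the particular callbacks, optimizer, and number of epochs are my assumptions):

    from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint

    # Stop early if validation loss stops improving, and keep the best weights on disk
    callbacks = [
        EarlyStopping(monitor='val_loss', patience=5),
        ModelCheckpoint('best_model.h5', monitor='val_loss', save_best_only=True)]

    model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

    model.fit(train_generator,
              validation_data=test_generator,
              epochs=20,            # assumption; train until accuracy plateaus
              callbacks=callbacks)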

When I ran it for multiple epochs, I got a test accuracy of up to 95%, and that is because of transfer learning, even though I had very little training data.

And that is the success of transfer learning…

Final Result Prediction on unseen Data:

Unseen Data_1
Unseen data_2
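Prediction on a single unseen image can be done along these lines (the file name here is a placeholder, and the label order depends on the generator's class indices):

    import numpy as np
    from tensorflow.keras.preprocessing import image

    # Load and preprocess one unseen image the same way as the training data
    img = image.load_img('unseen_modi.jpg', target_size=(224, 224))   # placeholder file
    x = image.img_to_array(img) / 255.0
    x = np.expand_dims(x, axis=0)

    pred = model.predict(x)
    labels = ['PM_Modi', 'President_Trump']   # check train_generator.class_indices
    print(labels[np.argmax(pred)])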

Yehhhh… we got the result!

This is the same mechanism Google used earlier, where we just upload an image and Google tells us exactly what it is…

You can check my GitHub account for all the details, including the notebook and training data. The link is:

https://github.com/akshayashokbhor/Transfer-Learning-Modi-Vs-Trump

Blog By:

Akshay Bhor: Deep Learning Engineer and Data Scientist
