Week 2 - Convolutional neural network (CNN) with MNIST dataset

Week 2 of my deep learning plan was to train a convolutional network on the MNIST dataset, with the goal of learning the basics of convolutional networks. Instead of repeating my homework code here, I will just link to it on Github and talk about my experience with it.

You can find the whole Python notebook on Github here.

This homework was the first step that felt unlike the hello-world example in week 1. The first task was to load the MNIST data as before, then analyze and curate it before processing. It was not complicated, but it was a good way to realize how important this step is for deciding how to design the network.
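The loading and curating step can be sketched roughly like this. It is a minimal sketch, not the notebook's actual code: the random arrays stand in for what `keras.datasets.mnist.load_data()` would return, so the snippet runs without downloading anything.

```python
import numpy as np

# Hypothetical stand-in for the real MNIST arrays returned by
# keras.datasets.mnist.load_data(): random uint8 "images" and digit labels.
rng = np.random.default_rng(0)
x_train = rng.integers(0, 256, size=(100, 28, 28), dtype=np.uint8)
y_train = rng.integers(0, 10, size=(100,))

# Scale pixel intensities from [0, 255] down to [0, 1] and add a channel
# axis, giving the shape a conv layer expects: (samples, height, width, channels).
x_train = x_train.astype("float32") / 255.0
x_train = x_train.reshape(-1, 28, 28, 1)

# One-hot encode the digit labels (10 classes) for a softmax output layer.
y_onehot = np.eye(10, dtype="float32")[y_train]

print(x_train.shape)   # (100, 28, 28, 1)
print(y_onehot.shape)  # (100, 10)
```

Inspecting shapes and value ranges like this is exactly the kind of curation that informs the network design later.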

Next came a mix of confusion and fun: looking for a reasonable number of hidden layers and number of filters per layer. After finding the ‘popular’ numbers for such a problem (what if my problem is a personal/custom one? back to that later in the future inshallah), it was time to try different depths and different values for hyperparameters like dropout and the number of filters. This turned out to be a boring and time-consuming step: just to try some basic variations, I had to keep my computer running for about 4 hours, and I only tried 7 variations.
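Trying the variations systematically amounts to enumerating a small grid of settings and training one model per setting. A minimal sketch of that idea, with made-up grid values (the post does not say which depths, filter counts, or dropout rates were actually tried):

```python
from itertools import product

# Hypothetical search grid; the actual notebook tried about 7 variations by hand.
depths = [1, 2]            # number of conv blocks
filters = [32, 64]         # filters in the first conv layer
dropouts = [0.25, 0.5]     # dropout rate after pooling

configs = [
    {"depth": d, "filters": f, "dropout": p}
    for d, f, p in product(depths, filters, dropouts)
]

for cfg in configs:
    # In the real notebook, each config would build a Keras model,
    # train it on MNIST, and record its validation accuracy here.
    print(cfg)

print(len(configs))  # 8 candidate configurations
```

Even this tiny grid already shows why the step eats hours: every extra value per hyperparameter multiplies the number of full training runs.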

The results were fine: I reached an accuracy of 99.32%, compared to 98.48% in my previous feedforward assignment.

Week 3 & 4

Weeks 3 & 4 of the plan were about playing with hyperparameters and optimization, so they were partly covered by this assignment. I got a taste of how much they can affect the results, and of how I should spend considerable time changing these parameters, observing their effects, and deciding which ones can yield an accuracy improvement.
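The "observe and decide" part boils down to recording a score per setting and comparing. A toy sketch with invented validation accuracies (only the final 99.32% figure comes from the post; the other numbers are illustrative):

```python
# Hypothetical validation accuracies recorded for a few dropout settings.
# Only 0.9932 matches a number reported in the post; the rest are made up.
results = {0.0: 0.9901, 0.25: 0.9925, 0.5: 0.9932}

# Pick the setting with the highest accuracy and measure its gain
# over the no-dropout baseline.
best_dropout = max(results, key=results.get)
improvement = results[best_dropout] - results[0.0]

print(best_dropout)           # 0.5
print(round(improvement, 4))  # 0.0031
```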

Resources:
