Photo by Sigmund on Unsplash

So, it took me a lot of struggling to understand what exactly a “Monad” is! It is explained everywhere either in terms of formulas or in articles so long that they are almost impossible to comprehend. Here, I’ll be explaining the complex logic in simple words. Also, in case you are interested in watching a video on this topic, do check out the video below.

Also, for ease of understanding, I’ll be coding in Python so that Haskell’s syntax doesn't scare you away.
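To give a taste of the idea in Python right away, here is a minimal sketch of a Maybe-style monad: wrapping a value and chaining computations that might fail. The names `unit`, `bind`, and `safe_div` are my own illustrative choices, not standard library functions.

```python
# A minimal sketch of the Maybe monad in Python (illustrative only).

def unit(value):
    # Wrap a plain value in the monadic context ("return" in Haskell).
    return value

def bind(value, func):
    # Chain a computation, short-circuiting on None (like >>= in Haskell).
    return None if value is None else func(value)

def safe_div(x):
    # A computation that can fail: returns None instead of raising.
    return None if x == 0 else 10 / x

result = bind(bind(unit(5), safe_div), safe_div)   # 10/5 = 2, then 10/2 = 5
print(result)   # 5.0
failed = bind(bind(unit(0), safe_div), safe_div)   # first step fails -> None
print(failed)   # None
```

The point is that `bind` hides the failure-checking plumbing, so each step only has to describe the happy path.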

Before starting, let me first introduce you to the main feature of Haskell, because…


Photo by Alexander Krivitskiy on Unsplash

Fourth CNN?

This is my fourth post in the series of Do-it-yourself CNN models. As usual, the format will remain the same: I’ll give you a Colab file (which works perfectly fine), which you have to run without any preconditions before moving forward. The reason is that once you run it yourself, you become much more attached to the problem and tend to get deeper insights.

You can check my previous posts in this series here:

In this post, we will be solving and understanding one of the most important…


Photo by Ron Dauphin on Unsplash

Third CNN?

This is my third post in the series of Do-it-yourself CNN models. As usual, the format will remain the same: I’ll give you a Colab file (which works perfectly fine), which you have to run without any conditions before moving forward. The reason is that once you run it yourself, you become much more attached to the problem and tend to get deeper insights.

You can check my previous posts in this series here:

So, without further ado, let me hand over to you the Colab file 🚀. This is not my…


Photo by Tran Mau Tri Tam on Unsplash

Second Classifier?!

Does that sound weird? This is my second blog in the series of DIY CNN models. You can check the first classifier here. In case you haven’t read my previous article, I would highly recommend doing so, since I’ll be building on top of it. The format of this blog will remain exactly the same: I’ll give you a Colab file, which you have to run (no ifs, no buts) before proceeding further.

So here you go — Just Run it! 🏃‍♀️

Yes, I know it will take some time to run. Till then, feel free to…


Photo by Jon Tyson on Unsplash

The digit classifier is apparently the new Hello World!

Yes, this is the most basic Convolutional Neural Network example, so understanding it is of prime importance. Whenever we start something new, running our first hello-world program might seem easy, but it has a lot of friction attached to it. Running it successfully can sometimes take days, primarily because of the setup involved, getting caught in bugs that look like alien text to us, and so on.

So, my main motivation behind writing this article is to help you in removing this friction — so that you can run your first-ever…


This is my new series in deep learning, where I’ll be writing about real-life applications of neural networks. We will understand each of them in detail and try to build the intuition behind them. We will deep-dive into both theoretical and practical concepts.

In case you haven’t gone through my previous series on the mathematical intuition behind neural networks, I would highly recommend doing so. We might be building concepts on top of it.

Problem Statement

Who likes to watch movies here?! And who here has checked IMDb ratings before watching a movie? So, adding reviews on movies…


This is my fifth article in the series of learning basic maths behind neural networks. You can check out my previous articles here:

  1. Neural Network Maths
  2. Convolution Neural Networks Maths
  3. Recurrent Neural Network Maths
  4. Long Short Term Memory Maths — Part 1

In the last article we covered the basics of LSTM and discussed the forward propagation for the same.
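As a quick recap of that forward pass, here is a toy, scalar version of a single LSTM step. The shared weight values are made up purely for illustration; a real cell has separate weight matrices and biases for each gate.

```python
import math

# Toy scalar LSTM step (illustrative only; real cells use separate
# weight matrices per gate, not one shared scalar weight).

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w=0.5, u=0.5, b=0.0):
    z = w * x + u * h_prev + b       # same pre-activation reused for brevity
    f = sigmoid(z)                   # forget gate: how much old state to keep
    i = sigmoid(z)                   # input gate: how much new info to add
    c_tilde = math.tanh(z)           # candidate cell state
    o = sigmoid(z)                   # output gate: how much state to expose
    c = f * c_prev + i * c_tilde     # new cell state
    h = o * math.tanh(c)             # new hidden state
    return h, c

h, c = lstm_step(x=1.0, h_prev=0.0, c_prev=0.0)
```

Running the step over a sequence just means feeding each `h` and `c` back in as `h_prev` and `c_prev` for the next input.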

Revision

This is the diagram that we drew for understanding LSTM. At this point, in case you haven’t gone through my previous article, I would highly recommend checking it out before proceeding further.


This is my fourth article in the series of learning basic maths behind neural networks. You can check out my previous articles here:

  1. Neural Network Maths
  2. Convolution Neural Networks Maths
  3. Recurrent Neural Network Maths

Coming to LSTM, this is the most daunting of all. Don’t believe me? Just have a look at it…


So, there is tons of content on neural networks, but it rarely focuses on the maths. Even when such content exists, it’s so complex that we just leave it. And I believe that without a deeper understanding of how the maths actually works, it’s really difficult to build proper intuition, and neural networks will always remain a magical black box to us.

Nature of the article

This is my third blog in the series. In case you haven’t checked my previous articles, I would highly recommend doing so, since I’ll build concepts on top of those. As usual, I will keep it to…


It took me so much time to understand how a CNN works. And trust me, content on this is incredibly scarce, like really scarce. Everywhere, they will tell you how forward propagation works in a CNN, but they never even start on backward propagation. And without the full picture, one’s understanding always remains half-baked.

Pre-requisites

  1. Basics of CNN — you should be familiar with the convolution layer, max pooling, and the fully connected layer. A basic Google search will cover these concepts; it should take an hour or so to get started.
  2. Differential calculus — you should know how the chain rule works and the basic rules of…
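To make the convolution prerequisite concrete, here is a bare-bones 2D “valid” convolution in plain Python (technically cross-correlation, which is what deep-learning libraries actually compute): no padding, stride 1. The example image and kernel are made-up numbers for illustration.

```python
# Bare-bones 2D "valid" convolution (cross-correlation), stride 1, no padding.

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1        # output height
    ow = len(image[0]) - kw + 1     # output width
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            # Element-wise multiply the patch with the kernel and sum.
            out[i][j] = sum(
                image[i + a][j + b] * kernel[a][b]
                for a in range(kh) for b in range(kw)
            )
    return out

image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
kernel = [[1, 0],
          [0, 1]]   # picks top-left + bottom-right of each 2x2 patch
print(conv2d(image, kernel))   # [[6, 8], [12, 14]]
```

Backward propagation, which this series derives, amounts to running convolutions like this one between the input patches and the gradients flowing back from the output.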

Vidisha Jitani

