Activation Functions | Deep Learning Tutorial 8 (Tensorflow2.0, Keras & Python)

Activation functions are essential for building a non-linear model for a given problem. In this video we will cover the different activation functions used while building a neural network, and discuss the pros and cons of each:
1) Step
2) Sigmoid
3) tanh
4) ReLU (rectified linear unit)
5) Leaky ReLU
We will also write Python code to implement these functions and see how they behave for sample inputs.
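As a preview of the code covered in the video, here is a minimal plain-Python sketch of the five activation functions above (the `alpha=0.1` slope for Leaky ReLU is one common choice, not a fixed standard):

```python
import math

def step(x):
    # Outputs 1 for non-negative inputs, else 0; not differentiable, so
    # it is rarely used in modern networks
    return 1 if x >= 0 else 0

def sigmoid(x):
    # Squashes input into (0, 1); saturates for large |x|, which can
    # cause vanishing gradients
    return 1 / (1 + math.exp(-x))

def tanh(x):
    # Squashes input into (-1, 1); zero-centered, but still saturates
    return math.tanh(x)

def relu(x):
    # max(0, x); cheap to compute and widely used, but neurons can
    # "die" when inputs stay negative
    return max(0, x)

def leaky_relu(x, alpha=0.1):
    # Like ReLU, but lets a small gradient through for negative inputs
    # (alpha is an assumed slope for illustration)
    return max(alpha * x, x)

# See how each function behaves for a few sample inputs
for fn in (step, sigmoid, tanh, relu, leaky_relu):
    print(fn.__name__, [round(fn(x), 4) for x in (-2.0, -0.5, 0.0, 0.5, 2.0)])
```

Running this prints one row per function, which makes the differences easy to see: step and sigmoid never go negative, tanh is zero-centered, ReLU clips negatives to zero, and Leaky ReLU keeps a small negative slope.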

Github link for code in this tutorial:

Next video:

Previous video:

Deep learning playlist:
Machine learning playlist:

Prerequisites for this series:  
   1: Python tutorials (first 16 videos):   
   2: Pandas tutorials (first 8 videos):
   3: Machine learning playlist (first 16 videos):