Activation Functions | Deep Learning Tutorial 8 (Tensorflow2.0, Keras & Python)


Activation functions are essential for building a non-linear model for a given problem. In this video we will cover the activation functions commonly used when building a neural network, along with their pros and cons:
1) Step
2) Sigmoid
3) tanh
4) ReLU (rectified linear unit)
5) Leaky ReLU
We will also write Python code to implement these functions and see how they behave for sample inputs.
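The five functions above can be sketched in plain NumPy as shown below. This is a minimal illustration of each formula (the exact code in the linked notebook may differ); the 0.01 slope used for Leaky ReLU is a common default, not a fixed rule.

```python
import numpy as np

def step(x):
    # Outputs 1 for non-negative inputs, 0 otherwise
    return np.where(x >= 0, 1, 0)

def sigmoid(x):
    # Squashes any input into the range (0, 1)
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # Squashes any input into the range (-1, 1); zero-centered
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through unchanged, zeroes out negatives
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope (alpha) for negative inputs
    return np.where(x > 0, x, alpha * x)

# Try the functions on a few sample inputs
sample = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
for fn in (step, sigmoid, tanh, relu, leaky_relu):
    print(fn.__name__, fn(sample))
```

Running this shows the key differences at a glance: step and sigmoid are never negative, tanh is zero-centered, and Leaky ReLU avoids the "dead neuron" problem by never outputting an exact zero gradient for negative inputs.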

GitHub link for code in this tutorial: https://github.com/codebasics/py/blob/master/DeepLearningML/2_activation_functions/2_activation_functions.ipynb

Next video: https://www.youtube.com/watch?v=cT4pQT5Da0Q&list=PLeo1K3hjS3uu7CxAacxVndI4bE_o3BDtO&index=9

Previous video: https://www.youtube.com/watch?v=iqQgED9vV7k&list=PLeo1K3hjS3uu7CxAacxVndI4bE_o3BDtO&index=7

Deep learning playlist: https://www.youtube.com/playlist?list=PLeo1K3hjS3uu7CxAacxVndI4bE_o3BDtO
Machine learning playlist: https://www.youtube.com/playlist?list=PLeo1K3hjS3uvCeTYTeyfe0-rN5r8zn9rw

Prerequisites for this series:
   1: Python tutorials (first 16 videos): https://www.youtube.com/playlist?list=PLeo1K3hjS3uv5U-Lmlnucd7gqF-3ehIh0
   2: Pandas tutorials (first 8 videos): https://www.youtube.com/playlist?list=PLeo1K3hjS3uuASpe-1LjfG5f14Bnozjwy
   3: Machine learning playlist (first 16 videos): https://www.youtube.com/playlist?list=PLeo1K3hjS3uvCeTYTeyfe0-rN5r8zn9rw