I will update this repository as I learn Deep Learning with TensorFlow and Keras
Day – 1: 25-8-2019
We learnt about:
- Basic building blocks of Neural Network
- Perceptron
- Neurons
- Hidden Layers
- Linear regression with Neural Networks
- Logistic regression with Neural Networks
- Non-linear activation functions
- tanh, step, sigmoid (logistic), ReLU, ELU
- Back propagation
- Vanishing and exploding gradients
- Ways to avoid vanishing and exploding gradients
- How to mitigate overfitting?
- TensorFlow – Keras practical
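The Day-1 ideas above can be sketched in a few lines of Keras. This is a minimal illustration, assuming TensorFlow 2.x; the layer sizes, dropout rate, and toy dataset are my own choices, not from the session. It shows a dense network with a non-linear (ReLU) hidden layer, dropout to mitigate overfitting, and a sigmoid output acting as a logistic-regression head:

```python
# Minimal sketch of Day-1 concepts (assumed TensorFlow 2.x; sizes are illustrative)
import numpy as np
from tensorflow import keras

def build_mlp(input_dim: int) -> keras.Model:
    """Small dense network: ReLU hidden layer + sigmoid output (logistic-regression head)."""
    model = keras.Sequential([
        keras.layers.Input(shape=(input_dim,)),
        keras.layers.Dense(16, activation="relu"),    # non-linear activation
        keras.layers.Dropout(0.2),                    # helps mitigate overfitting
        keras.layers.Dense(1, activation="sigmoid"),  # logistic output for binary labels
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Toy binary-classification data (purely illustrative)
    X = np.random.rand(100, 4).astype("float32")
    y = (X.sum(axis=1) > 2.0).astype("float32")
    model = build_mlp(4)
    model.fit(X, y, epochs=2, batch_size=16, verbose=0)  # backprop happens here
```

Swapping the sigmoid output for a plain linear `Dense(1)` and the loss for `"mse"` turns the same skeleton into the linear-regression variant from the list.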
Day – 2: 31-8-2019
- Parameter explosion in image recognition
- Convolution layer – kernel, filter, stride, padding, feature map
- Pooling Layer – max, min, average
- CNN architecture
- Keras implementation
- Image recognition: comparison of a basic NN and a CNN
- Advanced Deep CNN
- Pre Trained Models
- Transfer Learning – Resnet50
- Image Augmentation
- TensorBoard
- OpenCV, YOLOv3
- Sample Hackathon
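The convolution and pooling ideas from Day 2 can be sketched as a small Keras CNN. This is an illustrative model, assuming TensorFlow 2.x and MNIST-sized 28×28 grayscale inputs; the filter counts are my own choices. It shows the kernel/stride/padding parameters on `Conv2D` and a max-pooling layer shrinking the feature maps, which is how a CNN avoids the parameter explosion of a dense network on raw pixels:

```python
# Illustrative CNN for 28x28 grayscale images (assumed TF 2.x; sizes are examples)
from tensorflow import keras

def build_cnn(num_classes: int = 10) -> keras.Model:
    model = keras.Sequential([
        keras.layers.Input(shape=(28, 28, 1)),
        # 32 filters, 3x3 kernel, stride 1, "same" padding -> 28x28x32 feature maps
        keras.layers.Conv2D(32, kernel_size=3, strides=1, padding="same",
                            activation="relu"),
        keras.layers.MaxPooling2D(pool_size=2),   # max pooling halves spatial dims -> 14x14
        keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
        keras.layers.MaxPooling2D(2),             # -> 7x7x64
        keras.layers.Flatten(),
        keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Convolution layers share their kernel weights across the whole image, so the parameter count stays small regardless of input size; only the final dense layer depends on the flattened feature-map size.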
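Transfer learning with ResNet50 as listed above can be sketched with `keras.applications`. This is a hedged example, assuming TensorFlow 2.x; the head layers and class count are my own choices. Note that `weights="imagenet"` would load the pretrained ImageNet weights (a download); `weights=None` is used here only so the sketch builds offline:

```python
# Transfer-learning sketch with ResNet50 (assumed TF 2.x; head is illustrative)
from tensorflow import keras

def build_transfer_model(num_classes: int = 5) -> keras.Model:
    # In real transfer learning use weights="imagenet"; None keeps this sketch offline.
    base = keras.applications.ResNet50(weights=None, include_top=False,
                                       input_shape=(224, 224, 3))
    base.trainable = False  # freeze the pretrained backbone; train only the new head
    model = keras.Sequential([
        base,
        keras.layers.GlobalAveragePooling2D(),
        keras.layers.Dense(num_classes, activation="softmax"),  # new task-specific head
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Image augmentation (random flips, rotations, shifts) is typically applied to the training inputs of such a model to stretch a small dataset further.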
Day – 3: 01-9-2019
- The neural networks so far only know what is passed in at the current time
- What if we want to remember the last output to predict the future, when the data is sequential?
- Neuron with memory
- RNN architecture
- Back Propagation Through Time (BPTT)
- Problem with BPTT
- Vanishing and exploding gradients
- Truncated BPTT
- LSTM
- LSTM Architecture
- Keras LSTM implementation
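The Day-3 topics can be sketched as a Keras LSTM for sequence prediction. This is a minimal illustration, assuming TensorFlow 2.x; the sequence length, unit count, and sine-wave toy data are my own choices (loosely in the spirit of the stock/text references below, not copied from them). The LSTM layer is the "neuron with memory": its gated cell state carries information across timesteps, which is what lets it cope with the vanishing gradients that plague plain RNNs under BPTT:

```python
# Minimal LSTM sequence-prediction sketch (assumed TF 2.x; sizes are illustrative)
import numpy as np
from tensorflow import keras

def build_lstm(timesteps: int = 10, features: int = 1) -> keras.Model:
    model = keras.Sequential([
        keras.layers.Input(shape=(timesteps, features)),
        keras.layers.LSTM(32),        # memory cell: state carried across timesteps
        keras.layers.Dense(1),        # predict the next value in the sequence
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

if __name__ == "__main__":
    # Toy sequence data: predict the next point of a sine wave from the last 10
    t = np.arange(0, 50, 0.1)
    wave = np.sin(t).astype("float32")
    X = np.stack([wave[i:i + 10] for i in range(len(wave) - 10)])[..., None]
    y = wave[10:]
    model = build_lstm(10, 1)
    model.fit(X, y, epochs=2, batch_size=32, verbose=0)  # trained via BPTT
```

Truncated BPTT corresponds here to the fixed 10-step windows: gradients only flow back through the timesteps inside each training sample, not the entire series.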
References:
https://github.com/omerbsezer/LSTM_RNN_Tutorials_with_Demo#SampleStock
https://github.com/fchollet/deep-learning-with-python-notebooks/blob/master/8.1-text-generation-with-lstm.ipynb
https://github.com/dipanjanS/nlp_workshop_odsc19
https://github.com/buomsoo-kim/Easy-deep-learning-with-Keras