leriomaggio released this · 22 Aug 13:13 · 7 commits to master since this release
Deep Learning with Keras and Tensorflow
Valerio Maggio: PostDoc Data Scientist @ FBK/MPBA
Contacts: @leriomaggio | [email protected]
Installed Versions
```python
import keras
print('keras: ', keras.__version__)

# optional
import theano
print('Theano: ', theano.__version__)

import tensorflow as tf
print('Tensorflow: ', tf.__version__)
```

```
keras: 2.0.4
Theano: 0.9.0
Tensorflow: 1.2.1
```
Outline

- Part I: Introduction
    - Intro to Artificial Neural Networks
        - Perceptron and MLP
        - naive pure-Python implementation
        - fast forward, sgd, backprop
    - Introduction to Deep Learning Frameworks
        - Intro to Theano
        - Intro to Tensorflow
        - Intro to Keras
            - Overview and main features
            - Overview of the `core` layers
            - Multi-Layer Perceptron and Fully Connected
                - Examples with `keras.models.Sequential` and `Dense` (see the sketch after this part)
            - Keras Backend
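As a taste of the `Sequential`/`Dense` material in Part I, a minimal fully connected network in the Keras 2.0.x API listed above might look like the sketch below; the 784-dimensional flattened MNIST-style input and the layer sizes are illustrative assumptions, not the exact models used in the notebooks.

```python
from keras.models import Sequential
from keras.layers import Dense

# Minimal MLP: flattened 784-d input -> one hidden layer -> 10-class softmax
model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(784,)))
model.add(Dense(10, activation='softmax'))

# Plain SGD with categorical cross-entropy, as in a classic MNIST setup
model.compile(optimizer='sgd', loss='categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
```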
- Part II: Supervised Learning
    - Fully Connected Networks and Embeddings
        - Intro to MNIST Dataset
        - Hidden Layer Representation and Embeddings
    - Convolutional Neural Networks (see the CNN sketch after this part)
        - meaning of convolutional filters
            - examples from ImageNet
        - Visualising ConvNets
        - Advanced CNN
            - Dropout
            - MaxPooling
            - Batch Normalisation
        - HandsOn: MNIST Dataset
            - FC and MNIST
            - CNN and MNIST
        - Deep Convolutional Neural Networks with Keras (ref: `keras.applications`)
            - VGG16
            - VGG19
            - ResNet50
        - Transfer Learning and FineTuning
        - Hyperparameters Optimisation
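The building blocks listed under Advanced CNN (Dropout, MaxPooling, Batch Normalisation) and the pre-trained `keras.applications` models can be combined as in the sketch below. It assumes the channels-last image format of the TensorFlow backend and illustrative layer sizes; it is not the exact architecture used in the notebooks.

```python
from keras.models import Sequential
from keras.layers import (Conv2D, MaxPooling2D, BatchNormalization,
                          Dropout, Flatten, Dense)

# Small ConvNet for 28x28x1 (MNIST-like) images, channels-last format
model = Sequential()
model.add(Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dense(10, activation='softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

# Transfer learning: load a pre-trained VGG16 from keras.applications
# and freeze its convolutional base before adding a new classifier on top.
from keras.applications.vgg16 import VGG16
base = VGG16(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
for layer in base.layers:
    layer.trainable = False
```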
- Part III: Unsupervised Learning
    - AutoEncoders and Embeddings
    - AutoEncoders and MNIST (see the sketch after this part)
        - word2vec and doc2vec (gensim) with `keras.datasets`
        - word2vec and CNN
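A single-bottleneck autoencoder on flattened MNIST digits, in the spirit of Part III, might be sketched as follows; the 32-dimensional embedding size is an illustrative assumption.

```python
from keras.models import Model
from keras.layers import Input, Dense

# Encoder/decoder around a 32-d bottleneck; the encoder alone maps each
# digit to its learned embedding.
inputs = Input(shape=(784,))
encoded = Dense(32, activation='relu')(inputs)
decoded = Dense(784, activation='sigmoid')(encoded)

autoencoder = Model(inputs, decoded)
encoder = Model(inputs, encoded)
autoencoder.compile(optimizer='adadelta', loss='binary_crossentropy')
```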
- Part IV: Recurrent Neural Networks
    - Recurrent Neural Network in Keras
        - `SimpleRNN`, `LSTM`, `GRU`
    - LSTM for Sentence Generation (see the sketch after this part)
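The recurrent layers share the same call signature, so a character-level sentence-generation model in the style of the standard Keras text-generation example might be sketched as below; `maxlen` and `n_chars` are placeholder values.

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

maxlen, n_chars = 40, 60  # placeholder window length and alphabet size

# Predict the next character from a window of one-hot encoded characters;
# SimpleRNN or GRU can be swapped in for LSTM with the same signature.
model = Sequential()
model.add(LSTM(128, input_shape=(maxlen, n_chars)))
model.add(Dense(n_chars, activation='softmax'))
model.compile(optimizer='rmsprop', loss='categorical_crossentropy')
```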
- Part V: Additional Materials
    - Custom Layers in Keras (see the sketch below)
    - Multi-modal Network Topologies with Keras
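A minimal custom layer in the Keras 2.0 style subclasses `Layer` and implements `build`, `call` and `compute_output_shape`; the `Scale` layer below is a made-up example, not one from the notebooks. Multi-modal topologies are instead built with the functional API (`keras.models.Model`) by feeding several `Input` tensors into merge layers such as `Concatenate`.

```python
from keras.engine.topology import Layer


class Scale(Layer):
    """Hypothetical layer learning one multiplicative factor per input feature."""

    def build(self, input_shape):
        # One trainable scale per feature of a 2-D (batch, features) input
        self.w = self.add_weight(name='w', shape=(input_shape[1],),
                                 initializer='ones', trainable=True)
        super(Scale, self).build(input_shape)

    def call(self, x):
        return x * self.w

    def compute_output_shape(self, input_shape):
        return input_shape
```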