TransformIT NVIDIA Training

DLI Teaching Kit Syllabus
Module 1: Introduction to Machine Learning
Lecture Slides
● 1.1 Course Introduction
● 1.2 Introduction to Machine Learning
● 1.3 Introduction to Neural Networks
Module 2: Introduction to Deep Learning
Lecture Slides
● 2.1 Introduction to Deep Learning
● 2.2 Deep Supervised Learning (modular approach) – Part 1
● 2.3 Deep Supervised Learning (modular approach) – Part 2
NVIDIA DLI Qwiklab 1: Image Classification with NVIDIA DIGITS (link)
Lab 1
● 1.1 Backpropagation
○ Logistic Regression
○ Softmax Expression
● 1.2 MNIST Handwritten Digit Recognition (Torch) (programming)
NVIDIA DLI Qwiklab 2: Object Detection with NVIDIA DIGITS (link)
Module 3: Convolutional Neural Networks
Lecture Slides
● 3.1 History of Convolutional Networks
● 3.2 Convolutional Networks and Computer Vision, Audio and Other Domains
● 3.3 Structured Prediction and Natural Language Processing
Lab 2A
● 2A.1 More Backpropagation
● 2A.2 STL10: Semi-supervised Image Recognition (Torch) (programming)
○ Visualizing filters and augmentations
○ t-SNE
Lab 2B
● 2B.1 Backpropagation
○ Nonlinear activation functions
○ Softmax
● 2B.2 Techniques
○ Optimization
○ Reducing overfitting
○ Initialization
● 2B.3 MNIST: Semi-supervised Image Recognition (PyTorch) (programming)
NVIDIA DLI Qwiklab 3: Image Segmentation with TensorFlow (link)
Module 4: Energy-based Learning
Lecture Slides
● 4.1 Energy-based Learning
● 4.2 Unsupervised Learning
● 4.3 Sparse Coding
Lab 3
● 3.1 Generative Adversarial Networks
● 3.2 GAN Workhouse (PyTorch) (programming)
Module 5: Optimization Techniques
Lecture Slides
● 5.1 Efficient Learning and Second-order Methods
Module 6: Learning with Memory
Lecture Slides
● 6.1 Recurrent Neural Network Basics
● 6.2 Advanced Recurrent Neural Networks
● 6.3 Sequential Data Modeling
● 6.4 Embedding Methods for NLP: Unsupervised and Supervised Embeddings
● 6.5 Embedding Methods for NLP: Embeddings for Multi-relational Data
● 6.6 Deep Natural Language Processing
Lab 4A
● 4A.1 nngraph (Torch) (programming)
● 4A.2 Language Modeling (Torch) (programming)
Lab 4B
● 4B.1 Batch Normalization
● 4B.2 Convolution
● 4B.3 Variants of Pooling
● 4B.4 t-SNE
● 4B.5 Sentence Classification
● 4B.6 Language Modeling (PyTorch) (programming)
Module 7: Future Challenges
Lecture Slides
● 7.1 Future Challenges
Quiz/exam Sample Problem Sets
Quiz/exam Sample Problem Set 1 (with solutions)
● 1.1 General Questions
● 1.2 Softmax Regression
● 1.3 Chain Rule
● 1.4 Variants of Pooling
● 1.5 Convolution
● 1.6 Optimization
● 1.7 Top-k Error
● 1.8 t-SNE
● 1.9 Proximal Gradient Descent
Quiz/exam Sample Problem Set 2
● 2.1 Quick Basic Knowledge Questions
● 2.2 Multinomial Logistic Regression
● 2.3 Metric Learning with NCA
● 2.4 Sparse Coding
● 2.5 Convolutional Networks
● 2.6 Dataset Features
● 2.7 Backpropagation
● 2.8 More Backpropagation
● 2.9 Convergence of Linear Regression
Quiz/exam Sample Problem Set 3
● 3.1 General Questions
● 3.2 Sort Module
● 3.3 Softmax
● 3.4 Shared Weights
● 3.5 ConvNet
● 3.6 Energy-Based Learning
● 3.7 Sparse Coding
● 3.8 Auto-Encoders
● 3.9 Optimization
● 3.10 Optimization in Multi-layer Nets
Quiz/exam Sample Problem Set 4
● 4.1 General Questions
● 4.2 Pooling
● 4.3 ConvNet Basics
● 4.4 ConvNets: Object Detection
● 4.5 ConvNets: Weak Supervision
● 4.6 Word, Text, and Image Embedding
● 4.7 Recurrent Nets
● 4.8 Energy-Based Learning: Weakly Supervised Object Localization
● 4.9 Unsupervised Learning and Auto-Encoders
● 4.10 Optimization