Deep Learning BCS714A
Course Code: BCS714A
Credits: 03
CIE Marks: 50
SEE Marks: 50
Total Marks: 100
Exam Hours: 03
Total Hours of Pedagogy: 40 hours
Teaching Hours/Week [L:T:P:S]: 3:0:0:0
Introducing Deep Learning: Biological and Machine Vision: Biological Vision; Machine Vision: The Neocognitron, LeNet-5, The Traditional Machine Learning Approach, ImageNet and the ILSVRC, AlexNet, TensorFlow Playground.
Human and Machine Language: Deep Learning for Natural Language Processing: Deep Learning Networks Learn Representations Automatically, Natural Language Processing, A Brief History of Deep Learning for NLP; Computational Representations of Language: One-Hot Representations of Words, Word Vectors, Word-Vector Arithmetic, word2viz, Localist Versus Distributed Representations; Elements of Natural Human Language.
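A minimal NumPy sketch (not from the prescribed text; the toy vocabulary and vector values are invented for illustration) contrasting localist one-hot representations with dense, distributed word vectors, and showing the kind of word-vector arithmetic covered in this module:

```python
# Illustrative only: a toy vocabulary and hand-made 3-d "word vectors",
# chosen to contrast localist (one-hot) with distributed (dense) representations.
import numpy as np

vocab = ["king", "queen", "man", "woman"]          # toy vocabulary (assumption)

# One-hot (localist): each word owns a single dimension; no notion of similarity.
one_hot = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}

# Dense word vectors (distributed): meaning spread across dimensions.
# These numbers are made up purely for illustration, not learned embeddings.
vec = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Word-vector arithmetic: king - man + woman should land nearest to queen.
target = vec["king"] - vec["man"] + vec["woman"]
nearest = max((w for w in vec if w != "king"), key=lambda w: cosine(vec[w], target))
print(nearest)   # expected: queen
```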
Regularization for Deep Learning: Parameter Norm Penalties, Norm Penalties as Constrained Optimization, Regularization and Under-Constrained Problems, Dataset Augmentation, Noise Robustness, Semi-Supervised Learning, Multi-Task Learning, Early Stopping, Parameter Tying and Parameter Sharing, Sparse Representations.
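As a minimal illustration of a parameter norm penalty (a sketch only; the synthetic data, the regularization strength, and the learning rate below are assumed values), the snippet adds an L2 weight-decay term to the gradient of a linear least-squares loss:

```python
# Minimal sketch of an L2 parameter norm penalty (weight decay) on linear
# regression, trained with plain gradient descent on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = np.array([1.0, -2.0, 0.0, 0.0, 3.0])
y = X @ true_w + 0.1 * rng.normal(size=100)

lam, lr = 0.1, 0.05          # penalty strength and learning rate (assumed values)
w = np.zeros(5)
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)   # gradient of the data loss
    grad += lam * w                      # gradient of the penalty (lam/2)*||w||^2
    w -= lr * grad

print(np.round(w, 2))   # weights are shrunk toward zero relative to the unpenalized fit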
Optimization for Training Deep Models: How Learning Differs from Pure Optimization, Basic Algorithms, Parameter Initialization Strategies, Algorithms with Adaptive Learning Rates.
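The sketch below contrasts one basic algorithm (SGD with momentum) with one adaptive-learning-rate method (Adam) on a toy ill-conditioned quadratic; the hyperparameter values are common defaults assumed for illustration, not prescribed by the syllabus.

```python
# Two update rules from this module on the toy quadratic loss f(w) = 0.5 * w^T A w.
import numpy as np

A = np.diag([1.0, 50.0])                 # badly conditioned quadratic

def grad(w):
    return A @ w

def sgd_momentum(w, steps=200, lr=0.01, beta=0.9):
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v - lr * grad(w)      # velocity accumulates past gradients
        w = w + v
    return w

def adam(w, steps=200, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    m, v = np.zeros_like(w), np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g        # first-moment estimate
        v = b2 * v + (1 - b2) * g * g    # second-moment estimate (per-parameter scale)
        m_hat, v_hat = m / (1 - b1**t), v / (1 - b2**t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

w0 = np.array([5.0, 5.0])
print(sgd_momentum(w0.copy()), adam(w0.copy()))   # both should approach the minimum at the origin
```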
Convolutional Neural Networks: The Convolution Operation, Motivation, Pooling, Convolution and Pooling as an Infinitely Strong Prior, Variants of the Basic Convolution Function, Structured Outputs, Data Types, Efficient Convolution Algorithms, Convolutional Networks and the History of Deep Learning.
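A naive NumPy sketch of the convolution operation (written as cross-correlation, the form most deep learning libraries use) and non-overlapping max pooling; the input and kernel are toy values chosen only to show the mechanics:

```python
import numpy as np

def conv2d(x, k):
    """Valid cross-correlation of a 2-D input x with a 2-D kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)   # weighted local sum
    return out

def max_pool2d(x, size=2):
    """Non-overlapping max pooling with a square window."""
    H, W = x.shape
    x = x[:H - H % size, :W - W % size]           # trim so the window tiles evenly
    return x.reshape(H // size, size, W // size, size).max(axis=(1, 3))

rng = np.random.default_rng(0)
image = rng.normal(size=(6, 6))                   # toy single-channel "image"
kernel = np.array([[1.0, -1.0], [1.0, -1.0]])     # crude vertical-edge detector
print(max_pool2d(conv2d(image, kernel)))          # pooled map of filter responses
```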
Sequence Modelling: Recurrent and Recursive Nets: Unfolding Computational Graphs, Recurrent Neural Networks, Bidirectional RNNs, Encoder-Decoder Sequence-to-Sequence Architectures, Deep Recurrent Networks, Recursive Neural Networks, Long Short-Term Memory.
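A minimal sketch of unfolding a vanilla recurrent cell through time, h_t = tanh(W_xh x_t + W_hh h_{t-1} + b); the dimensions, random weights, and input sequence are placeholders for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim, T = 4, 3, 5                # toy sizes (assumed)

W_xh = rng.normal(scale=0.5, size=(hidden_dim, input_dim))
W_hh = rng.normal(scale=0.5, size=(hidden_dim, hidden_dim))
b = np.zeros(hidden_dim)

xs = rng.normal(size=(T, input_dim))              # a toy input sequence
h = np.zeros(hidden_dim)                          # initial hidden state

for t in range(T):                                # "unfolding" the graph in time
    h = np.tanh(W_xh @ xs[t] + W_hh @ h + b)      # same weights reused at every step
    print(f"h_{t + 1} =", np.round(h, 3))
```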
Interactive Applications of Deep Learning: Natural Language Processing: Preprocessing Natural Language Data: Tokenization, Converting All Characters to Lowercase, Removing Stop Words and Punctuation, Stemming, Handling n-grams, Preprocessing the Full Corpus; Creating Word Embeddings with word2vec: The Essential Theory Behind word2vec, Evaluating Word Vectors, Running word2vec, Plotting Word Vectors; The Area under the ROC Curve: The Confusion Matrix, Calculating the ROC AUC Metric; Natural Language Classification with Familiar Networks: Loading the IMDb Film Reviews, Examining the IMDb Data, Standardizing the Length of the Reviews, Dense Network, Convolutional Networks; Networks Designed for Sequential Data: Recurrent Neural Networks, Long Short-Term Memory Units, Bidirectional LSTMs, Stacked Recurrent Models, Seq2seq and Attention, Transfer Learning in NLP.
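The sketch below illustrates two small pieces of this module in plain Python: elementary text preprocessing (tokenization, lowercasing, stop-word and punctuation removal) and the ROC AUC computed directly as the probability that a positive example outscores a negative one. The stop-word list, example sentence, labels, and scores are all made up for illustration.

```python
import re

STOP_WORDS = {"the", "a", "an", "is", "and"}     # illustrative subset (assumption)

def preprocess(text):
    tokens = re.findall(r"[a-z']+", text.lower())         # tokenize + lowercase, drop punctuation
    return [t for t in tokens if t not in STOP_WORDS]      # remove stop words

print(preprocess("The film is AN unexpected and moving triumph."))
# ['film', 'unexpected', 'moving', 'triumph']

def roc_auc(labels, scores):
    """AUC as the fraction of positive/negative pairs ranked correctly (ties count 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy classifier scores for four reviews (labels: 1 = positive sentiment).
print(roc_auc([1, 1, 0, 0], [0.9, 0.4, 0.6, 0.2]))   # 0.75
```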