Course Code | Course Title | L | T | P | U
CSE431 | DEEP LEARNING | 3 | 0 | 0 | 3
Course Objectives: The student learns various state-of-the-art deep learning algorithms and their application
to real-world problems. The student develops skills to design neural network architectures and training
procedures using various deep learning platforms and software libraries.
Course Learning Outcomes:
On completing this course, the student will be able to:
CO1: describe feedforward and deep networks.
CO2: design single- and multi-layer feedforward deep networks and tune various hyperparameters.
CO3: analyze the performance of deep networks.
Unit-I
Introduction to machine learning:
Linear models (SVMs, perceptrons, logistic regression). Introduction to neural nets: what a shallow network
computes. Training a network: loss functions, backpropagation, and stochastic gradient descent. Neural
networks as universal function approximators.
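For orientation, a minimal illustrative sketch (not part of the prescribed syllabus content) of the Unit-I training pipeline: a one-hidden-layer shallow network fitted with a squared-error loss, manual backpropagation, and stochastic gradient descent. The toy data, layer sizes, and learning rate are placeholder assumptions.

```python
# Minimal sketch (NumPy): a one-hidden-layer "shallow" network trained with a
# squared-error loss, manual backpropagation, and stochastic gradient descent.
# All data and sizes below are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                          # toy inputs
y = (X[:, 0] * X[:, 1] > 0).astype(float)[:, None]     # toy binary targets

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    for i in rng.permutation(len(X)):                  # stochastic: one example per update
        x, t = X[i:i + 1], y[i:i + 1]
        h = np.tanh(x @ W1 + b1)                       # forward pass
        p = sigmoid(h @ W2 + b2)
        dz2 = 2 * (p - t) * p * (1 - p)                # backprop: d(loss)/d(pre-activation)
        dW2, db2 = h.T @ dz2, dz2.sum(0)
        dh = dz2 @ W2.T * (1 - h ** 2)                 # tanh derivative
        dW1, db1 = x.T @ dh, dh.sum(0)
        W2 -= lr * dW2; b2 -= lr * db2                 # SGD parameter update
        W1 -= lr * dW1; b1 -= lr * db1
```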
Unit-II
Historical context and motivation for deep learning; the basic supervised classification task; optimizing a
logistic classifier using gradient descent, stochastic gradient descent, momentum, and adaptive subgradient
methods. Feedforward neural networks, deep networks, regularizing a deep network, model exploration, and
hyperparameter tuning.
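A minimal sketch of the optimizers named in this unit, assuming a toy logistic classifier: the same stochastic gradient is applied under plain gradient descent, momentum, and an AdaGrad-style adaptive subgradient update. Data shapes and hyper-parameters are illustrative assumptions only.

```python
# Minimal sketch (NumPy): one update step of a logistic classifier under the
# optimizers listed in Unit-II. Only the AdaGrad branch is applied below; the
# plain-SGD and momentum rules are shown as comments for comparison.
import numpy as np

def logistic_grad(w, X, y):
    """Gradient of the average logistic (cross-entropy) loss at weights w."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(64, 5)), rng.integers(0, 2, size=64).astype(float)
w, v, G = np.zeros(5), np.zeros(5), np.zeros(5)
lr, beta, eps = 0.1, 0.9, 1e-8

for step in range(100):
    idx = rng.choice(len(y), size=16)          # mini-batch -> stochastic gradient
    g = logistic_grad(w, X[idx], y[idx])
    # plain SGD:   w -= lr * g
    # momentum:    v = beta * v - lr * g; w += v
    # AdaGrad:     accumulate squared gradients, scale each coordinate's step
    G += g ** 2
    w -= lr * g / (np.sqrt(G) + eps)
```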
Unit-III
Convolutional Neural Networks: Introduction to convolutional neural networks: stacking, striding, and pooling;
applications such as image and text classification.
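A minimal sketch, assuming PyTorch as one of the course's deep learning libraries, of a small convolutional network illustrating stacking, striding, and pooling for image classification. The 1x28x28 input size and the 10 output classes are illustrative assumptions.

```python
# Minimal sketch (PyTorch): stacked convolution layers with striding and pooling,
# ending in a linear layer that produces class scores for image classification.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),              # conv layer, 28x28 -> 28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                                          # pooling, 28x28 -> 14x14
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),    # strided conv, 14x14 -> 7x7
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                                # scores for 10 classes
)

x = torch.randn(8, 1, 28, 28)      # a dummy batch of 8 single-channel images
print(model(x).shape)              # torch.Size([8, 10])
```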
Unit-IV
Sequence Modeling: Recurrent Nets: Unfolding computational graphs, recurrent neural networks (RNNs),
bidirectional RNNs, encoder-decoder sequence-to-sequence architectures, deep recurrent networks. Generative
adversarial networks (GANs).
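A minimal sketch, again assuming PyTorch, of an encoder-decoder sequence-to-sequence pair built from recurrent (LSTM) layers. Vocabulary size, embedding size, and sequence lengths are placeholder assumptions.

```python
# Minimal sketch (PyTorch): an encoder LSTM summarises the source sequence into its
# final state, and a decoder LSTM conditioned on that state scores the target tokens.
import torch
import torch.nn as nn

vocab, emb, hid = 100, 32, 64
embed = nn.Embedding(vocab, emb)
encoder = nn.LSTM(emb, hid, batch_first=True)   # bidirectional=True would double the
decoder = nn.LSTM(emb, hid, batch_first=True)   # state directions and need a projection
to_vocab = nn.Linear(hid, vocab)

src = torch.randint(0, vocab, (4, 12))          # batch of 4 source sequences, length 12
tgt = torch.randint(0, vocab, (4, 9))           # batch of 4 target sequences, length 9

_, state = encoder(embed(src))                  # state = (h_n, c_n) from the encoder
out, _ = decoder(embed(tgt), state)             # decoder starts from the encoder's state
logits = to_vocab(out)                          # per-step scores over the target vocabulary
print(logits.shape)                             # torch.Size([4, 9, 100])
```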
Unit-V
Autoencoders: Undercomplete autoencoders, regularized autoencoders, sparse autoencoders, denoising
autoencoders; representational power, layer size, and depth of autoencoders; stochastic encoders and decoders.
Long short-term memory (LSTM) networks.
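A minimal sketch, assuming PyTorch, of an undercomplete autoencoder (the bottleneck is smaller than the input) trained as a denoising autoencoder by corrupting its input. The 784-dimensional input and the 64-unit bottleneck are illustrative assumptions; the commented line shows how an LSTM layer, also listed in this unit, would be constructed.

```python
# Minimal sketch (PyTorch): an undercomplete, denoising autoencoder trained to
# reconstruct the clean input from a corrupted copy.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(784, 64), nn.ReLU())     # 784 -> 64 bottleneck
decoder = nn.Sequential(nn.Linear(64, 784), nn.Sigmoid())  # 64 -> 784 reconstruction
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.rand(32, 784)                   # a dummy batch of flattened images
noisy = x + 0.2 * torch.randn_like(x)     # corruption step -> denoising autoencoder
recon = decoder(encoder(noisy))
loss = loss_fn(recon, x)                  # reconstruct the clean input
opt.zero_grad(); loss.backward(); opt.step()

# An LSTM layer would be constructed as, e.g.:
# lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
```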
Textbook(s):
T1. Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press, 2016.
T2. Jeff Heaton, Deep Learning and Neural Networks, Heaton Research, Inc., 2015.
Reference Books:
R1. Mindy L. Hall, Deep Learning, VDM Verlag, 2011.
R2. Li Deng and Dong Yu, Deep Learning: Methods and Applications (Foundations and Trends in Signal
Processing), Now Publishers Inc., 2009.
Lecture-wise plan:

Lecture No. | Learning objective / Topics to be covered | Reference (Sec. No. of Text/Ref Books)
1-2 | Introduction to machine learning; different ML methods | T1: pp. 1-8
3-10 | Introduction to neural nets: what a shallow network computes; training a network: loss functions, backpropagation, and stochastic gradient descent; neural networks as universal function approximators | R1: pp. 30-42; T2: pp. 20-35
11-15 | Historical context and motivation for deep learning; the basic supervised classification task; optimizing a logistic classifier using gradient descent | T1: pp. 22-34
15-19 | Momentum and adaptive subgradient methods | T2: pp. 51-62
20-29 | Convolutional neural networks: stacking, striding, and pooling; applications such as image and text classification | R2: pp. 153-167; R1: pp. 46-65
30-35 | Sequence modeling: recurrent nets: unfolding computational graphs, recurrent neural networks (RNNs) | T1: pp. 37-49
36-39 | Bidirectional RNNs, encoder-decoder sequence-to-sequence architectures, deep recurrent networks | T1: pp. 70-89
40-45 | Autoencoders: undercomplete, regularized, sparse, and denoising autoencoders; representational power, layer size, and depth of autoencoders; stochastic encoders and decoders | T1: pp. 95-120
Evaluation Scheme:

Component | Duration | Weightage (%) | Remarks
Internal I | - | 25 | -
Mid Term Exam | 2 hours | 20 | Closed Book
Internal II | - | 25 | -
Comprehensive Exam | 3 hours | 30 | Closed Book
1. Attendance Policy: A student must normally maintain a minimum of 75% attendance in the course,
without which he/she shall be disqualified from appearing in the respective examination.
2. Make-up Policy: A student who misses any component of evaluation for genuine reasons must
immediately approach the instructor with a request for a make-up examination, stating the reasons. The decision
of the instructor in all matters of make-up shall be final.
3. Chamber Consultation Hours: During the chamber consultation hours, students can consult the
respective faculty member in his/her chamber without prior appointment.