CS568 Deep Learning
Spring 2021
Nazar Khan
The ability of biological brains to sense, perceive, analyse and recognise patterns can only be described as stunning. They can also learn from new examples. Mankind's understanding of exactly how biological brains operate is embarrassingly limited. However, numerous 'practical' techniques do exist that give machines the 'appearance' of being intelligent. This is the domain of statistical pattern recognition and machine learning. Instead of attempting to mimic the complex workings of a biological brain, this course aims to explain mathematically well-founded techniques for analysing patterns and learning from them.
Artificial Neural Networks, as extremely simplified models of the human brain, have existed for almost 75 years. However, the last 25 years have seen a tremendous unlocking of their potential. This progress has been a direct result of a collection of network architectures and training techniques that have come to be known as Deep Learning. As a result, Deep Learning has taken over its parent fields of Neural Networks, Machine Learning and Artificial Intelligence, and is quickly becoming must-have knowledge in many academic disciplines as well as in industry.
This course is a mathematically involved introduction to the wonderful world of deep learning. It will prepare students for further study and research in Pattern Recognition, Machine Learning, Computer Vision, Data Analysis, Natural Language Processing, Speech Recognition, Machine Translation, Autonomous Driving and other areas that tackle Artificial Intelligence (AI) problems.
CS 568 is a graduate course worth 3 credit hours.
Lectures: Monday and Wednesday, 8:30 a.m. - 9:55 a.m. @ https://meet.google.com/stg-pjvm-vnb
Office Hours: Monday, 2:00 p.m. - 3:00 p.m. @ https://meet.google.com/njc-gvuy-wtj
Recitations: Friday, 8:30 a.m. - 10:00 a.m. @ https://meet.google.com/kqu-rbny-acz
TA: Arbish Akram
Prerequisites
Books and Other Resources
No single book will be followed as the primary text. Helpful online and offline resources include:
Grades
Grading sheet (Accessible only through your PUCIT email account)
Lectures
# | Date | Topics | Slides | Videos | Recitations | Readings | Miscellaneous
--- | --- | --- | --- | --- | --- | --- | ---
1 | January 18 | | | | Friday, January 22: Recitation 0 | |
2 | January 25 | | | | Friday, January 29: Recitation 1 | |
3 | February 1 | | | | | | Quiz 1
4 | February 3 | | | | Friday, February 5: Recitation 2 | |
5 | February 8 | | | | | Loss Functions and Activation Functions for Machine Learning | Quiz 2
6 | February 10 | | | | Friday, February 12: Recitation 3 | | Assignment 1
7 | February 15 | | | | | | Quiz 3
8 | February 17 | | | | Friday, February 19: Recitation 4 | |
9 | February 22 | | | | | | Quiz 4; Assignment 2: Initialization and ADAM
10 | February 24 | | | | Friday, February 26: Recitation 5 | |
11 | March 1 | | | | | | Quiz 5
12 | March 3 | | | | Friday, March 5: Recitation 6 | | Assignment 3: Regularization
13 | March 8 | | | | | | Quiz 6
14 | March 10 | | | | Friday, March 12: Recitation 7 | | Assignment 4: CNN
15 | March 15 | | | | | |
16 | March 17 | | | | Friday, March 19: No Recitation | |
17 | March 22 | | | | | | Quiz 7 (delayed from previous week)
18 | March 24 | | | | Friday, March 26: Recitation 8 | |
19 | March 29 | | | | | | Assignment 5: RNN
20 | March 31 | | | | Friday, April 2: Recitation 9 | |
21 | April 5 | | | | | |
22 | April 7 | | | | Friday, April 9: Recitation 10 | |
23 | April 12 | | | | | |
24 | April 14 | | | | No Recitation | |
25 | April 19 | | | | | |
26 | April 21 | | | | Friday, April 22: Recitation 11 | |
27 | April 26 | | | | | |
28 | April 28 | | | | | |
 | May 7 | | | | | |