Course Syllabus and Policies: see the course handout.
- Instructor: Jimmy Ba
- Head TA: Jenny Bao
Please do not send the instructor or the TAs email about the class directly to their personal accounts.
Office hours: Instructor: Wed 5–6, PT290D; TAs: Thu 3–4, PT290C.
Piazza: Students are encouraged to sign up for Piazza to join course discussions. If your question is about the course material and doesn’t give away any hints for the homework, please post to Piazza so that the entire class can benefit from the answer.
Lecture hours: There are two sections of the course.
| Section | Lecture | Lecture Room | Tutorial | Tutorial Room |
|---------|---------|--------------|----------|---------------|
| Section 1 | Tuesday 6–8 | LM 161 | Tuesday 8–9 | LM 161 |
| Section 2 | Thursday 1–3 | MC 102 | Tuesday 12–1 | ES 1050 |
- Mar 30: Homework 5 handout is due Apr 6th. Additional TA office hours are Fri 6-7pm and Mon 3-4pm.
- Mar 25: Programming Assignment 4 handout is due Mar 31st. Additional TA office hours this week are Fri 6-7pm and Mon 3-4pm via Zoom. Please check your email for the link to the office hours.
- Mar 17: Homework 4 handout is due Mar 24th. Additional TA office hours this week are Thurs 3-4pm and Fri 6-7pm via Zoom. Please check your email for the link to the office hours.
- Mar 17: We are now moving all the lectures, the tutorials and the office hours online. Please check your email for the link to the online sessions.
- Mar 9: Programming Assignment 3 handout is due Mar 18th (extended from Mar 16th). Additional TA office hours this week are Thurs 3-4pm and Fri 6-7pm.
- Mar 5: Homework 3 deadline extended till Mar 6th.
- Feb 27: Homework 3 handout is due Mar 6th (extended from Mar 5th). Additional TA office hours this week are Mon 6-7pm and Thurs 3-4pm.
- Feb 12: Programming Assignment 2 handout starter code is due Feb 26th.
- Feb 04: Grad students in CSC2516 will work on a course project, due April 20th, in place of the final exam. Project consultation appointments will be announced soon.
- Feb 03: Homework 2 handout is due Feb 10th (originally Feb 12th). Additional TA office hours this week are Thurs 3-4pm and Fri 6-7pm.
- Jan 27: Programming Assignment 1 handout starter code is due Feb 3rd.
- Jan 24: Both Tues noon tutorials will be merged and hosted at ES 1050 from now on.
- Jan 20: Homework 1 handout is due Jan 27th.
- Jan 14: Piazza is now available.
It is very hard to hand-design programs to solve many real-world problems, e.g. distinguishing images of cats vs. dogs. Machine learning algorithms allow computers to learn from example data and produce a program that does the job. Neural networks are a class of machine learning algorithms originally inspired by the brain, but which have recently seen a lot of success in practical applications. They are at the heart of production systems at companies like Google and Facebook for image processing, speech-to-text, and language understanding. This course gives an overview of both the foundational ideas and the recent advances in neural net algorithms.
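As a minimal illustration of "learning from example data" (a sketch for intuition only, not course code), the snippet below trains a logistic-regression classifier, i.e. the simplest neural network (a single unit with no hidden layer), by gradient descent on a toy two-class dataset:

```python
import numpy as np

# Toy stand-in for "cats vs. dogs": 2-D points drawn from two well-separated classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Single-unit network: weights w, bias b, sigmoid output.
w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(200):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted probability of class 1
    grad_w = X.T @ (p - y) / len(y)      # gradient of mean cross-entropy loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w                     # gradient descent step
    b -= lr * grad_b

acc = np.mean(((X @ w + b) > 0) == y)
print(f"training accuracy: {acc:.2f}")
```

The same loop (forward pass, loss gradient, parameter update) is what backpropagation scales up to deep networks; frameworks like PyTorch, used in the tutorials, compute the gradients automatically.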
| Assignment | Materials | Dates |
|------------|-----------|-------|
| Homework 1 | | Jan. 20 (out), due Jan. 27 |
| Programming Assignment 1 | pdf, starter code | Jan. 27 (out), due Feb. 03 |
| Homework 2 | | Feb. 3 (out), due Feb. 10 |
| Programming Assignment 2 | pdf, starter code | Feb. 12 (out), due Feb. 26 |
| Course Project (CSC2516 only) | project guideline | Proposal due Mar. 2, final report due April 20 |
| Homework 3 | | Feb. 27 (out), due Mar. 5 |
| Programming Assignment 3 | pdf, starter code 1 and starter code 2 | Mar. 9 (out), due Mar. 18 |
| Homework 4 | | Mar. 17 (out), due Mar. 24 |
| Programming Assignment 4 | pdf, starter code 1 and starter code 2 | Mar. 25 (out), due Mar. 31 |
| Homework 5 | | Mar. 30 (out), due Apr. 6 |
Midterm: Feb. 14, 6–8 pm

| Room | Surnames |
|------|----------|
| MS2170 | A – G (CSC413 only) |
| MS2172 | H – L (CSC413 only) |
| MS3153 | M – Y (CSC413 only) |
| MS2173 | Z (CSC413 only) |
CSC413 and CSC2516 have different exam rooms.
The midterm will cover the lecture material up to Lecture 5, Homeworks 1–2, and Programming Assignment 1. The questions will be of similar or lower difficulty than those on the homeworks and programming assignments. It is a closed-book exam; aid sheets are not allowed.
The format will be similar to the past midterms from CSC321 and CSC421.
Suggested readings are included to help you understand the course material. They are not required, i.e. you are only responsible for the material covered in lecture. Most of the suggested readings listed are more advanced than the corresponding lecture; they are of interest if you want to know where our knowledge comes from or to follow the current frontiers of research.
| Session | Date | Topic | Materials | Suggested Readings |
|---------|------|-------|-----------|--------------------|
| Lecture 1 | Jan 7/9 | Introduction & Linear Models | Slides | Roger Grosse’s notes: Linear Regression, Linear Classifiers, Training a Classifier |
| Lecture 2 | Jan 14/16 | Multilayer Perceptrons & Backpropagation | Slides | Roger Grosse’s notes: Multilayer Perceptrons, Backpropagation |
| Tutorial 1 | Jan 14 | Multivariable Calculus Review | ipynb | Python notebook: ipynb; you may view the notebook via Colab. |
| Lecture 3 | Jan 21/23 | Automatic Differentiation & Distributed Representations | Slides | Roger Grosse’s notes: Automatic Differentiation, Distributed Representations |
| Tutorial 2 | Jan 21 | Autograd and PyTorch | ipynb | |
| Lecture 4 | Jan 28/30 | Optimization | Slides | Roger Grosse’s notes: Optimization. Ian Goodfellow’s book: Chapter 8. Related papers: The effect of batch size, Comparison of optimizers for deep learning. |
| Tutorial 3 | Jan 28 | How to Train Neural Networks | slides, ipynb | |
| Lecture 5 | Feb 4/6 | Convolutional Neural Networks and Image Classification | Slides | Roger Grosse’s notes: ConvNets, Image Classification. Related papers: Yann LeCun’s 1998 LeNet, AlexNet. |
| Tutorial 4 | Feb 4 | Convolutional Neural Networks | ipynb | |
| Lecture 6 | Feb 11/13 | Interpretability | Slides | Related papers: Sanity Checks for Saliency Maps, SmoothGrad, Towards a rigorous science of interpretable machine learning. |
| Tutorial 5 | Feb 11 | Midterm review | Slides | |
| Midterm | Feb 14 | Midterm exam, 6–8 pm | | See above for the details and past midterms. |
| Lecture 7 | Feb 25 | Generalization and Recurrent Neural Networks | Slides | Roger Grosse’s notes: Generalization, RNNs, Exploding Vanishing Gradients. Related papers: Dropout, LSTM, ResNet. |
| Tutorial 6 | Feb 25 | Best Practices of ConvNet Applications | Slides | |
| Lecture 8 | Mar 3/5 | Attention and Transformers | Slides | Related papers: Neural machine translation, Show, attend and tell, Transformers, BERT pre-training. |
| Tutorial 7 | Mar 3 | Recurrent Neural Networks | Slides | |
| Tutorial 8 | Mar 10 | NLP and Transformers | Slides | |
| Lecture 9 | Mar 17/19 | Autoregressive Models & Generative Adversarial Networks | Slides | Related papers: PixelRNNs, WaveNet, PixelCNNs, Generative adversarial networks, CycleGANs. |
| Tutorial 9 | Mar 17 | Information Theory | Slides | |
| Lecture 10 | Mar 24/26 | Generative Models & Reinforcement Learning | Slides, VAEs (optional) | Related papers: RealNVP, Variational Auto-encoder, Policy Gradients for Robotics, Proximal Policy Optimization Algorithms. |
| Tutorial 10 | Mar 24 | Generative Adversarial Networks | Slides, ipynb | |
| Lecture 11 | Mar 31/Apr 2 | Q-learning & the Game of Go | Slides | Related papers: Deep Q Network, AlphaGo, AlphaZero. |
| Tutorial 11 | Mar 31 | Policy Gradient and Reinforcement Learning | Slides, ipynb | |
| Category | Resource | Description |
|----------|----------|-------------|
| Related Textbooks | Deep Learning (Goodfellow et al., 2016) | The Deep Learning textbook is a resource intended to help students and practitioners enter the field of machine learning in general and deep learning in particular. |
| | Information Theory, Inference, and Learning Algorithms (MacKay, 2003) | A good introductory textbook that combines information theory and machine learning. |
| General Framework | PyTorch | An open-source deep learning platform that provides a seamless path from research prototyping to production deployment. |
| Computation Platform | Colab | Colaboratory is a free Jupyter notebook environment that requires no setup and runs entirely in the cloud. |
| | GCE | Google Compute Engine delivers virtual machines running in Google’s data centers and worldwide fiber network. |
| | AWS-EC2 | Amazon Elastic Compute Cloud (EC2) forms a central part of Amazon’s cloud-computing platform, Amazon Web Services (AWS), by allowing users to rent virtual computers on which to run their own applications. |