CS 4803 / 7643 Deep Learning

Fall 2018, TR 4:30 - 5:45 pm, CCB 16

Course Information

This is an exciting time to be studying (Deep) Machine Learning, or Representation Learning, or for lack of a better term, simply Deep Learning!

Deep Learning is rapidly emerging as one of the most successful and widely applicable sets of techniques across a range of domains (vision, language, speech, reasoning, robotics, AI in general), leading to some pretty significant commercial successes and exciting new directions that may previously have seemed out of reach.

This course will introduce students to the basics of Neural Networks (NNs) and expose them to some cutting-edge research. It is structured in modules (background, Convolutional NNs, Recurrent NNs, Deep Reinforcement Learning, Deep Structured Prediction). Modules will be presented via instructor lectures and reinforced with homeworks that teach theoretical and practical aspects. The course will also include a project which will allow students to explore an area of Deep Learning that interests them in more depth.

Instructor
Dhruv Batra

Teaching Assistants
Nirbhay Modhe Office Hours Wednesdays 5pm-6pm CCB 2nd floor
Erik Wijmans Office Hours Tuesdays 12pm-1pm CCB 2nd floor
Harsh Agrawal Office Hours Mondays 3pm-4pm CCB 2nd floor
Michael Cogswell Office Hours Mondays 5pm-6pm CCB 2nd floor

Class meets
Tuesday, Thursday 4:30 - 5:45 pm, College of Computing Building Classroom 16

Staff Mailing List
cs4803-7643-f18-staff@googlegroups.com

Date Topic Optional Reading
W1: Aug 21 Class Administrivia
HW0 out Slides (pptx), Slides (pdf).
W1: Aug 23 Image Classification and k-NN
Slides, Slides (annotated).
Supervised Learning notes, k-NN notes.
W2: Aug 28 Linear Classifiers, Loss Functions.
Slides, Slides (annotated).
W2: Aug 30 Regularization, Neural Networks.
Slides, Slides (annotated).
W3: Sep 4 Optimization, Computational Flow Graphs, and Backprop.
Slides, Slides (annotated).
W3: Sep 6 Guest Lecture by Peter Anderson: Training Neural Networks 1
HW0 due 09/05 11:55pm.
W4: Sep 11 HW1 out
No class. ECCV.
W4: Sep 13 No class. ECCV.
W5: Sep 18 Forward mode vs Reverse mode Auto-diff.
Slides, Slides (annotated).
W5: Sep 20 Convolutional Neural Networks (CNNs).
W6: Sep 25 De/Up/Transposed Convolution; Convolution as Toeplitz Matrices; Dilated/Atrous Convolution
W6: Sep 27 Guest Lecture by Peter Anderson: CNN Architectures
W7: Oct 2 Segmentation and Detection CNNs (and Other Pixel-level Prediction); Different Architectures
HW1 due
HW2 out
W7: Oct 4 Visualizing CNNs
W8: Oct 9 Fall Student Recess
W8: Oct 11 Guest Lecture by Zhile Ren on 3D CNNs (PointNet, PointNet++, SPLATNet, etc.)
W9: Oct 16 Recurrent Neural Networks (RNNs)
W9: Oct 18 Recurrent Neural Networks 2 (RNNs + CNNs)
HW2 due
HW3 out
W10: Oct 23 Reinforcement Learning (RL) Background
W10: Oct 25 Deep RL
W11: Oct 30 Guest Lecture by Peter Anderson on Vision + Language + Action (CNNs + RNNs + RL)
W11: Nov 1 Deep RL
W12: Nov 6 Probabilistic Graphical Models (PGMs)
HW3 due
W12: Nov 8 Probabilistic Graphical Models (PGMs)
W13: Nov 13 No class. CVPR Deadline
W13: Nov 15 No class. CVPR Deadline
W14: Nov 20 Deep PGMs: Variational Autoencoders (VAEs)
W14: Nov 22 No class. Thanksgiving break
W15: Nov 27 Generative Adversarial Networks (GANs)
W15: Nov 29 Class Project Poster Presentations
W16: Dec 4 No class. NIPS


Grading

  • 80% Homework (4 homeworks)
  • 20% Final Project
  • 5% (potential bonus) Class Participation

Late policy for deliverables

  • No penalties for medical reasons or emergencies. Please see GT Catalog for rules about contacting the office of the Dean of Students.
  • Every student has 7 free late days (7 x 24-hour chunks) for this course.
  • After all free late days are used up, the penalty is 25% for each additional late day.
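As a rough sketch of how the late policy above accrues, here is a hypothetical helper (not official grading code; it assumes partial days count as full 24-hour chunks and interprets "25%" as a per-day deduction from the deliverable's score):

```python
import math

def late_penalty(hours_late, free_days_remaining):
    """Illustrative sketch of the late policy: each 24-hour chunk past the
    deadline consumes one free late day; once free days are exhausted,
    each additional late day costs 25% of the deliverable's score."""
    days_late = math.ceil(hours_late / 24)           # partial days count as full days
    penalized_days = max(0, days_late - free_days_remaining)
    return min(100, 25 * penalized_days)             # percent deducted, capped at 100

# e.g. 3 days late with 1 free day left -> 2 penalized days -> 50% deduction
print(late_penalty(72, 1))
```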


Prerequisites

CS 4803/7643 should not be your first exposure to machine learning. Ideally, you need:

  • Intro-level Machine Learning
    • CS 3600 for the undergraduate section and CS 7641/ISYE 6740/CSE 6740 or equivalent for the graduate section.
  • Algorithms
    • Dynamic programming, basic data structures, complexity (NP-hardness)
  • Calculus and Linear Algebra
    • positive semi-definiteness, multivariate derivatives (be prepared for lots and lots of gradients!)
  • Programming
    • This is a demanding class in terms of programming skills.
    • HWs will involve a mix of languages (Python, C++) and libraries (PyTorch).
    • The project may be done in your language of choice.
  • Ability to deal with abstract mathematical concepts
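To give a taste of the "lots and lots of gradients" mentioned above, here is a minimal, hypothetical sketch (names are illustrative, not from any HW) of the numerical gradient check you will lean on constantly when debugging backprop:

```python
def f(w):
    """A simple scalar function of two variables: (3*w0 + 4*w1)^2."""
    return (w[0] * 3.0 + w[1] * 4.0) ** 2

def numerical_grad(f, w, eps=1e-5):
    """Central-difference estimate of the gradient of f at w,
    used to sanity-check hand-derived (analytic) gradients."""
    grad = []
    for i in range(len(w)):
        w_plus = list(w);  w_plus[i] += eps
        w_minus = list(w); w_minus[i] -= eps
        grad.append((f(w_plus) - f(w_minus)) / (2 * eps))
    return grad

# Analytic gradient of (3*w0 + 4*w1)^2 is [2*s*3, 2*s*4] with s = 3*w0 + 4*w1.
w = [1.0, -2.0]                  # s = -5, so the gradient is [-30, -40]
print(numerical_grad(f, w))      # ≈ [-30.0, -40.0]
```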


FAQ

  • The class is full. Can I still get in?

    Sorry. The course admins in IC control this process. Please talk to them.

  • Can I audit this class or take it pass/fail?

    No. Due to the large demand for this class, we will not be allowing audits or pass/fail. Letter grades only. This is to make sure students who want to take the class for credit can.

  • Can I simply sit in the class (no credits)?

    In general, we welcome members of the Georgia Tech community (students, staff, and/or faculty) to sit in. Out of courtesy, we would appreciate it if you let us know beforehand (via email or in person). If the classroom is full, we would ask that you please allow registered students to attend.

  • I have a question. What is the best way to reach the course staff?

    Registered students – your first point of contact is Piazza (so that other students may benefit from your questions and our answers). If you have a personal matter, email us at the class mailing list: cs4803-7643-f18-staff@googlegroups.com.


Project

  • You are encouraged to try out interesting applications of deep learning (vision, NLP, computational biology, UAVs, etc.)!
  • The project must be done in this semester.
  • You may combine this with another course's project, but you must delineate the different parts.
  • Extra credit for shooting for a publication.
  • Main Categories:
    • Application/survey - compare a bunch of algorithms on a new application domain of your interest
    • Formulation/Development - Formulate a new model or algorithm for a new/old problem
    • Theory - Theoretically analyze an existing algorithm.

Note to people outside Georgia Tech

Feel free to use the slides and materials available online here. If you use our slides, an appropriate attribution is requested. Please email the instructor with any corrections or improvements.