Spring 2020 - CMPT 980 G200

Special Topics in Computing Science (3)

Introduction to Deep Learning

Class Number: 6773

Delivery Method: In Person

Overview

  • Course Times + Location:

    Jan 6 – Apr 9, 2020: Mon, Wed, 3:30–4:50 p.m.
    Burnaby

  • Prerequisites:

    Instructor discretion.

Description

CALENDAR DESCRIPTION:

This course aims to give students exposure to important emerging areas of computing science.

COURSE DETAILS:

Machine learning has become the main framework for building programs that perform intelligent tasks. In fields such as computer vision and natural language processing, many recent successes have been achieved using neural nets with several layers, known as deep neural nets. This course is an introduction to deep neural nets, techniques for training them from data, and significant applications. While general machine learning is not a prerequisite, the course will be difficult for students without sufficient preparation. The main learning outcomes are (1) for students to have sufficient practical experience with deep learning to apply current techniques to real-life problems, and (2) for students to have sufficient theoretical understanding of deep neural nets to analyze and improve their performance.

COURSE-LEVEL EDUCATIONAL GOALS:

  • Training feedforward neural nets (backpropagation)
  • Advanced training topics, including: dropout, batch normalization, step-size adaptation, hyperparameter tuning
  • Common architectures and their applications: convolutional neural networks, recurrent neural networks
  • Embeddings (skip-gram models, graph neural networks)
  • Generative models: generative adversarial models, variational auto-encoders
  • Comparison of neural networks with other machine learning approaches (linear classifiers, kernel methods)
  • Adversarial attacks against neural networks
  • Interpreting neural networks
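
To give a flavour of the first goal above, here is a minimal illustrative sketch (not official course material) of training a feedforward net with manually derived backpropagation on the XOR problem, using NumPy; the architecture and hyperparameters are arbitrary choices for illustration:

```python
import numpy as np

# Illustrative sketch only: a tiny one-hidden-layer net trained on XOR
# with hand-derived backpropagation (chain rule), no autograd library.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Hidden layer of 8 tanh units, sigmoid output (sizes chosen arbitrarily).
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)          # hidden activations, shape (4, 8)
    p = sigmoid(h @ W2 + b2)          # predictions, shape (4, 1)
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

    # Backward pass. For sigmoid output + cross-entropy loss, the
    # gradient w.r.t. the output pre-activation simplifies to (p - y).
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T               # error propagated to hidden layer
    dz1 = dh * (1 - h ** 2)           # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)

    # Plain gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The later topics in the list (dropout, batch normalization, adaptive step sizes) modify exactly this forward/backward/update loop.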

Grading

  • Assignments 30%
  • Exercises 10%
  • Quizzes 10%
  • Midterm Exam 20%
  • Final Exam 30%

NOTES:

Grading will be based on written assignments (3-5), homework exercises (3-5), quizzes, a midterm exam, and a final exam, weighted as listed above. The main component of the assignments will be applying neural networks to datasets.
Students must attain an overall passing grade on the weighted average of exams in the course in order to obtain a clear pass (C- or better).

Materials

MATERIALS + SUPPLIES:

  • Introduction to Deep Learning, Eugene Charniak, MIT Press, 2018, 9780262039512
  • Deep Learning, Goodfellow, Bengio, and Courville, MIT Press, 2016, 9780262035613, Available on-line at http://www.deeplearningbook.org

REQUIRED READING:

  • Introduction to Deep Learning, Eugene Charniak, MIT Press, 2018, 9780262039512

RECOMMENDED READING:

  • Deep Learning, Goodfellow, Bengio, and Courville, MIT Press, 2016, 9780262035613, Available on-line at http://www.deeplearningbook.org

Graduate Studies Notes:

Important dates and deadlines for graduate students are found here: http://www.sfu.ca/dean-gradstudies/current/important_dates/guidelines.html. The deadline to drop a course with a 100% refund is the end of week 2. The deadline to drop with no notation on your transcript is the end of week 3.

Registrar Notes:

SFU’s Academic Integrity web site http://www.sfu.ca/students/academicintegrity.html is filled with information on what is meant by academic dishonesty, where you can find resources to help with your studies, and the consequences of cheating. Check out the site for more information and videos that help explain the issues in plain English.

Each student is responsible for his or her conduct as it affects the University community.  Academic dishonesty, in whatever form, is ultimately destructive of the values of the University. Furthermore, it is unfair and discouraging to the majority of students who pursue their studies honestly. Scholarly integrity is required of all members of the University. http://www.sfu.ca/policies/gazette/student/s10-01.html

ACADEMIC INTEGRITY: YOUR WORK, YOUR SUCCESS