Spring 2023 - CMPT 410 E100

Machine Learning (3)

Class Number: 7908

Delivery Method: In Person


  • Course Times + Location:

    Jan 4 – Apr 11, 2023: Wed, 5:30–6:20 p.m.

    Jan 4 – Apr 11, 2023: Fri, 4:30–6:20 p.m.

  • Exam Times + Location:

    Apr 16, 2023
    Sun, 7:00–10:00 p.m.

  • Prerequisites:

    CMPT 310 and MACM 316, both with a minimum grade of C-.



Machine Learning (ML) is the study of computer algorithms that improve automatically through experience. This course introduces students to the theory and practice of machine learning, covering mathematical foundations; models such as (generalized) linear models, kernel methods, and neural networks; loss functions for classification and regression; and optimization methods. Students with credit for CMPT 419 under the title "Machine Learning" may not take this course for further credit.


Machine learning is the study of computer algorithms that improve automatically through experience; such algorithms play an increasingly important role in artificial intelligence, computer science, and beyond. The goal of this course is to introduce students to machine learning, starting from the foundations and gradually building up to modern techniques. Students will learn about the theoretical underpinnings, modern applications, and software tools for applying deep learning. This course is intended as an introduction for students interested in conducting research in machine learning or applying it in practice, and should prepare students for more advanced courses such as CMPT 727 and CMPT 728. No previous knowledge of machine learning is assumed, but students are expected to have a solid background in calculus, linear algebra, probability, and programming in Python.


  • Mathematical foundations: review of linear algebra, multivariate calculus and probability
  • (Generalized) linear models: linear regression, ridge regression, logistic regression
  • Non-linear models: support vector machines, neural networks, k-nearest neighbours
  • Regression, binary classification, multinomial classification
  • Optimization: gradient descent, stochastic gradient descent, Lagrangian duality
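To give a flavour of how the topics above fit together, here is a minimal illustrative sketch (not course material, and all names and hyperparameters are chosen for illustration only) of fitting a linear regression model with batch gradient descent using NumPy:

```python
import numpy as np

def gradient_descent(X, y, lr=0.1, steps=500):
    """Minimize mean squared error for linear regression y ≈ X @ w
    with batch gradient descent. lr and steps are illustrative choices."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        # Gradient of (1/n) * ||X @ w - y||^2 with respect to w
        grad = (2.0 / n) * X.T @ (X @ w - y)
        w -= lr * grad
    return w

# Noiseless synthetic data: the fitted weights should recover true_w.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([3.0, -1.5])
y = X @ true_w
w = gradient_descent(X, y)
```

Stochastic gradient descent, also listed above, replaces the full-batch gradient with a gradient computed on a small random subset of the data at each step.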



The course grade will be based on homework assignments and an exam.



Reference Books:
Machine Learning: A Probabilistic Perspective, Kevin P. Murphy, MIT Press, 2012, 9780262018029

The Elements of Statistical Learning, Trevor Hastie, Robert Tibshirani, and Jerome Friedman, Springer-Verlag, 2009, 9780387848570

All of Statistics, Larry Wasserman, Springer, 2010, 9781441923226

Pattern Recognition and Machine Learning, Christopher M. Bishop, Springer, 2006, 9780387310732

Machine Learning, Tom Mitchell, McGraw Hill, 1997, 9780070428072


Your personalized Course Material list, including digital and physical textbooks, is available through the SFU Bookstore website by simply entering your Computing ID at: shop.sfu.ca/course-materials/my-personalized-course-materials.

Registrar Notes:


SFU’s Academic Integrity website http://www.sfu.ca/students/academicintegrity.html is filled with information on what is meant by academic dishonesty, where you can find resources to help with your studies, and the consequences of cheating. Check out the site for more information and videos that help explain the issues in plain English.

Each student is responsible for his or her conduct as it affects the university community. Academic dishonesty, in whatever form, is ultimately destructive of the values of the university. Furthermore, it is unfair and discouraging to the majority of students who pursue their studies honestly. Scholarly integrity is required of all members of the university. http://www.sfu.ca/policies/gazette/student/s10-01.html