Spring 2023 - CMPT 726 G200
Machine Learning (3)
Class Number: 7907
Delivery Method: In Person
Course Times + Location:
We 5:30 PM – 6:20 PM
AQ 3150, Burnaby
Fr 4:30 PM – 6:20 PM
AQ 3150, Burnaby
Exam Times + Location:
Apr 16, 2023
7:00 PM – 10:00 PM
WMC 3260, Burnaby
Machine Learning is the study of computer algorithms that improve automatically through experience. This course provides students who conduct research in machine learning, or who use it in their research, with a grounding in both the theoretical justification for, and practical application of, machine learning algorithms. It covers techniques in supervised and unsupervised learning, the graphical model formalism, and algorithms for combining models. Students who have taken CMPT 882 (Machine Learning) in 2007 or earlier may not take CMPT 726 for further credit.
Machine learning is the study of computer algorithms that improve automatically through experience; such algorithms play an increasingly important role in artificial intelligence, computer science and beyond. The goal of this course is to introduce students to machine learning, starting from the foundations and gradually building up to modern techniques. Students in the course will learn about the theoretical underpinnings, modern applications and software tools for applying deep learning. This course is intended to be an introductory course for students interested in conducting research in machine learning or applying machine learning, and should prepare students for more advanced courses, such as CMPT 727 and CMPT 728. No previous knowledge of machine learning is assumed, but students are expected to have a solid background in calculus, linear algebra, probability and programming using Python.
- Mathematical foundations: review of linear algebra, multivariate calculus and probability
- (Generalized) linear models: linear regression, ridge regression, logistic regression
- Non-linear models: support vector machines, neural networks, k-nearest neighbours
- Regression, binary classification, multinomial classification
- Optimization: gradient descent, stochastic gradient descent, Lagrangian duality
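To give a flavour of the topics above, here is a minimal illustrative sketch (not course material) of gradient descent applied to linear regression with a mean-squared-error loss; the synthetic data and learning rate are chosen for illustration only:

```python
import numpy as np

# Synthetic data: y = X @ true_w, no noise (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w

# Gradient descent on the mean squared error (1/n) * ||X w - y||^2.
w = np.zeros(2)
lr = 0.1  # step size; chosen for this toy problem
for _ in range(500):
    grad = (2.0 / len(y)) * X.T @ (X @ w - y)  # gradient of MSE w.r.t. w
    w -= lr * grad

print(w)  # approaches true_w
```

Stochastic gradient descent, also listed above, replaces the full-data gradient with one computed on a small random subset (mini-batch) at each step.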
The course grade will be based on homework assignments and an exam.
MATERIALS + SUPPLIES:
Machine Learning: A Probabilistic Perspective, Kevin P. Murphy, MIT Press, 2012, 9780262018029
The Elements of Statistical Learning, Trevor Hastie, Robert Tibshirani, and Jerome Friedman, Springer-Verlag, 2009, 9780387848570
All of Statistics, Larry Wasserman, Springer, 2010, 9781441923226
Pattern Recognition and Machine Learning, Christopher M. Bishop, Springer, 2006, 9780387310732
Machine Learning, Tom Mitchell, McGraw Hill, 1997, 9780070428072
REQUIRED READING NOTES:
Your personalized Course Material list, including digital and physical textbooks, is available through the SFU Bookstore website by simply entering your Computing ID at: shop.sfu.ca/course-materials/my-personalized-course-materials.
Graduate Studies Notes:
Important dates and deadlines for graduate students are found here: http://www.sfu.ca/dean-gradstudies/current/important_dates/guidelines.html. The deadline to drop a course with a 100% refund is the end of week 2. The deadline to drop with no notation on your transcript is the end of week 3.
ACADEMIC INTEGRITY: YOUR WORK, YOUR SUCCESS
SFU’s Academic Integrity website http://www.sfu.ca/students/academicintegrity.html is filled with information on what is meant by academic dishonesty, where you can find resources to help with your studies, and the consequences of cheating. Check out the site for more information and videos that help explain the issues in plain English.
Each student is responsible for his or her conduct as it affects the university community. Academic dishonesty, in whatever form, is ultimately destructive of the values of the university. Furthermore, it is unfair and discouraging to the majority of students who pursue their studies honestly. Scholarly integrity is required of all members of the university. http://www.sfu.ca/policies/gazette/student/s10-01.html