Fall 2019 - CMPT 726 G100
Machine Learning (3)
Class Number: 9019
Delivery Method: In Person
Machine Learning is the study of computer algorithms that improve automatically through experience. Provides students who conduct research in machine learning, or use it in their research, with a grounding in both the theoretical justification for, and practical application of, machine learning algorithms. Covers techniques in supervised and unsupervised learning, the graphical model formalism, and algorithms for combining models. Students who have taken CMPT 882 (Machine Learning) in 2007 or earlier may not take CMPT 726 for further credit.
Machine Learning is the study of computer algorithms that improve automatically through experience. Machine learning algorithms play an important role in industrial applications and commercial data analysis. The goal of this course is to present students with both the theoretical justification for, and practical application of, machine learning algorithms. Students in the course will gain hands-on experience with major machine learning tools and their applications to real-world data sets. This course will cover techniques in supervised and unsupervised learning, neural networks / deep learning, the graphical model formalism, and algorithms for combining models. This course is intended for graduate students who are interested in machine learning or who conduct research in fields that use machine learning, such as computer vision, natural language processing, data mining, bioinformatics, and robotics. No previous knowledge of pattern recognition or machine learning concepts is assumed, but students are expected to have, or obtain, background knowledge in mathematics and statistics.
- Graphical models: directed and undirected graphs
- Inference algorithms: junction tree, belief propagation, variational inference, Markov chain Monte Carlo, Gibbs sampling
- Temporal models and algorithms: hidden Markov models, Kalman filtering, particle filtering
- Classification: nearest neighbour, support vector machines, decision trees, naive Bayes, Fisher's linear discriminant
- Regression: linear regression, logistic regression, regularization
- Unsupervised learning: spectral clustering, k-means
- Deep learning
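As a small taste of the unsupervised-learning material listed above, the following is a minimal, illustrative sketch of k-means clustering (Lloyd's algorithm) in plain Python. The data and function here are hypothetical examples for orientation only, not course materials.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Toy k-means (Lloyd's algorithm) on a list of 1-D points."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize centers from the data
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda j: (p - centers[j]) ** 2)
            clusters[j].append(p)
        # Update step: recompute each center as its cluster's mean
        # (keeping the old center if a cluster happens to be empty).
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

# Two obvious groups of 1-D points; the centers converge near 1.0 and 9.0.
data = [1.0, 1.2, 0.8, 9.0, 9.3, 8.7]
print(kmeans(data, 2))
```

The course treats this and the other listed algorithms with full mathematical development; this sketch only shows the alternating assign/update structure.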
The course grade will be based on homework assignments, a project, and an exam.
MATERIALS + SUPPLIES:
- The Elements of Statistical Learning, Trevor Hastie, Robert Tibshirani, and Jerome Friedman, Springer-Verlag, 2009, 9780387848570
- Machine Learning, Tom Mitchell, McGraw Hill, 1997, 9780070428072
- Pattern Classification, 2nd Edition, Richard O. Duda, Peter E. Hart, and David G. Stork, Wiley Interscience, 2000, 9780471056690
- All of Statistics, Larry Wasserman, Springer, 2010, 9781441923226
- Pattern Recognition and Machine Learning, Christopher M. Bishop, Springer, 2006, 9780387310732
Graduate Studies Notes:
Important dates and deadlines for graduate students are found here: http://www.sfu.ca/dean-gradstudies/current/important_dates/guidelines.html. The deadline to drop a course with a 100% refund is the end of week 2. The deadline to drop with no notation on your transcript is the end of week 3.
SFU’s Academic Integrity web site http://www.sfu.ca/students/academicintegrity.html is filled with information on what is meant by academic dishonesty, where you can find resources to help with your studies, and the consequences of cheating. Check out the site for more information and videos that help explain the issues in plain English.
Each student is responsible for his or her conduct as it affects the University community. Academic dishonesty, in whatever form, is ultimately destructive of the values of the University. Furthermore, it is unfair and discouraging to the majority of students who pursue their studies honestly. Scholarly integrity is required of all members of the University. http://www.sfu.ca/policies/gazette/student/s10-01.html