CAP5610 Machine Learning
Term: Spring 2017
Time: Th 6-8:50pm
Room: Rm726, Office of Continuing Education, Innovative Center (3280 Progress Drive, Suite 700, Orlando, FL 32826)
Instructor: Guo-Jun Qi, email@example.com
Prerequisites: calculus, linear algebra, and probability theory
In this course, we will begin with the basic concepts of machine learning in the context of several classic topics, from supervised learning (classification) and unsupervised learning (model fitting, clustering) to feature learning and dimensionality reduction. We will also discuss advanced topics on learning theory, graphical models, and dynamic Bayesian models if time allows.
Required textbook: Pattern Recognition and Machine Learning (PRML), C. Bishop, Springer, 2006.
Other suggested books:
Machine Learning, T. Mitchell, McGraw-Hill, 1997.
The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd Edition, T. Hastie, R. Tibshirani, J. Friedman, Springer, 2009.
Probabilistic Graphical Models: Principles and Techniques, D. Koller, and N. Friedman, The MIT Press, 2009.
Covered topics (subject to change):
1. Introduction to machine learning
2. Naive Bayesian classifier
3. Linear regression and classification
4. Support vector machine and kernel method
5. Neural Networks and back propagation
6. Unsupervised learning problems: clustering, PCA, LDA, CCA etc.
7. Model fitting and the EM algorithm
8. Graphical models: Bayesian networks, Markov random fields, approximate inference, variational methods, sampling, loopy belief propagation
9. Advanced topics: matrix factorization, metric learning, latent models, online learning, active learning, sparse coding, nonparametric Bayesian models, etc. (we will cover as many as time allows)
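As a small taste of topic 2 above, here is a minimal Gaussian naive Bayes classifier sketched in plain Python (one of the languages permitted for the machine problems). This is an illustrative sketch only, not course-provided code; the function names, the toy data, and the variance floor are all choices made for this example:

```python
import math

def fit_gaussian_nb(X, y):
    """Estimate per-class feature means, variances, and class priors."""
    classes = sorted(set(y))
    params = {}
    for c in classes:
        rows = [x for x, label in zip(X, y) if label == c]
        n, dims = len(rows), len(rows[0])
        means = [sum(r[d] for r in rows) / n for d in range(dims)]
        # Small variance floor avoids division by zero on constant features.
        variances = [max(sum((r[d] - means[d]) ** 2 for r in rows) / n, 1e-9)
                     for d in range(dims)]
        params[c] = (means, variances, n / len(X))
    return params

def predict(params, x):
    """Return the class maximizing log prior + sum of log Gaussian likelihoods."""
    best, best_score = None, -math.inf
    for c, (means, variances, prior) in params.items():
        score = math.log(prior)
        for xd, m, v in zip(x, means, variances):
            score += -0.5 * (math.log(2 * math.pi * v) + (xd - m) ** 2 / v)
        if score > best_score:
            best, best_score = c, score
    return best
```

Working in log space, as above, avoids numerical underflow when many features are multiplied together; this is the standard trick you will see throughout the probabilistic methods in PRML.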
Homework and machine problems will account for 30% of the grade. There will be two exams (50%) and a final project (20%). There is no final exam.
Students may complete the final project individually or form groups of up to three people. A final project can be a survey (paper review) of a machine learning topic, or a proposed project to solve a machine learning problem. The instructor will also suggest some topics for the project. A project proposal shall be submitted to the instructor for review before it is approved. The proposal must specify who will be participating in the project, the role of each member, and a description of the proposed survey topic or problem. The project report will be graded. Project presentations will be arranged if time allows.
The course is not intended to teach a programming language, so students may use any language to solve the machine problems (e.g., C, C++, Java, Matlab, Python, Octave). You are required to submit a report, which shall include a brief description of how you implemented the algorithm in the language you chose, the parameter settings, your test protocol, and the results you obtained. Source code shall be submitted along with the report. Submit reports and source code to firstname.lastname@example.org.
· January 12, Lecture 01: Introduction to Machine Learning [ pdf ]
Reading assignment: Sec 1.2, Sec 1.4 of PRML.
· January 19, Lecture 02: Review of Probability Theory [ pdf ]
Reading assignment: Chap 2 and Appendix B of PRML.
Reading assignment: Sec 1.5 of PRML.
· February 2, Lecture 04: Logistic Regression [ pdf ]
Reading assignment: Chapter 4 of PRML.
· February 9, Lecture 05: Linear Regression [ pdf ]
Reading assignment: Chapter 3 of PRML.
· February 16, Lecture 06: Support Vector Machines [ pdf ]
· February 23, Lecture 07: Support Vector Machines II [ pdf ]
· March 2, Lecture 08: Neural Networks [ pdf ]
· March 9, Lecture 09: Deep Learning [ pdf ]
· April 6, Lecture 10: Dimensionality Reduction [ pdf ]
· April 20, Lecture 12: Boosting [ pdf ]