CS 7301: Advanced Machine Learning
Fall 2016

Course Info
Where: ECSS 2.203
When: MW, 11:30am-12:45pm
Instructor: Nicholas Ruozzi
Office Hours: Tuesday 10am-11am and by appointment in ECSS 3.409
TA: Dhruv Patel
Office Hours: Tuesday & Thursday 4pm-5pm in the open lab ECSS 2.104.A1
Grading: problem sets (50%), midterm (20%), final (30%)
Attendance is MANDATORY. The instructor reserves the right to lower final grades as a result of poor attendance.
Prerequisites: some familiarity with basic probability, algorithms, multivariable calculus, and linear algebra.
Schedule & Lecture Slides
Week | Dates | Topic | Readings |
1 | Aug. 21 & 23 | Introduction & Regression; Perceptron | Bishop, Ch. 1 |
2 | Aug. 29 & Aug. 31 | Support Vector Machines; Duality & Kernel Methods | Lecture Notes by Andrew Ng; Bishop, Ch. 7 for a different perspective; Barber, Ch. 17.5; Boyd, Ch. 5 |
3 | Sept. 7 | Support Vector Machines with Slack | Bishop, Ch. 7.1; SVMs & SVMs with slack |
4 | Sept. 14 & 16 | Decision Trees; k-Nearest Neighbor | Mitchell, Ch. 3; Bishop, Ch. 14.4 |
5 | Sept. 19 & 21 | Learning Theory & PAC Bounds; VC Dimension & Bias/Variance Trade-off | Ng, PAC Learning Notes |
6 | Sept. 26 & 28 | Bias/Variance Trade-off & Bagging; Boosting | Bishop, Ch. 14; Hastie et al., Ch. 8.7 & Ch. 15; Short Intro. to AdaBoost |
7 | Oct. 5 & 7 | Clustering; PCA | Hastie et al., Ch. 14.3.6, 14.3.8, 14.3.9, 14.3.12; Bishop, Ch. 9.1; PCA Notes |
8 | Oct. 10 & 12 | Midterm (in class Oct. 12) | More PCA |
9 | Oct. 17 & 19 | Bayesian Methods; Naive Bayes | Bishop, 1.5, 4.2-4.3.4; Bishop, 2-2.3.4 |
10 | Oct. 24 & 26 | Logistic Regression; Gaussian Mixture Models | Bishop, 8.4.1, 9.2, 9.3, 9.4; Other Mixture Model Notes (1) (2) |
11 | Nov. 2 & 4 | Hidden Markov Models | Bishop, 13.1-2; Other HMM Notes |
12 | Nov. 7 & 9 | Bayesian Networks; LDA | LDA Survey Article; Intro. to Bayesian Networks; Bishop, 8.1 |
13 | Nov. 14 & 16 | Collaborative Filtering | |
14 | Nov. 28 & 30 | Neural Networks | Nielsen, Ch. 1-3; Bishop, 5.1-5.3; ConvNetJS Demos |
15 | Dec. 5 & 7 | Reinforcement Learning; Review | Sample Exams [1] [2] |
Problem Sets
All problem sets will be posted on the eLearning site and are to be turned in there. See the homework guidelines below for the homework policies.
Textbooks & References
The following textbooks will be used for the course.
- Pattern Recognition and Machine Learning by Christopher M. Bishop
- Bayesian Reasoning and Machine Learning by David Barber (free online)
- Machine Learning by Tom Mitchell
- Machine Learning: a Probabilistic Perspective by Kevin Murphy
Exams
All exams will be closed book and closed notes.
Midterm: October 10, in class.
Final: Dec. 14th, 11:00am
Homework Guidelines*
I expect you to try solving each problem set on your own. However, if you get stuck on a problem, I encourage you to collaborate with other students in the class, subject to the following rules:
- You may discuss a problem with any student in this class and work together on solving it. This can involve brainstorming, verbally discussing the problem, and working through possible solutions together, but it should not involve one student telling another a complete solution.
- Once you solve the homework, you must write up your solutions on your own, without looking at other people's write-ups or giving your write-up to others.
- In your solution for each problem, you must write down the name of every person with whom you discussed it. This will not affect your grade.
- Do not consult solution manuals or other people's solutions from similar courses - ask the course staff, we are here to help!
*adapted from David Sontag