CS 6347: Statistical Methods in AI and ML
Spring 2016

Course Info
Where: ECSN Building 2.120
When: MW, 11:30am-12:45pm
Instructor: Nicholas Ruozzi
Office Hours: Tuesday 11am-12pm and by appointment in ECSS 3.409
TA: Baoye Xue
Office Hours: MW: 5:00pm-6:00pm at Clark Center CN 1.202D
Grading: problem sets (70%), final project (25%), class participation & extra credit (5%)
Attendance is MANDATORY. The instructor reserves the right to lower final grades as a result of poor attendance.
Prerequisites: some familiarity with basic probability, linear algebra, and introductory machine learning (the latter is helpful, but not required).
Schedule & Lecture Slides
Week | Dates | Topic | Readings |
1 | Jan. 11 & 13 | Introduction & Basic Probability; Bayesian Networks | K&F: Ch. 1 & 2 |
2 | Jan. 20 | More BNs: D-separation | K&F: Ch. 3; Octave (free version of MATLAB) |
3 | Jan. 25 & 27 | Markov Random Fields; Variable Elimination & BP | K&F: Ch. 4 & Ch. 9; Darwiche: Ch. 9 |
4 | Feb. 1 & 3 | Approx. MAP Estimation; MAP LP | K&F: 13.1-13.5, A.5.3; Boyd: Ch. 5.1-5.5; Lecture Notes |
5 | Feb. 8 & 10 | Variational Methods | K&F: 11.1-11.2, 11.5; Sections 1-3 of this paper |
6 | Feb. 15 & 17 | Intro to Sampling; Markov Chain Monte Carlo | K&F: 12.1-12.3 |
7 | Feb. 22 & 24 | Intro to Machine Learning; Maximum Likelihood for Bayesian Networks | K&F: 17.1-17.4 |
8 | Feb. 29 & Mar. 2 | MLE for Log-Linear Models | K&F: 20.1-20.5 |
9 | Mar. 7 & Mar. 9 | Alternatives to MLE; Approximate MLE | |
10 | Mar. 21 & Mar. 23 | Expectation Maximization; Hidden Markov Models | K&F: 19.1-19.2; Box 17.E |
11 | Mar. 28 & Mar. 30 | Bayesian Network Structure Learning; LDA | K&F: 20.6 |
12 | Apr. 4 & Apr. 6 | Exponential Families and EP | |
13 | Apr. 11 & Apr. 13 | Neural Networks | Nielsen: Ch. 1 |
Problem Sets
All problem sets will be posted on the eLearning site and are to be submitted there. See the homework guidelines below for the homework policies.
Textbooks & References
The following textbooks are suggested:
- Probabilistic Graphical Models: Principles and Techniques, by Daphne Koller and Nir Friedman.
- Modeling and Reasoning with Bayesian Networks, by Adnan Darwiche.
- Machine Learning: a Probabilistic Perspective, by Kevin Murphy.
Homework Guidelines*
We expect you to try solving each problem set on your own. However, if you get stuck on a problem, we encourage you to collaborate with other students in the class, subject to the following rules:
- You may discuss a problem with any student in this class and work together on solving it. This can involve brainstorming, verbally discussing the problem, and going through possible solutions together, but it should not involve one student telling another a complete solution.
- Once you have solved a problem, you must write up your solution on your own, without looking at other people's write-ups or giving your write-up to others.
- In your solution to each problem, you must write down the names of every person with whom you discussed it. This will not affect your grade.
- Do not consult solution manuals or other people's solutions from similar courses. Instead, ask the course staff; we are here to help!
*Adapted from David Sontag.