CS 4375: Introduction to Machine Learning

Fall 2019

Course Info

Introduction to machine learning theory and practice.

Where: ECSS 2.203
When: MW, 10:00am-11:15am

Instructor: Nicholas Ruozzi
Office Hours: W 11:30am-12:30pm, M 12:30pm-1:30pm, and by appointment, in ECSS 3.409

TA: Hailiang Dong
Office Hours: T 10:30am-12:00pm, R 2:00pm-3:30pm in ECSS 2.104A1 (located inside the open lab)

Grading: problem sets (50%), midterm (20%), final (30%)
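
To make the weighting above concrete, here is a minimal sketch (in Python, with assumed variable names and an assumed 0-100 score scale) of how the three components would combine into an overall score; it is an illustration, not an official grade calculator.

    # Sketch only: combines the course components using the weights above
    # (problem sets 50%, midterm 20%, final 30%).
    # Variable names and the 0-100 scale are assumptions for illustration.
    def course_score(problem_set_avg, midterm, final):
        return 0.50 * problem_set_avg + 0.20 * midterm + 0.30 * final

    # Example: problem-set average 85, midterm 78, final 90 -> 85.1
    print(course_score(85, 78, 90))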

Prerequisites: CS 3345 and CS 3341 are required. Familiarity with programming, basic probability, algorithms, multivariable calculus, and linear algebra is also assumed.

Schedule & Lecture Slides

Week 1 (Aug. 19 & 21)
  Topics: Introduction & Regression; Perceptron
  Readings: Bishop, Ch. 1; Sections 1 & 2 of Andrew Ng's Lecture Notes

Week 2 (Aug. 26 & 28)
  Topics: Support Vector Machines; Duality & Kernel Methods
  Readings: Lecture Notes by Andrew Ng; Bishop, Ch. 7 (for a different perspective); Barber, Ch. 17.5; Boyd, Ch. 5

Week 3 (Sept. 5)
  Topic: More Duality and Kernel Methods

Week 4 (Sept. 9 & 11)
  Topics: Support Vector Machines with Slack; Decision Trees
  Readings: Mitchell, Ch. 3; Bishop, Ch. 7.1 (SVMs & SVMs with slack); Bishop, Ch. 14.4; Online Probability Notes

Week 5 (Sept. 16 & 18)
  Topics: k-Nearest Neighbor; Learning Theory & PAC Bounds
  Readings: Ng, PAC Learning Notes

Week 6 (Sept. 23 & 25)
  Topics: VC Dimension & Bias/Variance Trade-off
  Readings: Bishop, Ch. 14

Week 7 (Sept. 30 & Oct. 2)
  Topics: Bias/Variance Trade-off & Bagging; Midterm (in class Oct. 2)
  Readings: Hastie et al., Ch. 8.7 & Ch. 15

Week 8 (Oct. 7 & 9)
  Topics: Boosting; Clustering
  Readings: Short Intro. to AdaBoost; Hastie et al., Ch. 14.3.6, 14.3.8, 14.3.9, 14.3.12; Bishop, Ch. 9.1

Week 9 (Oct. 14 & 16)
  Topics: Hierarchical Clustering; PCA
  Readings: Hastie et al., Ch. 14.3.6, 14.3.8, 14.3.9, 14.3.12; PCA Notes

Week 10 (Oct. 21 & 23)
  Topics: Bayesian Methods; Naive Bayes
  Readings: Bishop, Ch. 1.5, 4.2-4.3.4; Bishop, Ch. 2-2.3.4; Bishop, Ch. 8.4.1, 9.2-9.4

Week 11 (Oct. 28 & 30)
  Topic: Logistic Regression

Week 12 (Nov. 4 & 6)
  Topic: Gaussian Mixture Models
  Readings: Mixture Model Notes (1) (2)

Week 13 (Nov. 11 & 13)
  Topic: Neural Networks
  Readings: Nielsen, Ch. 1-3; Bishop, Ch. 8.1, 5.1-5.3; ConvNetJS Demos

Week 14 (Nov. 18 & 20)
  Topics: More Neural Networks; Reinforcement Learning
  Readings: Nielsen, Ch. 3; Reinforcement Learning Notes; RL Book

Week 15 (Dec. 2 & 4)
  Topics: Practical ML Advice; Review

Problem Sets

All problem sets will be posted on eLearning and must be submitted there. See the homework guidelines below for the relevant policies.

Textbooks & References

There is no required textbook, but the following books, referenced in the readings above, may serve as useful references for different parts of the course.
  Christopher Bishop, Pattern Recognition and Machine Learning
  Tom Mitchell, Machine Learning
  Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning
  David Barber, Bayesian Reasoning and Machine Learning
  Stephen Boyd and Lieven Vandenberghe, Convex Optimization
  Michael Nielsen, Neural Networks and Deep Learning

Exams

All exams will be closed book and closed notes.
Midterm: October 2, in class
Final: December 9, 11:00am-1:45pm, ECSS 2.203

Homework Guidelines*

I expect you to try solving each problem set on your own. However, if you get stuck on a problem, I encourage you to collaborate with other students in the class, subject to the following rules:
  1. You may discuss a problem with any student in this class and work together on solving it. This can involve brainstorming, verbally discussing the problem, and working through possible solutions together, but it should not involve one student telling another a complete solution.
  2. Once you solve the homework, you must write up your solutions on your own, without looking at other people's write-ups or giving your write-up to others.
  3. In your solution to each problem, you must write down the name of each person with whom you discussed it. This will not affect your grade.
  4. Do not consult solution manuals or other people's solutions from similar courses; instead, ask the course staff, as we are here to help!
Late homework will NOT be accepted except in extreme circumstances or those permitted by university policy (e.g., a religious holiday). All such exceptions MUST be cleared in advance of the due date.

UT Dallas Course Policies and Procedures

For a complete list of UTD policies and procedures, see here.


*Adapted from David Sontag.