Semester:  Fall 2020; also offered in Spring 2020, Spring 2018, Fall 2017, and Fall 2016
Time and place:  Tuesday and Thursday, 3.00pm-4.15pm EST
Instructor:  Jean Honorio (please email for an appointment)
TAs: 
Chuyang Ke, email: cke at purdue.edu, Office hours: Monday 10am-noon EST
Kevin Bello, email: kbellome at purdue.edu, Office hours: Friday 2pm-4pm EST
Date  Topic (Tentative)  Notes 
Tue, Aug 25  Lecture 1: perceptron (introduction)  Homework 0: due on Aug 27, 11.59pm EST  NO EXTENSION DAYS ALLOWED 
Thu, Aug 27  Lecture 2: perceptron (convergence), max-margin classifiers, support vector machines (introduction)  Homework 0 due  NO EXTENSION DAYS ALLOWED
Tue, Sep 1  Lecture 3: nonlinear feature mappings, kernels (introduction), kernel perceptron  Homework 0 solution 
Thu, Sep 3  Lecture 4: SVM with kernels, dual solution. Refs: [1] [2] (optional reading)  Homework 1: due on Sep 10, 11.59pm EST
Tue, Sep 8  Lecture 5: one-class problems (anomaly detection), one-class SVM, multi-way classification, direct multiclass SVM. Refs: [1] [2] [3] [4] (optional reading)
Thu, Sep 10  Lecture 6: rating (ordinal regression), PRank, ranking, rank SVM. Refs: [1] [2] (optional reading)  Homework 1 due
Tue, Sep 15  Lecture 7: linear and kernel regression, feature selection (information ranking, regularization, subset selection)  Homework 2: due on Sep 22, 11.59pm EST 
Thu, Sep 17  Lecture 8: ensembles and boosting  
Tue, Sep 22  Lecture 9: performance measures, cross-validation, bias-variance tradeoff, statistical hypothesis testing  Homework 2 due
Thu, Sep 24  Lecture 10: model selection (VC dimension, generalization, structural risk minimization)  Homework 3: due on Oct 1, 11.59pm EST 
Tue, Sep 29  Lecture 11: probability review (joint, marginal and conditional probability), independence, maximum likelihood estimation  
Thu, Oct 1  Lecture 12: generative probabilistic modeling, maximum likelihood estimation, decision boundary  Homework 3 due 
Tue, Oct 6  Lecture 13: mixture models, EM algorithm, convergence, model selection  
Thu, Oct 8  MIDTERM (lectures 1 to 12)  Start: Thursday, October 8, 3.00pm EST; End: Friday, October 9, 3.00pm EST
Tue, Oct 13  (midterm solution)  
Thu, Oct 15  —  Project plan due (see Assignments for details), in [Word] or [LaTeX] format
Tue, Oct 20  Lecture 14: active learning, kernel regression, Gaussian processes. Refs: [1] (optional reading)
Thu, Oct 22  Lecture 15: dimensionality reduction, principal component analysis (PCA), kernel PCA  Homework 4: due on Oct 29, 11.59pm EST 
Tue, Oct 27  Lecture 16: collaborative filtering (matrix factorization), structured prediction (max-margin approach). Refs: [1] (optional reading)
Thu, Oct 29  Lecture 17: Bayesian networks (motivation, examples, graph, independence). Refs: [1] [2] (optional reading)  Homework 4 due
Tue, Nov 3  Lecture 18: Bayesian networks (independence, equivalence, learning). Refs: [1] [2] [3, chapters 16-20] (optional reading)
Thu, Nov 5  Lecture 19: Bayesian networks (introduction to inference), Markov random fields, factor graphs. Refs: [1] [2] (optional reading)  Preliminary project report due (see Assignments for details)  NO EXTENSION DAYS ALLOWED
Tue, Nov 10  —  
Thu, Nov 12  Lecture 20: Markov random fields (inference, learning). Refs: [1] [2] [3, chapters 16-20] (optional reading)
Tue, Nov 17  (lecture 20 continues)  
Thu, Nov 19  Lecture 21: Markov random fields (inference in general graphs, junction trees)  Final project report due (see Assignments for details)  NO EXTENSION DAYS ALLOWED 
Mon, Nov 23  FINAL EXAM (lectures 13 to 21)  Start: Monday, November 23, 4.15pm EST; End: Tuesday, November 24, 4.15pm EST
Thu, Nov 26  THANKSGIVING VACATION  
Tue, Dec 1  (final exam solution) 