CS 69000-SML: Statistical Machine Learning II

Semester: Spring 2019 (also offered in Fall 2021 and Spring 2017)
Time and place: Monday, Wednesday and Friday, 9:30am-10:20am, Lawson Building B134
Instructors: Jean Honorio
Office hours: Lawson Building 2142-J (Please send an e-mail for appointments)

Continuous and combinatorial optimization are at the heart of every machine learning algorithm. This course is an introduction to the theoretical aspects of continuous and combinatorial optimization that are relevant to machine learning problems. Using continuous optimization, we will answer questions relevant to classification, such as "where does the dual form of support vector machines come from, and why does it work?". Using combinatorial optimization, we will answer questions relevant to influence maximization in networks, such as "why does the greedy approach allow us to maximize influence within a factor of 1-1/e?".
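As a small illustration of the second question, the sketch below runs the classic greedy algorithm for monotone submodular maximization on a toy coverage function (coverage is a standard stand-in for influence). The specific ground sets and budget are illustrative assumptions, not course material; the point is only that each step picks the set with the largest marginal gain, which is the strategy behind the 1-1/e guarantee.

```python
# Greedy selection for a monotone submodular objective (set coverage).
# Picking the largest marginal gain at each step yields a (1 - 1/e)
# approximation to the optimal coverage; the data below is a toy example.

def greedy_max_coverage(sets, k):
    """Greedily pick k sets to maximize the number of covered elements."""
    covered = set()
    chosen = []
    for _ in range(k):
        # Marginal gain of a set s is the number of new elements it covers.
        best = max(sets, key=lambda s: len(s - covered))
        chosen.append(best)
        covered |= best
    return chosen, covered

# Toy instance: three candidate sets, budget k = 2.
sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}]
chosen, covered = greedy_max_coverage(sets, k=2)
print(len(covered))  # prints 6: greedy covers the whole universe here
```

On this instance greedy happens to be optimal; in general it is only guaranteed to reach a 1-1/e fraction of the best achievable value, which is the bound studied later in the course.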

Learning Objectives

During the course, students will:

Prerequisites

Basic knowledge of calculus and linear algebra is required. Some background in machine learning, data mining or artificial intelligence is also required (CS57300, CS57800 or CS5900-AI0). Necessary concepts from optimization will be introduced during the course.

Textbooks

For convex continuous optimization, we will use Convex Optimization by Boyd and Vandenberghe.
For nonconvex continuous optimization, we will use papers from the leading machine learning conferences and journals.
For combinatorial optimization, we will use lecture notes.

Assignments

There will be up to six homeworks, one paper presentation, one midterm exam and one final exam (dates are posted on the schedule). The homeworks are to be done individually.

Grading

Homeworks: 40%
Paper presentation: 10%
Midterm exam: 25%
Final exam: 25%

Late policy

Assignments are to be submitted by the due date listed. Each person is allowed seven days of extensions, which can be applied to any combination of assignments during the semester. Use of a partial day counts as a full day. Extensions cannot be used after the final day of classes. Please use the extension days wisely!

Assignments will NOT be accepted if they are more than five days late.

Academic Honesty

Please read the departmental academic integrity policy here. It will be followed unless we provide written documentation of exceptions. We encourage you to interact amongst yourselves: you may discuss and obtain help with the basic concepts covered in lectures and with homework specifications (but not solutions). However, unless otherwise noted, the work you turn in should reflect your own efforts and knowledge. Sharing or copying solutions is unacceptable and could result in failure. You are expected to take reasonable precautions to prevent others from using your work.

Additional course policies

Please read the general course policies here.

Schedule

We will start by covering chapters 1-5,9-11 in Convex Optimization by Boyd and Vandenberghe (B&V).
Date | Topic (tentative) | Notes
Mon, Jan 7 B&V Chapter 1: introduction Identities on matrix calculus
Wed, Jan 9 B&V Chapter 2: convex sets
Fri, Jan 11     (lecture continues)
Mon, Jan 14 B&V Chapter 3: convex functions
Examples: equation 1 in [1], section 3 in [2] (optional reading)
Wed, Jan 16     (lecture continues)
Fri, Jan 18     (lecture continues)
Mon, Jan 21 MARTIN LUTHER KING JR. DAY
Wed, Jan 23 Homework 1: due on Jan 30
Fri, Jan 25     (lecture continues)
Mon, Jan 28 B&V Chapter 4: convex optimization problems
Wed, Jan 30 ADVERSE WEATHER Homework 1 due (moved to Feb 1)
Fri, Feb 1     (lecture continues) Homework 1 due
Mon, Feb 4     (lecture continues)
Wed, Feb 6     (lecture continues) Homework 1 solution
Fri, Feb 8 Homework 2: due on Feb 15
Mon, Feb 11 B&V Chapter 5: duality
Example: [1] (optional reading)
Wed, Feb 13     (lecture continues)
Fri, Feb 15     (lecture continues) Homework 2 due
Mon, Feb 18     (lecture continues)
Wed, Feb 20     (lecture continues) Homework 2 solution
Homework 3: due on Feb 27
Fri, Feb 22     (lecture continues)
Mon, Feb 25 B&V Chapter 9: unconstrained optimization
Wed, Feb 27     (lecture continues) Homework 3 due
Fri, Mar 1 Presentation: Wei [1]
Mon, Mar 4 Homework 3 solution
Wed, Mar 6 MIDTERM (Chapters 1-4) 9:30am-10:20am at Lawson Building B134
Fri, Mar 8 Presentation: Chuyang [1]
Mon, Mar 11 SPRING VACATION
Wed, Mar 13 SPRING VACATION
Fri, Mar 15 SPRING VACATION
Mon, Mar 18 B&V Chapter 10: equality constrained optimization Homework 4: due on Mar 25
Wed, Mar 20     (lecture continues)
Fri, Mar 22 Presentation: Raphael [1]
Mon, Mar 25 B&V Chapter 11: interior-point methods, by Asish Ghoshal Homework 4 due (moved to Mar 27)
Wed, Mar 27     (lecture continues, by Asish Ghoshal) Homework 4 due
Fri, Mar 29 Presentation: Krishna [1]
Mon, Apr 1 Matroids and the greedy algorithm Homework 4 solution
Wed, Apr 3     (lecture continues) Homework 5: due on Apr 12
Fri, Apr 5     (lecture continues)
Mon, Apr 8 Submodular optimization
Wed, Apr 10 Greedy algorithms for submodular maximization
Fri, Apr 12 Presentation: Zitao [1] Homework 5 due
Mon, Apr 15 Convergence rates of gradient descent for constrained optimization
Wed, Apr 17     (lecture continues)
Fri, Apr 19 Presentation: Abi [1]
Mon, Apr 22     (lecture continues)
Wed, Apr 24 FINAL EXAM (Chapters 5, 9-11, matroids, submodularity) 9:30am-10:20am at Lawson Building B134
Fri, Apr 26 Presentation: Kevin [1]