Readings and topics
References
- Nonlinear optimization
- Numerical Optimization by Jorge Nocedal and Stephen J. Wright, Springer, 2006.
- Primal-Dual Interior-Point Methods by Stephen J. Wright, SIAM, 1997.
- Linear and Nonlinear Optimization by Igor Griva, Stephen G. Nash, and Ariela Sofer, SIAM, 2009.
- Least squares
- Numerical Methods for Least Squares Problems by Åke Björck, SIAM, 1996.
- Convex optimization
- Convex Optimization by Stephen Boyd and Lieven Vandenberghe, Cambridge University Press, 2004.
Lecture 19
(Rescheduled) Global convergence of quasi-Newton
- Reading
- Nocedal and Wright Chapter 6
- Quasi-Newton Convergence
Lecture 30
Global optimization; quadratic programming
- Readings
- Global optimization
- Nocedal and Wright Chapter 16
Lecture 29
Nonlinear programming: penalty and augmented Lagrangian methods.
- Readings
- Nocedal and Wright Chapters 17 and 18
- Griva, Sofer, Nash, Chapter 16.
- Nonlinear programming slides
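A minimal Matlab sketch (illustrative only, not part of the course code) of the quadratic penalty idea from Chapter 17: each outer iteration minimizes a penalized objective with a larger penalty parameter mu, and the iterates approach the constrained solution (0.5, 0.5).
    % Quadratic penalty sketch: minimize x1^2 + x2^2 subject to x1 + x2 = 1.
    f = @(x) x(1)^2 + x(2)^2;
    c = @(x) x(1) + x(2) - 1;          % equality constraint c(x) = 0
    x = [0; 0];
    for mu = [1 10 100 1000]
        Q = @(x) f(x) + (mu/2)*c(x)^2; % penalty subproblem for this mu
        x = fminsearch(Q, x);          % unconstrained solve (plain Nelder-Mead)
        fprintf('mu = %4d   x = (%.4f, %.4f)\n', mu, x(1), x(2));
    end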
Lecture 28
Derivative-free optimization: gradient methods, pattern search, and Nelder-Mead
- Readings
- Nocedal and Wright Chapter 9
- Derivative-free optimization
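A minimal Matlab sketch (illustrative only): fminsearch implements the Nelder-Mead simplex method covered in this lecture, so it makes a quick derivative-free baseline on the Rosenbrock function.
    % Nelder-Mead (derivative-free) on the Rosenbrock function.
    rosen = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    [xmin, fmin] = fminsearch(rosen, [-1.2; 1]);
    fprintf('x = (%.4f, %.4f),  f = %.2e\n', xmin(1), xmin(2), fmin);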
Lecture 27
Large-scale optimization: simple methods, scalable linear algebra, and limited-memory quasi-Newton methods.
- Readings
- Nocedal and Wright Chapter 7
- Griva Sofer Nash, Chapter 13
- Large scale optimization handout
Lecture 26
Nonlinear equations and conjugate gradient
- Matlab
- matlab/homotopy.m
- matlab/homotopy_example.m
- Readings
- Nocedal and Wright Chapter 11
- Solving Nonlinear Equations with Newton’s Method by C. T. Kelley
- Codes for Solving Nonlinear Equations with Newton’s Method by C. T. Kelley
- Nocedal and Wright Chapter 5
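A minimal Matlab sketch (illustrative only) of the basic Newton iteration for a scalar nonlinear equation f(x) = 0, the starting point for Chapter 11 and Kelley's book; the iterates converge quadratically to the root 2^(1/3).
    % Newton's method for f(x) = x^3 - 2 = 0.
    f  = @(x) x^3 - 2;
    fp = @(x) 3*x^2;                   % derivative f'(x)
    x = 1;                             % starting guess
    for k = 1:6
        x = x - f(x)/fp(x);            % Newton step
        fprintf('%d  x = %.15f\n', k, x);
    end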
Lecture 25
Rescheduled.
Lecture 24
Trust region methods
- Readings
- Nocedal and Wright Chapter 4
- Trust region methods
Lecture 23
Quasi-Newton methods
- Readings
- Nocedal and Wright Chapter 6
- Quasi-Newton methods
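A minimal Matlab sketch (illustrative only, with made-up s and y) of one BFGS update of the inverse Hessian approximation from Nocedal and Wright Chapter 6; s is the step x_{k+1} - x_k and y is the corresponding gradient difference.
    % One BFGS update of the inverse Hessian approximation H.
    n = 2;  H = eye(n);
    s = [0.1; -0.2];  y = [0.3; 0.1];  % made-up step and gradient difference
    rho = 1/(y'*s);                    % needs the curvature condition y'*s > 0
    V = eye(n) - rho*(s*y');
    H = V*H*V' + rho*(s*s');           % updated inverse Hessian approximation
    disp(H)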
Lecture 22
A practical treatment of Newton’s method: Cholesky and modified Cholesky factorizations, Zoutendijk’s condition, and quadratic convergence.
- Readings
- Nocedal and Wright Chapter 3
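A minimal Matlab sketch (illustrative only) of the simplest modified-Cholesky idea from Chapter 3: attempt a Cholesky factorization of the Hessian and, if it is not positive definite, add a multiple of the identity before computing the Newton-like step.
    % Modified-Newton step: shift the Hessian until Cholesky succeeds.
    H = [1 0; 0 -2];  g = [1; 1];      % made-up indefinite Hessian and gradient
    tau = 0;
    [R, flag] = chol(H + tau*eye(2));
    while flag ~= 0                    % factorization failed: not positive definite
        tau = max(2*tau, 1e-3);        % increase the diagonal shift
        [R, flag] = chol(H + tau*eye(2));
    end
    p = -(R \ (R' \ g));               % solve (H + tau*I) p = -g using R'*R
    disp(p')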
Lecture 21
Interior point methods
- Matlab
- matlab/ip_central.m
- Readings
- Nocedal and Wright Chapter 14
Lecture 20
Matrix calculus and finite differences
- Finite difference reading
- Nocedal and Wright, Section 8.1
- Matrix calculus reading
- http://justindomke.wordpress.com/2011/03/24/matrix-calculus/
- http://people.rit.edu/jcdicsa/courses/SML/01background.pdf
- http://www.ee.ic.ac.uk/hp/staff/dmb/matrix/calculus.html
- http://mathoverflow.net/questions/5209/notions-of-matrix-differentiation
- http://www.cs.berkeley.edu/~pliang/cs281a/recitation-0924.pdf
- http://research.microsoft.com/en-us/um/people/minka/papers/matrix/minka-matrix.pdf
- http://ccrma.stanford.edu/~dattorro/matrixcalc.pdf
- http://www.atmos.washington.edu/~dennis/MatrixCalculus.pdf
- http://www.colorado.edu/engineering/CAS/courses.d/IFEM.d/IFEM.AppD.d/IFEM.AppD.pdf
- http://people.kyb.tuebingen.mpg.de/suvrit/work/mcal.pdf.gz
- Matlab
- lecture_matrix_calculus.m
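A minimal Matlab sketch (illustrative only) of the forward-difference gradient approximation from Section 8.1, compared against the analytic gradient.
    % Forward-difference approximation of the gradient of f at x.
    f = @(x) x(1)^2 + sin(x(2));
    grad_true = @(x) [2*x(1); cos(x(2))];
    x = [1; 2];  h = sqrt(eps);        % step size on the order of sqrt(machine eps)
    grad_fd = zeros(numel(x), 1);
    for i = 1:numel(x)
        e = zeros(numel(x), 1);  e(i) = 1;
        grad_fd(i) = (f(x + h*e) - f(x)) / h;
    end
    disp(norm(grad_fd - grad_true(x))) % roughly 1e-8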
Lecture 19
Rescheduled.
Lecture 18
Midterm
Lecture 17
Review for midterm
Lecture 16
Stephen Wright’s NIPS lecture
- Reading
- http://pages.cs.wisc.edu/~swright/nips2010/sjw-nips10.pdf
- http://blog.videolectures.net/nips-2010-tutorial-optimization-algorithms-in-machine-learning/
- http://videolectures.net/nips2010_wright_oaml/
Lecture 15
We went over some questions on the simplex algorithm and looked at the convergence of backtracking line-search methods; a small backtracking sketch follows the readings.
- Reading
- Griva Sofer Nash, Theorem 11.7
- Convergence of backtracking line search
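A minimal Matlab sketch (illustrative only) of backtracking line search with the Armijo sufficient-decrease condition, in the notation of Nocedal and Wright Chapter 3.
    % Backtracking line search along a descent direction p.
    f = @(x) 0.5*(x(1)^2 + 10*x(2)^2);
    g = @(x) [x(1); 10*x(2)];
    x = [1; 1];  p = -g(x);            % steepest-descent direction
    alpha = 1;  rho = 0.5;  c = 1e-4;
    while f(x + alpha*p) > f(x) + c*alpha*(g(x)'*p)
        alpha = rho*alpha;             % shrink the step until Armijo holds
    end
    fprintf('accepted step alpha = %g\n', alpha);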
Lecture 14
We saw the Wolfe conditions for line search and proved that step lengths satisfying them always exist; a small sketch that checks the conditions follows the readings.
- Reading
- Wolfe Conditions
- Nocedal and Wright, Lemma 3.1.
- Nocedal and Wright, Chapter 2, 3
- Griva Sofer Nash, Chapter 11
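A minimal Matlab sketch (illustrative only) that checks the two Wolfe conditions, sufficient decrease and curvature, for a given step length along a descent direction.
    % Check the Wolfe conditions for step alpha along direction p at x.
    f = @(x) 0.5*(x(1)^2 + 10*x(2)^2);
    g = @(x) [x(1); 10*x(2)];
    x = [1; 1];  p = -g(x);            % steepest-descent direction
    c1 = 1e-4;  c2 = 0.9;  alpha = 0.1;
    sufficient_decrease = f(x + alpha*p) <= f(x) + c1*alpha*(g(x)'*p);
    curvature = g(x + alpha*p)'*p >= c2*(g(x)'*p);
    fprintf('Wolfe conditions hold: %d\n', sufficient_decrease && curvature);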
Lecture 13
We saw an intro to line search methods for optimization.
- Reading
- Line Search Intro
- Unconstrained optimization methods
- Nocedal and Wright, Chapter 2, 3
- Griva Sofer Nash, Chapter 11
Lecture 12
We finished up the simplex method and went over how it works.
- Reading
- Griva, Sofer, and Nash, chapter 4.
- Nocedal and Wright, Chapter 13.
- Simple Simplex Code
- Matlab
- simplex_iterates.m
- simplex_step.m
Lecture 11
We proved the fundamental theorem of linear programming, that if a solution exists, then there is a solution at a vertex of the feasible polytope.
- Reading
- Griva, Sofer, and Nash, chapter 4.
- Nocedal and Wright, Chapter 13.
- Polytope Geometry
We didn’t discuss these Matlab programs in class, but you might want to look at them for the homework.
- Matlab
- simplex_iterates.m
- simplex_step.m
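A minimal Matlab sketch (illustrative only; linprog requires the Optimization Toolbox) of the fundamental theorem in action: the solver returns a solution at a vertex of the feasible polytope.
    % Maximize x1 + 2*x2 subject to x1 + x2 <= 1, x >= 0; the solution is the vertex (0, 1).
    c = [-1; -2];                      % linprog minimizes, so negate the objective
    A = [1 1];  b = 1;
    lb = [0; 0];
    x = linprog(c, A, b, [], [], lb);
    disp(x')                           % a vertex of the feasible polytope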
Lecture 10
We saw optimality conditions for linear programs and started studying the feasible solution polytope.
- Reading
- Griva, Sofer, and Nash, chapter 4.
- Nocedal and Wright, Chapter 13.
- LP optimality
- Matlab
- lp_poly.m
Lecture 9
We proved that steepest descent converges at a linear rate on a quadratic objective.
- Reading
- Steepest Descent
- Two Remarks on the Kantorovich Inequality, by P. Henrici.
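A minimal Matlab sketch (illustrative only) of steepest descent with exact line search on a convex quadratic, where the linear rate proved in lecture, governed by (kappa - 1)/(kappa + 1), is easy to observe.
    % Steepest descent with exact line search on f(x) = 0.5*x'*A*x.
    A = diag([1, 10]);                 % condition number kappa = 10
    x = [1; 1];
    for k = 1:25
        g = A*x;                       % gradient
        alpha = (g'*g)/(g'*A*g);       % exact minimizing step along -g
        x = x - alpha*g;
        fprintf('%2d  f = %.3e\n', k, 0.5*(x'*A*x));
    end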
Lecture 8
We saw the necessary conditions and the sufficient conditions for local minima of optimization problems with linear inequality constraints.
We also started to look at the gradient descent method.
- Reading
- Griva, Sofer, Nash Section 14.4
- Matlab
- lecture_steepest_descent.m
- gradient_descent_1.m
- gradient_descent_2.m
- exact_gradient_descent.m
Lecture 7
We got an intro to Lagrange multipliers and descent directions. The quiz was interesting and took most of the class period for partners to work out.
- Reading
- Griva Sofer Nash, Chapter 14.
- Nocedal and Wright Chapter 2 (for descent directions)
Lecture 6
We saw algorithms for constrained least squares problems, and also theory for equality constrained optimization.
- Matlab
- conls.m
- Sports Example with Constraint
- Reading
- Björck Chapter 2
- Björck Chapter 5
- Griva Sofer Nash, Chapter 12.
Lecture 5
Multivariate Taylor series, gradients, Hessians, multivariate optimality conditions, convexity, and least squares.
- Reading
- Björck Chapter 1
- Nocedal and Wright Section 2.1; you should read Theorem 2.5 for yourself and review the multivariate generalizations.
- Matlab
- Optimality conditions
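A minimal Matlab sketch (illustrative only) of checking the second-order sufficient conditions at a candidate point: the gradient should vanish and the Hessian should be positive definite.
    % Check second-order sufficient conditions for f(x) = x1^2 + 3*x2^2 at x = 0.
    x = [0; 0];
    g = [2*x(1); 6*x(2)];              % gradient
    H = [2 0; 0 6];                    % Hessian (constant for this quadratic)
    is_stationary = norm(g) < 1e-10;
    is_strict_local_min = is_stationary && all(eig(H) > 0);
    fprintf('strict local minimizer: %d\n', is_strict_local_min);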
Lecture 4
We covered the fundamentals of unconstrained optimization: univariate Taylor series and first- and second-order necessary conditions for univariate functions.
We saw how to formulate a Sudoku puzzle as a binary LP.
- Reading
- Nocedal and Wright Section 2.1; you should read Theorem 2.5 for yourself and review the multivariate generalizations.
- Matlab
- Sudoku example
- sudoku_gurobi.m
Lecture 3
Demos of optimization software.
- Matlab
- CVX Example
- Poblano Example
Lecture 2
We saw sequences and rates of convergence.
- Reading
- Nocedal and Wright Appendix A.2
- Matlab
- convergence.m
- convergence_examples.m
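A minimal Matlab sketch (illustrative only, separate from convergence.m) contrasting linear and quadratic convergence to x* = 0: the linear sequence halves its error every step, while the quadratic sequence roughly squares it.
    % Linear vs. quadratic convergence to x* = 0.
    lin  = 0.5.^(1:8);                 % e_{k+1} = 0.5*e_k   (linear)
    quad = 0.5.^(2.^(1:8));            % e_{k+1} = e_k^2     (quadratic)
    disp([lin' quad'])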
Lecture 1
We reviewed the syllabus, and saw the xkcd raptor problem as a motivation to optimization.
See the lecture slides at right.
- Reading
- Syllabus
- Matlab
- raptorchase.m
- show_raptorchase.m
- optimize_raptorchase.m