
Computational Methods in Optimization

David Gleich

Purdue University

Spring 2026

Course number CS-52000

Tuesday, Thursday 9:00-10:15

Location Forney B124


Readings and topics

References

Nonlinear optimization
Algorithms for Optimization by Mykel Kochenderfer and Tim A. Wheeler. MIT Press, 2019.
Numerical Optimization by Jorge Nocedal and Stephen J. Wright. Springer, 2006.
Primal-Dual Interior-Point Methods by Stephen J. Wright. SIAM, 1997.
Linear and Nonlinear Optimization by Igor Griva, Stephen G. Nash, and Ariela Sofer. SIAM, 2009.
Least squares
Numerical Methods for Least Squares Problems by Åke Björck. SIAM, 1996.
Convex optimization
Convex optimization by Stephen Boyd and Lieven Vandenberghe.
Cambridge University Press, 2004.

Lecture 12

We almost finished up the simplex method and went over how it works by moving from vertex to vertex.
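
As a baseline for what simplex improves on, here is a small Julia sketch (not from the lecture notebook; the function name and toy data are illustrative, and it assumes the Combinatorics.jl package) that finds the best vertex of a standard-form LP by brute-force enumeration of basic feasible solutions. Simplex visits a short sequence of these vertices instead of checking them all.

using LinearAlgebra, Combinatorics

# Enumerate all basic feasible solutions of min c'x s.t. Ax = b, x >= 0
# and return the best one. Exponential cost in general.
function enumerate_bfs(c, A, b)
    m, n = size(A)
    best_x, best_val = nothing, Inf
    for B in combinations(1:n, m)        # candidate basis: m column indices
        AB = A[:, B]
        rank(AB) == m || continue        # skip rank-deficient bases
        xB = AB \ b
        all(xB .>= -1e-10) || continue   # keep only feasible vertices
        x = zeros(n); x[B] = xB
        if dot(c, x) < best_val
            best_x, best_val = x, dot(c, x)
        end
    end
    return best_x, best_val
end

# Toy LP: min -x1 - 2x2 with x1 + x2 + s = 4 and all variables >= 0.
c = [-1.0, -2.0, 0.0]; A = [1.0 1.0 1.0]; b = [4.0]
enumerate_bfs(c, A, b)    # expect ([0.0, 4.0, 0.0], -8.0)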

Reading
Griva, Sofer, and Nash, chapter 4.
Nocedal and Wright, Chapter 13.
AlgOpt - Chapter 11 Simplex Method
Julia
Lecture 12 (Enumeration) (html) (ipynb)
Video
Lecture 12

Lecture 11

We proved the fundamental theorem of linear programming, that if a solution exists, then there is a solution at a vertex of the feasible polytope.
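
A minimal Julia sketch of the constructive step in the proof (illustrative only; it assumes x is feasible and the LP is bounded): repeatedly slide along a null-space direction of the support columns, with the sign chosen so the objective does not increase, until some positive entry hits zero; once the support columns are independent, x is a vertex.

using LinearAlgebra

# Move feasible x (Ax = b, x >= 0) to a vertex without increasing c'x.
function move_to_vertex(c, A, x; tol = 1e-10)
    x = copy(x)
    while true
        S = findall(x .> tol)               # support of x
        N = nullspace(A[:, S])
        size(N, 2) == 0 && return x         # independent columns: a vertex
        z = zeros(length(x)); z[S] = N[:, 1]
        dot(c, z) > 0 && (z .= -z)          # don't increase the objective
        neg = [i for i in S if z[i] < -tol]
        if isempty(neg)                     # z >= 0 on the support
            dot(c, z) < -tol && error("unbounded below along z")
            z .= -z                         # here c'z = 0, so either sign works
            neg = [i for i in S if z[i] < -tol]
        end
        t = minimum(-x[i] / z[i] for i in neg)  # first entry to hit zero
        x .+= t .* z
        x[x .< tol] .= 0.0
    end
end

c = [-1.0, -2.0, 0.0]; A = [1.0 1.0 1.0]
move_to_vertex(c, A, [1.0, 1.0, 2.0])   # a vertex with objective <= -3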

Reading
Griva, Sofer, and Nash, chapter 4.
Nocedal and Wright, Chapter 13.
Polytope Geometry
AlgOpt - 11.2.1
Julia
Lecture 11 (LP Feasible Polytopes) (html) (ipynb)
Video
Lecture 11

Lecture 10

We saw optimality conditions for linear programs and the dual.
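
A small Julia check of those conditions (illustrative; the helper name and toy LP are made up). For min c'x s.t. Ax = b, x >= 0, a pair (x, y) is optimal when x is primal feasible, the dual slacks s = c - A'y are nonnegative, and complementary slackness x .* s = 0 holds; the duality gap c'x - b'y equals x's.

using LinearAlgebra

function check_lp_optimality(c, A, b, x, y; tol = 1e-8)
    s = c - A' * y                           # dual slacks implied by y
    primal = norm(A * x - b) <= tol && all(x .>= -tol)
    dual   = all(s .>= -tol)
    comp   = all(abs.(x .* s) .<= tol)
    return (primal = primal, dual = dual, comp = comp, gap = dot(x, s))
end

c = [-1.0, -2.0, 0.0]; A = [1.0 1.0 1.0]; b = [4.0]
check_lp_optimality(c, A, b, [0.0, 4.0, 0.0], [-2.0])   # all true, gap 0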

Reading
Griva, Sofer, and Nash, chapter 4.
Nocedal and Wright, Chapter 13.
LP optimality
AlgOpt - Chapter 11 - what we call standard form is their equality form from 11.1.3
AlgOpt - 11.2.2
Julia
Lecture 10 (Linear programming) (html) (ipynb)
Video
Lecture 10

Lecture 9

We introduced the steepest descent method and showed that it converges at a linear rate on a quadratic objective.
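
A quick Julia sketch of the result (illustrative, not from the notebook): exact-line-search steepest descent on a 2-by-2 quadratic, where the gap f(x_k) - f* contracts by about the Kantorovich factor ((κ-1)/(κ+1))^2 at each step.

using LinearAlgebra

# Steepest descent on f(x) = 1/2 x'Ax - b'x with A symmetric positive definite.
function steepest_descent(A, b, x; iters = 30)
    xstar = A \ b
    gaps = Float64[]
    for _ in 1:iters
        g = A * x - b                   # gradient
        α = dot(g, g) / dot(g, A * g)   # exact minimizing step length
        x -= α * g
        e = x - xstar
        push!(gaps, 0.5 * dot(e, A * e))    # equals f(x) - f*
    end
    return x, gaps
end

A = Diagonal([1.0, 10.0]); b = [1.0, 1.0]   # condition number κ = 10
x, gaps = steepest_descent(A, b, [5.0, 5.0])
gaps[2:end] ./ gaps[1:end-1]    # settles near ((10-1)/(10+1))^2 ≈ 0.669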

Reading
Griva, Sofer, Nash Section 14.4
Steepest Descent
Two Remarks on the Kantorovich Inequality by P. Henrici.
Julia
Lecture 9 (Steepest Descent) (html) (ipynb)
Notes
Lecture 9
Video
Lecture 9

Lecture 8

We saw the necessary and sufficient conditions for optimization with linear inequality constraints. We also saw the definitions of a descent direction and an active set.
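
Both definitions fit in a few lines of Julia (an illustrative sketch; the names and toy data are made up):

using LinearAlgebra

# p is a descent direction at x when grad(x)'p < 0.
isdescent(g, p) = dot(g, p) < 0

# The active set for constraints A x >= b collects the rows that hold
# with equality at x.
active_set(A, b, x; tol = 1e-8) = findall(abs.(A * x .- b) .<= tol)

g = [2.0, 0.0]                             # gradient of x1^2 + x2^2 at (1, 0)
isdescent(g, [-1.0, 0.0])                  # true: points downhill
A = [1.0 0.0; 0.0 1.0]; b = [1.0, -1.0]    # x1 >= 1 and x2 >= -1
active_set(A, b, [1.0, 0.0])               # [1]: only x1 >= 1 is tight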

Reading
Griva, Sofer, Nash Section 14.4
Nocedal and Wright Chapter 2 (for descent directions)
Chapter 10, 11 in AlgOpt
Video
Lecture 8

Lecture 7

We saw the necessary and sufficient conditions for linearly constrained optimization and how they give a system of augmented normal equations for constrained least squares. We also saw how to get an initial point for a null-space method.
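
A minimal Julia sketch of the augmented system (illustrative; the helper name and toy data are made up). For min ||Ax - b||^2 subject to Cx = d, the optimality conditions couple x with multipliers λ in one symmetric linear system.

using LinearAlgebra

# Solve [A'A  C'; C  0] [x; λ] = [A'b; d].
function constrained_ls(A, b, C, d)
    n = size(A, 2); p = size(C, 1)
    K = [A'A C'; C zeros(p, p)]
    z = K \ [A'b; d]
    return z[1:n], z[n+1:end]     # solution x and multipliers λ
end

A = [1.0 0.0; 0.0 1.0; 1.0 1.0]; b = [1.0, 2.0, 0.0]
C = [1.0 1.0]; d = [1.0]          # constrain x1 + x2 = 1
x, λ = constrained_ls(A, b, C, d)
C * x ≈ d                         # the constraint holds at the solution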

Reading
Griva Sofer Nash, Chapter 14.
Julia
Lecture 7 (Constrained Least Squares) (html) (ipynb)
Video
Lecture 7

Lecture 6

We saw algorithms for constrained least squares problems as an intro to constrained optimization.
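
One such algorithm, sketched in Julia (illustrative; the names and toy data are made up, and it is the same problem as in the Lecture 7 sketch above): the null-space method writes x = x0 + Z*y with C*x0 = d and Z spanning null(C), reducing the problem to an unconstrained least-squares solve in y.

using LinearAlgebra

function nullspace_ls(A, b, C, d)
    x0 = C \ d                    # a particular solution of Cx = d
    Z  = nullspace(C)             # columns span {z : Cz = 0}
    y  = (A * Z) \ (b - A * x0)   # reduced, unconstrained least squares
    return x0 + Z * y
end

A = [1.0 0.0; 0.0 1.0; 1.0 1.0]; b = [1.0, 2.0, 0.0]
C = [1.0 1.0]; d = [1.0]
nullspace_ls(A, b, C, d)          # matches the augmented-system answer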

Reading
Björck Chapter 2
Björck Chapter 5
Video
Lecture 6

Lecture 5

Multivariate Taylor expansions, gradients, Hessians, and multivariate optimality conditions.
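
The example from class with a minimizer on every line through the origin but no minimizer at the origin (linked below) is easy to probe numerically in Julia, assuming it is the usual f(x, y) = (y - x^2)(y - 2x^2):

# Gradient at 0 is zero and the Hessian there is [0 0; 0 2]: positive
# semidefinite, so the second-order necessary condition holds while the
# sufficient condition fails.
f(v) = (v[2] - v[1]^2) * (v[2] - 2v[1]^2)

# Positive near 0 along a few lines through the origin...
all(f(1e-3 .* d) > 0 for d in ([1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0]))
# ...but negative on the parabola y = 1.5 x^2, arbitrarily close to 0.
f([1e-3, 1.5e-6]) < 0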

Reading
Section 1.6 in AlgOpt
Nocedal and Wright Section 2.1; read Theorems 2.3 and 2.4 for yourself and review the multivariate generalizations.
No minimizer at the origin but a minimizer on all lines: an algebraic discussion of the example we saw in class
Discussion on symmetry of Hessian
Wikipedia on symmetric Hessian
Julia
Lecture 5 (Optimality Conditions) (html) (ipynb)
Video
Lecture 5

Lecture 4

We did an intro to 1d optimization.
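
One classic 1d method, golden-section search, fits in a few lines of Julia (an illustrative sketch for unimodal functions; Optim.jl below has the polished versions):

# Shrink [a, b] by the inverse golden ratio each step, keeping the minimum
# of a unimodal f bracketed.
function golden_section(f, a, b; tol = 1e-8)
    φ = (sqrt(5) - 1) / 2
    c, d = b - φ * (b - a), a + φ * (b - a)
    while b - a > tol
        if f(c) < f(d)
            b, d = d, c
            c = b - φ * (b - a)
        else
            a, c = c, d
            d = a + φ * (b - a)
        end
    end
    return (a + b) / 2
end

golden_section(x -> (x - 2)^2 + 1, 0.0, 5.0)   # ≈ 2.0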

Reading
Chapter 2 in Nocedal and Wright
Section 1.5 and 1.6 in AlgOpt
Handouts
unconstrained-intro.pdf
Julia
Lecture 4 (Optim.jl) (html) (ipynb)
Lecture 4 (Flux.jl) (html) (ipynb)
Video
Video didn't work today, sorry!

Lecture 3

We covered a quick intro to optimization software.
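
For a taste of how little code a solve takes, here is about the smallest possible Optim.jl call (illustrative; the notebooks below show what we actually ran in class):

using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
res = optimize(rosenbrock, zeros(2), BFGS())   # finite-difference gradients by default
Optim.minimizer(res)                           # ≈ [1.0, 1.0]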

Julia
Lecture 3 (Least Squares) (html) (jl) (ipynb)
Lecture 3 (Sudoku) (html) (jl) (ipynb)
Video
Lecture 3

Lecture 2

We went over sequences and convergence.
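
The rates have a simple numerical signature, e.g. in Julia (illustrative):

# Linear: |x_{k+1}| / |x_k| tends to a constant in (0, 1).
linear = [0.5^k for k in 1:10]
linear[2:end] ./ linear[1:end-1]             # 0.5 every step

# Quadratic: |x_{k+1}| / |x_k|^2 stays bounded while the plain ratio -> 0.
quadratic = [0.5^(2^k) for k in 1:5]
quadratic[2:end] ./ quadratic[1:end-1] .^ 2  # 1.0 every step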

Reading
Appendix A.2 in Nocedal and Wright
Julia (we didn't get around to these in class; we'll try in the next class)
Lecture 2 (Rates) (html) (ipynb)
Lecture 2 (Limit Points) (html) (ipynb)
Video
Lecture 2

Lecture 1

We reviewed the syllabus and saw the xkcd raptor problem as motivation for optimization.

Reading
Syllabus
Slides
Lecture 1
Julia
Lecture 1 (Raptor) (html) (ipynb)
Video
Lecture 1