$\newcommand{\eps}{\varepsilon} \newcommand{\kron}{\otimes} \DeclareMathOperator{\diag}{diag} \DeclareMathOperator{\trace}{trace} \DeclareMathOperator{\rank}{rank} \DeclareMathOperator*{\minimize}{minimize} \DeclareMathOperator*{\maximize}{maximize} \DeclareMathOperator{\subjectto}{subject to} \newcommand{\mat}[1]{\boldsymbol{#1}} \renewcommand{\vec}[1]{\boldsymbol{\mathrm{#1}}} \newcommand{\vecalt}[1]{\boldsymbol{#1}} \newcommand{\conj}[1]{\overline{#1}} \newcommand{\normof}[1]{\|#1\|} \newcommand{\onormof}[2]{\|#1\|_{#2}} \newcommand{\MIN}[2]{\begin{array}{ll} \minimize_{#1} & {#2} \end{array}} \newcommand{\MINone}[3]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\MINthree}[5]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \\ & {#4} \\ & {#5} \end{array}} \newcommand{\MAX}[2]{\begin{array}{ll} \maximize_{#1} & {#2} \end{array}} \newcommand{\MAXone}[3]{\begin{array}{ll} \maximize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\itr}[2]{#1^{(#2)}} \newcommand{\itn}[1]{^{(#1)}} \newcommand{\prob}{\mathbb{P}} \newcommand{\probof}[1]{\prob\left\{ #1 \right\}} \newcommand{\pmat}[1]{\begin{pmatrix} #1 \end{pmatrix}} \newcommand{\bmat}[1]{\begin{bmatrix} #1 \end{bmatrix}} \newcommand{\spmat}[1]{\left(\begin{smallmatrix} #1 \end{smallmatrix}\right)} \newcommand{\sbmat}[1]{\left[\begin{smallmatrix} #1 \end{smallmatrix}\right]} \newcommand{\RR}{\mathbb{R}} \newcommand{\CC}{\mathbb{C}} \newcommand{\eye}{\mat{I}} \newcommand{\mA}{\mat{A}} \newcommand{\mB}{\mat{B}} \newcommand{\mC}{\mat{C}} \newcommand{\mD}{\mat{D}} \newcommand{\mE}{\mat{E}} \newcommand{\mF}{\mat{F}} \newcommand{\mG}{\mat{G}} \newcommand{\mH}{\mat{H}} \newcommand{\mI}{\mat{I}} \newcommand{\mJ}{\mat{J}} \newcommand{\mK}{\mat{K}} \newcommand{\mL}{\mat{L}} \newcommand{\mM}{\mat{M}} \newcommand{\mN}{\mat{N}} \newcommand{\mO}{\mat{O}} \newcommand{\mP}{\mat{P}} \newcommand{\mQ}{\mat{Q}} \newcommand{\mR}{\mat{R}} \newcommand{\mS}{\mat{S}} \newcommand{\mT}{\mat{T}} \newcommand{\mU}{\mat{U}} 
\newcommand{\mV}{\mat{V}} \newcommand{\mW}{\mat{W}} \newcommand{\mX}{\mat{X}} \newcommand{\mY}{\mat{Y}} \newcommand{\mZ}{\mat{Z}} \newcommand{\mLambda}{\mat{\Lambda}} \newcommand{\mPbar}{\bar{\mP}} \newcommand{\ones}{\vec{e}} \newcommand{\va}{\vec{a}} \newcommand{\vb}{\vec{b}} \newcommand{\vc}{\vec{c}} \newcommand{\vd}{\vec{d}} \newcommand{\ve}{\vec{e}} \newcommand{\vf}{\vec{f}} \newcommand{\vg}{\vec{g}} \newcommand{\vh}{\vec{h}} \newcommand{\vi}{\vec{i}} \newcommand{\vj}{\vec{j}} \newcommand{\vk}{\vec{k}} \newcommand{\vl}{\vec{l}} \newcommand{\vm}{\vec{m}} \newcommand{\vn}{\vec{n}} \newcommand{\vo}{\vec{o}} \newcommand{\vp}{\vec{p}} \newcommand{\vq}{\vec{q}} \newcommand{\vr}{\vec{r}} \newcommand{\vs}{\vec{s}} \newcommand{\vt}{\vec{t}} \newcommand{\vu}{\vec{u}} \newcommand{\vv}{\vec{v}} \newcommand{\vw}{\vec{w}} \newcommand{\vx}{\vec{x}} \newcommand{\vy}{\vec{y}} \newcommand{\vz}{\vec{z}} \newcommand{\vpi}{\vecalt{\pi}} \newcommand{\vlambda}{\vecalt{\lambda}}$ # Computational methods in optimization

## Announcements

• 2023-03-22: Project details posted
• 2023-02-27: Homework 7 posted
• 2023-02-20: Homework 6 posted
• 2023-02-13: Homework 5 posted
• 2023-02-06: Homework 4 posted
• 2023-01-30: Homework 3 posted
• 2023-01-23: Homework 2 posted
• 2023-01-16: Homework 1 posted
• 2023-01-06: Welcome to class! Please complete the intro survey by class on 2023-01-13 (submit on Gradescope).

## Overview

This course is an introduction to optimization for graduate students in any computational field.
It covers many of the fundamentals of optimization, and it is good preparation both for those who wish to use optimization in their research and for those who wish to become optimizers by developing new algorithms and theory. Selected topics include:

• Newton, quasi-Newton, and trust-region methods for unconstrained problems
• linear programming
• constrained least squares problems
• convex optimization
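As a small taste of the first topic, here is a minimal sketch of Newton's method for unconstrained minimization in one dimension. The function, starting point, and tolerance below are made up for illustration; they are not from the course materials.

```python
# Minimal sketch of Newton's method for minimizing a smooth 1-D function.
# Each step solves the local quadratic model: x_{k+1} = x_k - f'(x_k) / f''(x_k).

def newton_minimize(fprime, fsecond, x0, tol=1e-10, maxiter=50):
    """Newton's method for a 1-D unconstrained minimization problem."""
    x = x0
    for _ in range(maxiter):
        step = fprime(x) / fsecond(x)  # Newton step from the quadratic model
        x -= step
        if abs(step) < tol:            # stop once the step is negligible
            break
    return x

# Example: minimize f(x) = x^4 - 3x^2 + 2,
# with f'(x) = 4x^3 - 6x and f''(x) = 12x^2 - 6.
xstar = newton_minimize(lambda x: 4*x**3 - 6*x,
                        lambda x: 12*x**2 - 6,
                        x0=2.0)
# Converges to the local minimizer x = sqrt(3/2).
```

Note the tradeoff this sketch glosses over: Newton's method converges very quickly near a minimizer, but far from one the Hessian may be singular or indefinite, which is exactly what the quasi-Newton and trust-region variants in the list above address.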

I will attempt to add a little more ML-related coverage this year. Since that material is already common in ML classes, however, this class will remain focused on classical optimization techniques, spending substantial time on linear programming and how it works, along with quadratic programming.

## Prerequisites

We'll assume you've had some background in numerical linear algebra, and we will rely on that subject heavily. Students with a background in mathematical analysis may be able to appreciate some of the more theoretical results as well.

If you have not taken CS515, or seen similar material elsewhere, you are going to be challenged by this class.