\documentclass[]{article}
\usepackage{lmodern}
\usepackage{amssymb,amsmath}
\usepackage{ifxetex,ifluatex}
\usepackage{fixltx2e} % provides \textsubscript
\ifnum 0\ifxetex 1\fi\ifluatex 1\fi=0 % if pdftex
  \usepackage[T1]{fontenc}
  \usepackage[utf8]{inputenc}
\else % if luatex or xelatex
  \ifxetex
    \usepackage{mathspec}
  \else
    \usepackage{fontspec}
  \fi
  \defaultfontfeatures{Ligatures=TeX,Scale=MatchLowercase}
\fi
% use upquote if available, for straight quotes in verbatim environments
\IfFileExists{upquote.sty}{\usepackage{upquote}}{}
% use microtype if available
\IfFileExists{microtype.sty}{%
\usepackage{microtype}
\UseMicrotypeSet[protrusion]{basicmath} % disable protrusion for tt fonts
}{}
\usepackage{hyperref}
\hypersetup{unicode=true,
            pdftitle={Homework 2},
            pdfborder={0 0 0},
            breaklinks=true}
\urlstyle{same}  % don't use monospace font for urls
\IfFileExists{parskip.sty}{%
\usepackage{parskip}
}{% else
\setlength{\parindent}{0pt}
\setlength{\parskip}{6pt plus 2pt minus 1pt}
}
\setlength{\emergencystretch}{3em}  % prevent overfull lines
\providecommand{\tightlist}{%
  \setlength{\itemsep}{0pt}\setlength{\parskip}{0pt}}
\setcounter{secnumdepth}{0}
% Redefines (sub)paragraphs to behave more like sections
\ifx\paragraph\undefined\else
\let\oldparagraph\paragraph
\renewcommand{\paragraph}[1]{\oldparagraph{#1}\mbox{}}
\fi
\ifx\subparagraph\undefined\else
\let\oldsubparagraph\subparagraph
\renewcommand{\subparagraph}[1]{\oldsubparagraph{#1}\mbox{}}
\fi

\title{Homework 2}

\input{preamble.tex}



\begin{document}
\maketitle

Please answer the following questions in complete sentences in a clearly
prepared manuscript and submit the solution by the due date on
Gradescope.

Remember that this is a graduate class. There may be elements of the
problem statements that require you to fill in appropriate assumptions.
You are also responsible for determining what evidence to include. An
answer alone is rarely sufficient, but neither is an overly verbose
description required. Use your judgement to focus your discussion on the
most interesting pieces. The answer to ``should I include `something' in
my solution?'' will almost always be: Yes, if you think it helps support
your answer.

\subsection{Problem 0: Homework
checklist}\label{problem-0-homework-checklist}

\begin{itemize}
\item
  Please identify anyone, whether or not they are in the class, with
  whom you discussed your homework. This problem is worth 1 point, but
  on a multiplicative scale.
\item
  Make sure you have included your source-code and prepared your
  solution according to the most recent Edstem note on homework
  submissions.
\end{itemize}

\subsection{Problem 1: Optimization
software}\label{problem-1-optimization-software}

We'll frequently use software to optimize functions. This question will
help familiarize you with a piece of optimization software relevant to
our study.

The function we'll study is the square root Rosenbrock function:

\[ f(x) = \sqrt{100(x_2 - x_1^2)^2 + (1-x_1)^2}. \]

\begin{enumerate}
\def\labelenumi{\arabic{enumi}.}
\tightlist
\item
  Show a contour plot of this function.
\item
  Write the gradient and Hessian of this function.
\item
  Check that your gradient and Hessian of this function are correct by
  comparing against an automatic differentiation tool, like those used in
  \texttt{Flux.jl} as we did in class, or something similar in Python.
\item
  By inspection, what is the minimizer of this function? (Feel free to
  find the answer by other means, e.g.~looking it up, but make sure you
  explain \emph{why} you know that answer \emph{must} be a \emph{global}
  minimizer.)
\item
  Explain how any optimization package could tell that your solution is
  a local minimizer.
\item
  Use Optim.jl (or Poblano for MATLAB, or scipy.optimize for Python) to
  optimize this function starting from a few different points. Be
  adversarial if you wish. Does it always get the answer correct? Show
  your code. You must use a call/tool where you provide the gradient.
\end{enumerate}
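For parts 3 and 6, a minimal sketch in Python with \texttt{scipy.optimize} (one of the alternatives the problem allows) is below; the test point, finite-difference step, and tolerances are arbitrary choices, and note that \(f\) is not differentiable where the expression under the square root vanishes:

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # square-root Rosenbrock function
    return np.sqrt(100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2)

def grad(x):
    # chain rule: f = sqrt(g)  =>  grad f = grad g / (2 sqrt(g))
    g = 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    dg = np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])
    return dg / (2.0 * np.sqrt(g))

# sanity-check the analytic gradient against central finite differences
x0 = np.array([-1.2, 1.0])
h = 1e-6
fd = np.array([(f(x0 + h * e) - f(x0 - h * e)) / (2 * h) for e in np.eye(2)])
assert np.allclose(grad(x0), fd, atol=1e-5)

# gradient-based solve; f is nonsmooth at the minimizer, so the method
# may stop with a precision warning near the solution
res = minimize(f, x0, jac=grad, method="BFGS")
print(res.x)
```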

\subsection{Problem 2: Optimization
software}\label{problem-2-optimization-software}

Repeat all the steps for problem 1 for the Himmelblau function
\[ f(\vx) = (x_1^2 + x_2 - 11)^2 + (x_1 + x_2^2 -7)^2. \]

Using the same algorithm as in problem 1, which function takes more
iterations when starting from the point \((3,3)\)?
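The same recipe carries over; a sketch in Python with \texttt{scipy.optimize}, where the iteration count \texttt{res.nit} gives one basis for the comparison (the starting point follows the problem statement):

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Himmelblau function
    return (x[0]**2 + x[1] - 11.0)**2 + (x[0] + x[1]**2 - 7.0)**2

def grad(x):
    return np.array([
        4.0 * x[0] * (x[0]**2 + x[1] - 11.0) + 2.0 * (x[0] + x[1]**2 - 7.0),
        2.0 * (x[0]**2 + x[1] - 11.0) + 4.0 * x[1] * (x[0] + x[1]**2 - 7.0),
    ])

# Himmelblau has four global minimizers, all with f = 0; starting from
# (3, 3), a gradient-based method typically lands on the one at (3, 2)
res = minimize(f, np.array([3.0, 3.0]), jac=grad, method="BFGS")
print(res.x, res.nit)
```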

\subsection{Problem 3: Optimization
theory}\label{problem-3-optimization-theory}

Suppose that \(f : \RR \to \RR\) (i.e., \(f\) is univariate) is four
times continuously differentiable.

\begin{enumerate}
\def\labelenumi{\arabic{enumi}.}
\tightlist
\item
  Show that the following conditions imply that \(x^*\) is a local
  minimizer.
\end{enumerate}

\begin{enumerate}
\def\labelenumi{\roman{enumi}.}
\tightlist
\item
  \(f'(x^*) = 0\)\\
\item
  \(f''(x^*) = 0\)\\
\item
  \(f'''(x^*) = 0\)\\
\item
  \(f''''(x^*) > 0\)
\end{enumerate}
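A quick concrete instance of part 1's conditions (not a required part of the solution): \(f(x) = x^4\) at \(x^* = 0\) satisfies
\[ f'(0) = f''(0) = f'''(0) = 0, \qquad f''''(0) = 24 > 0, \]
and \(x^* = 0\) is indeed a minimizer; a fourth-order Taylor expansion of \(f\) about \(x^*\) is a natural way to see why the conditions suffice in general.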

\begin{enumerate}
\def\labelenumi{\arabic{enumi}.}
\setcounter{enumi}{1}
\tightlist
\item
  Ask an LLM to give you a univariate function with a local minimizer at
  \(x=2\) where those conditions are false.
\end{enumerate}

\subsection{Problem 4: Convexity}\label{problem-4-convexity}

Convex functions were all the rage when I started teaching this class,
and they remain a common interest among students in it. Let's do some matrix
analysis to show that a function is convex. Solve problem 2.7 in the
textbook, which is:

\begin{quote}
Suppose that \(f(x) = x^T Q x\), where \(Q\) is an \(n \times n\)
symmetric positive semi-definite matrix. Show that this function is
convex using the definition of convexity, which can be equivalently
reformulated as:
\[ f(y + \alpha(x-y)) - \alpha f(x) - (1-\alpha) f(y) \le 0 \] for all
\(0 \le \alpha \le 1\) and all \(x, y \in \RR^{n}\).
\end{quote}

This type of function will frequently arise in our subsequent studies,
so it's an important one to understand.
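A numerical spot-check of the reformulated inequality is a useful companion to the algebraic proof (it is not a substitute for it); a sketch in Python, assuming a random PSD \(Q\) built as \(A^T A\) from a fixed-seed random \(A\):

```python
import numpy as np

rng = np.random.default_rng(0)

# any matrix of the form A^T A is symmetric positive semi-definite
A = rng.standard_normal((4, 4))
Q = A.T @ A

def f(x):
    return x @ Q @ x

# spot-check the reformulated convexity inequality at random points
for _ in range(100):
    x = rng.standard_normal(4)
    y = rng.standard_normal(4)
    alpha = rng.uniform(0.0, 1.0)
    lhs = f(y + alpha * (x - y)) - alpha * f(x) - (1 - alpha) * f(y)
    assert lhs <= 1e-9  # nonpositive (up to rounding) because Q is PSD
```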

\subsection{\texorpdfstring{\textbf{Optional} Problem 5: Fool an
LLM}{Optional Problem 5: Fool an LLM}}\label{optional-problem-5-fool-an-llm}

Can you find a function \(f\) where the LLM gives you the wrong gradient
when you ask it to solve a problem like 1 or 2?

\end{document}
