
Computational Methods in Optimization

David Gleich

Purdue University

Spring 2023

Course number CS-52000

Monday, Wednesday, Friday, 1:30-2:20pm

Location Lawson 1106


Homework 2

Please answer the following questions in complete sentences in a clearly prepared manuscript and submit the solution by the due date on Gradescope.

Remember that this is a graduate class. There may be elements of the problem statements that require you to fill in appropriate assumptions. You are also responsible for determining what evidence to include. An answer alone is rarely sufficient, but neither is an overly verbose description required. Use your judgement to focus your discussion on the most interesting pieces. The answer to "should I include 'something' in my solution?" will almost always be: Yes, if you think it helps support your answer.

Problem 0: Homework checklist

Problem 1: Optimization software

We'll frequently be using software to optimize functions; this question will help familiarize you with a piece of optimization software relevant to our study.

The function we'll study is the exponentiated Rosenbrock function:

  f(x1, x2) = exp( 100 (x2 - x1^2)^2 + (1 - x1)^2 )

  1. Show a contour plot of this function
  2. Write the gradient and Hessian of this function.
  3. By inspection, what is the minimizer of this function? (Feel free to find the answer by other means, e.g. looking it up, but make sure you explain why you know that answer must be a global minimizer.)
  4. Explain how any optimization package could tell that your solution is a local minimizer.

  5. Use Optim.jl (or Poblano for Matlab, or scipy.optimize for Python) to optimize this function starting from a few different points. Be adversarial if you wish. Does it always get the correct answer? Show your code. You must use a call where you provide the gradient.
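As one possible setup for step 5, here is a sketch in Python with scipy.optimize (rather than Optim.jl); the starting points are illustrative choices, kept near the origin because the exponential overflows for far-away points:

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # exponentiated Rosenbrock: exp of the classic Rosenbrock function
    r = 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    return np.exp(r)

def grad(x):
    # chain rule: grad exp(r(x)) = exp(r(x)) * grad r(x)
    r = 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    gr = np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                    200.0 * (x[1] - x[0]**2)])
    return np.exp(r) * gr

# try a few different starting points and report where BFGS lands
for x0 in [(0.0, 0.0), (1.2, 1.2), (-0.5, 0.5)]:
    res = minimize(f, np.array(x0), jac=grad, method="BFGS")
    print("start", x0, "-> minimizer", res.x, "in", res.nit, "iterations")
```

The `jac=grad` argument supplies the analytic gradient, as the problem requires.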

Problem 2: Optimization software

Repeat all the steps from Problem 1 for the Booth function:

  f(x1, x2) = (x1 + 2 x2 - 7)^2 + (2 x1 + x2 - 5)^2

Using the same algorithm as in Problem 1, which function takes more iterations when starting from the point (3, 3)?
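For the Booth half of the comparison, a sketch in the same Python/scipy setup (the gradient below is worked out by hand and worth double-checking):

```python
import numpy as np
from scipy.optimize import minimize

def booth(x):
    return (x[0] + 2*x[1] - 7)**2 + (2*x[0] + x[1] - 5)**2

def booth_grad(x):
    # let a, b be the two linear residuals; grad = 2a * grad(a) + 2b * grad(b)
    a = x[0] + 2*x[1] - 7
    b = 2*x[0] + x[1] - 5
    return np.array([2*a + 4*b, 4*a + 2*b])

# start from (3, 3) and record the iteration count for the comparison
res = minimize(booth, np.array([3.0, 3.0]), jac=booth_grad, method="BFGS")
print("minimizer:", res.x, "iterations:", res.nit)
```

The iteration count is reported in `res.nit`, which is the quantity the problem asks you to compare.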

Problem 3: Optimization theory

Suppose that f : R -> R (i.e. f is univariate) and that f is four times continuously differentiable. Show that the following conditions imply that x* is a local minimizer of f:

i. f'(x*) = 0
ii. f''(x*) = 0
iii. f'''(x*) = 0
iv. f''''(x*) > 0

(For instance, f(x) = x^4 satisfies all four conditions at x* = 0.)

Problem 4: Convexity

Convex functions are all the rage these days, and one of the interests of students in this class. Let's do some matrix analysis to show that a function is convex. Solve problem 2.7 in the textbook, which is:

Suppose that f(x) = x^T A x, where A is an n x n symmetric positive semi-definite matrix. Show that this function is convex using the definition of convexity, which can be equivalently reformulated as: f(lambda x + (1 - lambda) y) <= lambda f(x) + (1 - lambda) f(y) for all x, y in R^n and all lambda in [0, 1].

This type of function will frequently arise in our subsequent studies, so it's an important one to understand.