Computational methods in optimization

David Gleich

Purdue University

Spring 2012

Course number CS 52000

Tuesday and Thursday, 12:00-1:15pm

Lawson 1106


Homework 3

Please answer the following questions in complete sentences in a typed manuscript and submit the solution to me on Blackboard by 5pm on February 1st, 2012.

Homework 3: Optimality and Constraints

Problem 0: List your collaborators.

Please identify anyone, whether or not they are in the class, with whom you discussed your homework. This problem is worth 1 point, but on a multiplicative scale. (Note that collaboration is not allowed on the bonus question below.)

Problem 1: Optimization software and optimality

We’ll frequently be using software to optimize functions. This question will help familiarize you with two pieces of software: Poblano and the Matlab optimization toolbox.

The function we’ll study is the Rosenbrock function:

f(x) = 100(x_2 - x_1^2)^2 + (1-x_1)^2.

I briefly talked about this function in class and called it the banana function. Now it’s your turn to look at it!
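To get you started, here is a minimal Matlab sketch that evaluates the function on a grid and draws its contours; the grid ranges and contour levels are my own choices, so adjust them as needed.

    % Evaluate the Rosenbrock function on a grid and draw a contour plot.
    [X1, X2] = meshgrid(-2:0.05:2, -1:0.05:3);
    F = 100*(X2 - X1.^2).^2 + (1 - X1).^2;
    contour(X1, X2, F, logspace(-1, 3.5, 25)); % log-spaced levels reveal the curved valley
    xlabel('x_1'); ylabel('x_2');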

  1. Show a contour plot of this function.

  2. Write the gradient and Hessian of this function.

  3. By inspection, what is the minimizer of this function? (Feel free to find the answer by other means, e.g. looking it up, but make sure you explain why you know that answer must be a global minimizer.)

  4. Explain how any optimization package could tell that your solution is a local minimizer.

  5. Use Poblano to optimize this function starting from a few different points; be adversarial if you wish. Does it always get the correct answer? Use a table or figure to illustrate your findings if appropriate. Show your code for using Poblano; it should include comments explaining what it is doing, and you should explain any difficulties in implementing this test. (A sketch of the calling sequences appears after this list.)

  6. Read about Matlab’s fminunc function and determine how to provide it with gradient and Hessian information. Use this toolbox to optimize the function starting from a few different points. Show your code for using this function and explain any differences (if any) you observe in comparison with Poblano.
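For parts 5 and 6, the calling sequences look roughly like the sketch below. It assumes you have written a function rosenbrock.m that returns the value, gradient, and (for fminunc) Hessian of the function, and that Poblano is on your Matlab path; treat the option names as reminders to check against each package's documentation rather than as gospel.

    % Poblano's solvers (e.g., ncg) expect a handle returning [f, g].
    x0  = [-1.2; 1];             % a classical starting point
    out = ncg(@rosenbrock, x0);  % out.X holds the final iterate

    % fminunc: declare that the objective supplies its own derivatives.
    opts = optimset('GradObj', 'on', 'Hessian', 'on');
    [xstar, fstar] = fminunc(@rosenbrock, x0, opts);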

Problem 2: Log-barrier terms

The basis of a class of methods known as interior-point methods is that we can handle non-negativity constraints such as \vx \ge 0 by solving a sequence of unconstrained problems in which we add the log-barrier function b(\vx; \mu) = -\mu \sum_i \log(x_i) to the objective. Thus, we convert

\MINone{}{f(\vx)}{\vx \ge 0}

into

\MIN{}{f(\vx) + b(\vx; \mu).}
  1. Explain why this idea could work. (Hint: there’s a very useful picture you should probably show here!)

  2. Write a matrix expression for the gradient and Hessian of f(\vx) + b(\vx; \mu) in terms of the gradient vector \nabla f(\vx) and the Hessian matrix \nabla^2 f(\vx).
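To see why a shrinking barrier parameter recovers the constrained solution, here is a one-dimensional illustration of my own (not part of the assignment). For the problem of minimizing (x+1)^2 subject to x \ge 0, whose solution is x = 0, the barrier subproblem minimizes (x+1)^2 - \mu \log(x), and setting the derivative to zero gives the closed-form minimizer x(\mu) = (-1 + \sqrt{1 + 2\mu})/2.

    % Trace the central path for: min (x+1)^2 s.t. x >= 0.
    % Each barrier subproblem min (x+1)^2 - mu*log(x) is solved in closed form.
    mu  = logspace(1, -8, 10);
    xmu = (-1 + sqrt(1 + 2*mu))/2;
    loglog(mu, xmu, 'o-'); xlabel('\mu'); ylabel('x(\mu)');
    % As mu -> 0, x(mu) -> 0 from inside the feasible region.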

Problem 3: Do constraints always make a problem hard?

Let f(\vx) = \frac{1}{2} \vx^T \mG \vx + \vc^T \vx be a quadratic function. Recall that if \mG is positive semi-definite, then f is convex.

  1. Construct and illustrate an example in \mathbb{R}^2 to show that f is non-convex if \mG is indefinite.

  2. Now, suppose we consider the problem:

\MINone{}{f(\vx)}{\mA \vx = \vb.}

For this problem, show that any local minimizer is a global solution, even if f is non-convex.

  3. Is there always a local minimizer? Prove that there is, or show a counter-example. (Hint: there may be some ambiguity in this problem; if so, I’m looking for a discussion of what you think the right answer is, so your reasoning is more important than your actual answer.)

  4. Write down and test an algorithm to solve these problems and find a global minimizer. (A starting-point sketch follows this list.)
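For the algorithm in part 4, one standard starting point is to solve the first-order (KKT) conditions, which for this problem form a single linear system. The sketch below uses made-up data and my symbols \mG and \vc from above; it finds a stationary point and then checks the Hessian on a basis for the null space of \mA to decide whether that point is a minimizer.

    % Stationary point of  min 0.5*x'*G*x + c'*x  s.t.  A*x = b
    % via the KKT system  [G A'; A 0][x; lambda] = [-c; b].
    n = 4; m = 2;
    G = randn(n); G = (G + G')/2;   % symmetric, possibly indefinite
    c = randn(n, 1); A = randn(m, n); b = randn(m, 1);
    K = [G, A'; A, zeros(m)];
    sol = K \ [-c; b];
    x = sol(1:n); lambda = sol(n+1:end);
    norm(A*x - b)                   % feasibility check; should be ~ 0
    Z = null(A);                    % basis for the null space of A
    eig(Z'*G*Z)                     % nonnegative eigenvalues => x is a minimizer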

Problem 4: Constraints can make a non-smooth problem smooth.

Show that

\MIN{}{\normof{\mA \vx - \vb}_{\infty}}

can be reformulated as a smooth, constrained optimization problem.
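As a hint about the flavor of answer I expect (the auxiliary scalar t and the all-ones vector \ve below are my notation, not part of the problem): the infinity norm is the largest absolute residual, so bounding every residual by t and minimizing t over both \vx and t yields the linear, and hence smooth, program

\MINone{}{t}{-t \ve \le \mA \vx - \vb \le t \ve.}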