
Computational methods in optimization

David Gleich

Purdue University

Spring 2012

Course number CS 59000-OPT

Tuesday and Thursday, 3:00-4:15pm

Lawson B134


Homework 5

Please answer the following questions in complete sentences in a typed manuscript and submit the solution to me in class on February 21st, 2012.

The previous homework focused heavily on implementations and on studying these methods in a practical setting. In this homework, we return our focus to theory.

Problem 0: List your collaborators.

Please identify anyone, whether or not they are in the class, with whom you discussed your homework.

Problem 1: Solving the trust region problem (Griva, Nash, and Sofer, 11.6.6)

In class, your professor mentioned that any solution to the trust region sub-problem,

\MINone{\vp}{f(\vx_k) + g(\vx_k)^T \vp + 1/2 \vp^T \mH(\vx_k) \vp \quad = \quad m_k(\vp)}{\normof{\vp} \le \Delta,}

could be characterized as follows: a point $\vp^*$ with $\normof{\vp^*} \le \Delta$ solves the subproblem if and only if there is a scalar $\lambda \ge 0$ such that

(\mH(\vx_k) + \lambda \mI) \vp^* = -g(\vx_k), \qquad \lambda (\Delta - \normof{\vp^*}) = 0, \qquad \mH(\vx_k) + \lambda \mI \text{ is positive semidefinite}.

Use these conditions to show that, if $\lambda > 0$, then $\vp^*$ solves:

\MINone{\vp}{f(\vx_k) + g(\vx_k)^T \vp + 1/2 \vp^T \mH(\vx_k) \vp}{\normof{\vp} = \Delta.}

Hint: First show that $m_k(\vp) \ge m_k(\vp^*) + \frac{\lambda}{2} \left( \normof{\vp^*}^2 - \normof{\vp}^2 \right)$ for any $\vp$.
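
You do not need any code for this problem, but as a minimal illustration of how the conditions are used computationally (the data and the function name active_trust_region_step below are made up for this sketch), here is a Python bisection for the case $\lambda > 0$:

    # Illustrative sketch only: when the constraint is active, the characterization
    # says (H + lam*I) p = -g with ||p|| = Delta, lam >= 0, and H + lam*I positive
    # semidefinite.  This locates such a lam by bisection on ||p(lam)|| > Delta.
    import numpy as np

    def active_trust_region_step(H, g, Delta, tol=1e-10):
        n = len(g)
        lo, hi = 0.0, 1e-3
        # Grow hi until the corresponding step falls inside the trust region.
        while np.linalg.norm(np.linalg.solve(H + hi * np.eye(n), -g)) > Delta:
            hi *= 2.0
        # Bisect to find lam with ||p(lam)|| = Delta (up to tol).
        while hi - lo > tol:
            lam = 0.5 * (lo + hi)
            p = np.linalg.solve(H + lam * np.eye(n), -g)
            lo, hi = (lam, hi) if np.linalg.norm(p) > Delta else (lo, lam)
        return np.linalg.solve(H + hi * np.eye(n), -g), hi

    # Made-up example: an indefinite Hessian, so the constraint must be active.
    H = np.array([[2.0, 0.0], [0.0, -1.0]])
    g = np.array([1.0, 1.0])
    p, lam = active_trust_region_step(H, g, Delta=0.5)
    print(np.linalg.norm(p), lam)   # expect norm(p) close to 0.5 and lam > 1

This naive bisection is only meant to expose the role of $\lambda$; practical trust-region codes locate $\lambda$ with safeguarded root-finding and handle degenerate cases the sketch ignores.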

Problem 2: Advantages and disadvantages

When using a trust-region method on a convex optimization problem, is a dog-leg method or a subspace minimization method the better choice? Discuss and explain why.
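
To fix ideas for the discussion, here is a minimal sketch of a dog-leg step (assuming a positive definite model Hessian, as in the convex case; the function name dogleg_step is just an illustrative label):

    # Rough dog-leg step sketch: combine the Cauchy (steepest-descent) minimizer
    # and the full Newton step.  Assumes B is symmetric positive definite.
    import numpy as np

    def dogleg_step(B, g, Delta):
        p_newton = np.linalg.solve(B, -g)        # full Newton step
        if np.linalg.norm(p_newton) <= Delta:
            return p_newton
        p_cauchy = -(g @ g) / (g @ B @ g) * g    # model minimizer along -g
        if np.linalg.norm(p_cauchy) >= Delta:
            return Delta * p_cauchy / np.linalg.norm(p_cauchy)
        # Walk along the dog-leg path: find tau in (0, 1) with
        # ||p_cauchy + tau * (p_newton - p_cauchy)|| = Delta.
        d = p_newton - p_cauchy
        a, b, c = d @ d, 2 * (p_cauchy @ d), p_cauchy @ p_cauchy - Delta ** 2
        tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
        return p_cauchy + tau * d

For comparison, a subspace minimization method would minimize the model over the span of the gradient direction and the Newton step rather than over this piecewise-linear path.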

Problem 3: Read and prove

Read the proof of Lemma 4.2. There is one missing step. Let $\mB$ be symmetric positive definite. Prove the inequality:

\frac{(\vg^T \vg)^2}{(\vg^T \mB \vg)(\vg^T \mB^{-1} \vg)} \le 1.

(Hint: Use Cauchy-Schwarz!) Can you remove any of the conditions from this proof? When do you have equality?
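
If you would like a sanity check before writing the proof (entirely optional; the random test below is illustrative and is not a substitute for the argument), a few trials in Python look like this:

    # Numerically spot-check (g'g)^2 / ((g'Bg)(g'B^{-1}g)) <= 1 on random
    # symmetric positive definite B and random g.  Not a proof.
    import numpy as np

    rng = np.random.default_rng(0)
    for _ in range(1000):
        n = int(rng.integers(2, 10))
        A = rng.standard_normal((n, n))
        B = A @ A.T + n * np.eye(n)      # symmetric positive definite by construction
        g = rng.standard_normal(n)
        ratio = (g @ g) ** 2 / ((g @ B @ g) * (g @ np.linalg.solve(B, g)))
        assert ratio <= 1 + 1e-12, ratio
    print("inequality held on all random trials")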