Please answer the following questions in complete sentences in a typed manuscript and submit the solution to me in class on February 21st, 2012.
The previous homework focused heavily on implementations and on studying these methods in a practical setting. In this homework, we return our focus to theory.
Please identify anyone, whether or not they are in the class, with whom you discussed your homework.
In class, your professor mentioned that any solution to the trust region sub-problem,
\MINone{\vp}{f(\vx_k) + g(\vx_k)^T \vp + \frac{1}{2} \vp^T \mH(\vx_k) \vp \quad = \quad m_k(\vp)}{\normof{\vp} \le \Delta,}
could be characterized as follows: a vector $\vp^*$ with $\normof{\vp^*} \le \Delta$ is a global solution if and only if there is a scalar $\lambda \ge 0$ such that
\begin{align*}
(\mH(\vx_k) + \lambda I) \vp^* &= -g(\vx_k), \\
\lambda \, (\Delta - \normof{\vp^*}) &= 0,
\end{align*}
and $\mH(\vx_k) + \lambda I$ is positive semidefinite, where $I$ denotes the identity matrix.
Use these conditions to show that, if $\lambda > 0$, then $\vp^*$ solves:
\MINone{\vp}{f(\vx_k) + g(\vx_k)^T \vp + \frac{1}{2} \vp^T \mH(\vx_k) \vp}{\normof{\vp} = \Delta.}
Hint: First show that $m_k(\vp) \ge m_k(\vp^*) + \frac{\lambda}{2} \left( \normof{\vp^*}^2 - \normof{\vp}^2 \right)$ for any $\vp$.
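The claim can be checked numerically on a small instance. The sketch below is illustrative (the matrix, gradient, and radius are our own choices, not part of the assignment): when the Newton step lies outside the trust region, a positive multiplier $\lambda$ is needed, and bisection on $\lambda$ in $(\mH + \lambda I)\vp = -\vg$ drives the step length onto the boundary $\normof{\vp} = \Delta$.

```python
import numpy as np

# Hypothetical small trust-region subproblem (names are illustrative):
# minimize g^T p + (1/2) p^T H p subject to ||p|| <= Delta.
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])   # symmetric positive definite model Hessian
g = np.array([1.0, 2.0])
Delta = 0.25                  # small radius, so the constraint is active

def p_of_lambda(lam):
    """Solve (H + lam*I) p = -g for the candidate step."""
    return np.linalg.solve(H + lam * np.eye(2), -g)

# The unconstrained Newton step violates the radius, so lambda = 0 cannot work.
assert np.linalg.norm(p_of_lambda(0.0)) > Delta

# Bisection on lambda: ||p(lambda)|| decreases monotonically in lambda.
lo, hi = 0.0, 1e6
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if np.linalg.norm(p_of_lambda(mid)) > Delta:
        lo = mid          # step still too long: increase lambda
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = p_of_lambda(lam)
print(lam > 0, abs(np.linalg.norm(p) - Delta) < 1e-8)
```

The run illustrates the complementarity condition: since $\normof{\vp} < \Delta$ fails at $\lambda = 0$, the solver settles on $\lambda > 0$ with the step exactly on the boundary.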
Is the dog-leg method or a subspace minimization method better when using a trust-region method on a convex optimization problem? Discuss and explain your reasoning.
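One way to build intuition for this question is a small numerical comparison. The sketch below uses our own illustrative data (not from the assignment): it computes the dog-leg point for a strictly convex quadratic model and compares its model value against the constrained minimizer. In this two-dimensional example the subspace $\mathrm{span}\{\vg, \mH^{-1}\vg\}$ is the whole plane, so the subspace minimizer coincides with the exact solution of the subproblem.

```python
import numpy as np

# Illustrative convex quadratic model: m(p) = g^T p + (1/2) p^T H p.
H = np.array([[5.0, 1.0],
              [1.0, 2.0]])    # symmetric positive definite
g = np.array([1.0, -1.0])
Delta = 0.3

def model(p):
    return g @ p + 0.5 * p @ H @ p

pN = np.linalg.solve(H, -g)                  # full Newton step
pU = -(g @ g) / (g @ H @ g) * g              # unconstrained steepest-descent minimizer

# Dog-leg point: follow the steepest-descent leg, then bend toward the
# Newton step, clipping at the trust-region radius.
if np.linalg.norm(pN) <= Delta:
    p_dog = pN
elif np.linalg.norm(pU) >= Delta:
    p_dog = Delta * pU / np.linalg.norm(pU)  # first leg, clipped at the radius
else:
    d = pN - pU                              # second-leg direction
    a, b, c = d @ d, 2 * pU @ d, pU @ pU - Delta**2
    t = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    p_dog = pU + t * d                       # boundary crossing on second leg

# Brute-force the boundary of the trust region (the Newton step is outside,
# so the constrained minimizer lies on the boundary).
thetas = np.linspace(0.0, 2.0 * np.pi, 20001)
boundary = Delta * np.stack([np.cos(thetas), np.sin(thetas)], axis=1)
m_sub = min(model(p) for p in boundary)
print(m_sub <= model(p_dog))   # subspace minimizer is at least as good
```

The dog-leg path lies inside $\mathrm{span}\{\vg, \mH^{-1}\vg\}$, so minimizing over that subspace can never do worse than the dog-leg point; the experiment shows the gap can be strict.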
Read the proof of Lemma 4.2. There is one missing step. Let $\mB$ be symmetric positive definite. Prove the inequality:
\frac{(\vg^T \vg)^2}{(\vg^T \mB \vg)(\vg^T \mB^{-1} \vg)} \le 1.
(Hint: Use Cauchy-Schwarz!) Can you remove any of the conditions from this proof? When do you have equality?