Computational methods in optimization

David Gleich

Purdue University

Spring 2012

Course number CS 59000-OPT

Tuesday and Thursday, 3:00-4:15pm

Lawson B134


Homework 2

Please answer the following questions in complete sentences in a typed manuscript and submit the solution to me in class on January 31st, 2012.

In this homework, we probe a bit of the theory around the topics we’ve covered in class so far. The goal is not to provide tedious work, but useful results that we’ll rely upon later. For instance, the first problem tackles subspaces and null-spaces, which will frequently arise in our discussion of constrained optimization. The second problem asks you to use the tools from class to discuss a variation on least squares that arises in many applications. The third problem deals with the convexity of a subspace in relation to the constrained least squares problem. Finally, the fourth problem shows a different way of looking at a least squares problem.

Problem 1: Constrained least squares theory

Let $\mA \in \mathbb{R}^{m \times n}$ and $\mB \in \mathbb{R}^{p \times n}$. Consider the constrained least squares problem:

\MINone{\vx}{\| \vb - \mA \vx \|_2}{\mB \vx = \vd}.
  1. Show that $\operatorname{rank}\left(\begin{bmatrix} \mA \\ \mB \end{bmatrix}\right) = n$ is equivalent to the condition that the intersection of the null-spaces of $\mA$ and $\mB$ contains only the trivial vector $0$.

  2. Show that the problem may not have any solution if $\operatorname{rank}(\mB) < p$.

  3. Show that the problem always has a unique solution, for any $\vb$ and $\vd$, when $\operatorname{rank}(\mB) = p$ and $\operatorname{rank}\left(\begin{bmatrix} \mA \\ \mB \end{bmatrix}\right) = n$.

  4. Show that if the problem has a unique solution for any $\vb$ and $\vd$, then $\operatorname{rank}(\mB) = p$ and $\operatorname{rank}\left(\begin{bmatrix} \mA \\ \mB \end{bmatrix}\right) = n$. (You can combine this with the proof of the previous part if you wish.)
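
As a numerical aside (a sketch only, not part of the assignment): the snippet below uses numpy with arbitrary sizes and random data to check that the rank condition on the stacked matrix agrees with triviality of the intersection of the null-spaces, as in parts 1, 3, and 4 above.

```python
# Numerical sanity check (not a proof): rank([A; B]) = n should hold
# exactly when null(A) and null(B) intersect only in the zero vector.
import numpy as np

def nullspace(M, tol=1e-10):
    """Orthonormal basis for the null-space of M, via the SVD."""
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T          # columns span null(M)

rng = np.random.default_rng(0)
m, p, n = 4, 2, 6               # arbitrary sizes
A = rng.standard_normal((m, n))
B = rng.standard_normal((p, n))

NA, NB = nullspace(A), nullspace(B)
# dim(null A ∩ null B) = dim null A + dim null B - dim(null A + null B)
dim_sum = np.linalg.matrix_rank(np.hstack([NA, NB]))
trivial_intersection = (NA.shape[1] + NB.shape[1] - dim_sum) == 0

full_rank = np.linalg.matrix_rank(np.vstack([A, B])) == n
print(full_rank, trivial_intersection)   # the two tests agree: True True
```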

Problem 2: Ridge Regression

The Ridge Regression problem is a variant of least squares:

\MIN{\vx}{\| \vb - \mA \vx \|_2^2 + \lambda \| \vx \|_2^2}.

This is also known as Tikhonov regularization.
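
As a hedged illustration of why the first part below works out: for $\lambda > 0$, the matrix $\mA^T \mA + \lambda \mI$ is symmetric positive definite regardless of the rank of $\mA$, so the ridge normal equations $(\mA^T \mA + \lambda \mI) \vx = \mA^T \vb$ always have exactly one solution. The numpy sketch below (sizes and data are arbitrary) solves them even for a deliberately rank-deficient $\mA$.

```python
# Ridge regression via its normal equations: (A^T A + lambda I) x = A^T b.
# For lambda > 0 the matrix is symmetric positive definite even if A is
# rank-deficient, so the solve below cannot fail.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((10, 4))
A[:, 3] = A[:, 0]               # make A deliberately rank-deficient
b = rng.standard_normal(10)
lam = 0.1

x = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)
print(x)
```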

  1. Show that this problem always has a unique solution, for any $\vb$, if $\lambda > 0$, using the theory discussed in class so far.

This property is one aspect of the reason that Ridge Regression is used. It is also a common regularization method that can help avoid overfitting in a regression problem.

  2. Use the SVD of $\mA$ to characterize the solution as a function of $\lambda$.

  3. What is the solution when $\lambda \to 0$?
    What is the solution when $\lambda \to \infty$?
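
As a numerical companion to the last two parts (a sketch with arbitrary data, not a solution): the ridge solution can be computed through the thin SVD, where each singular direction is scaled by $\sigma_i / (\sigma_i^2 + \lambda)$; the behavior of this scaling in the two limits is exactly what the last part asks about.

```python
# Ridge solution through the thin SVD A = U diag(s) V^T:
#   x(lambda) = V diag(s_i / (s_i^2 + lambda)) U^T b.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((10, 4))
b = rng.standard_normal(10)
lam = 0.1

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((s / (s**2 + lam)) * (U.T @ b))

# Agrees with the normal-equations solve to rounding error.
x_ne = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)
print(np.allclose(x_svd, x_ne))  # expect: True
```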

Problem 3: Convexity

  1. Show that $f(\vx) = \| \vb - \mA \vx \|_2$ is a convex function. Feel free to use the result proved on the last homework.

  2. Show that the null-space of a matrix is a convex set.

  3. Explain why these results allow us to conclude that constrained least squares is a convex optimization problem. (See Wikipedia for the definition of a convex problem.)
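
A numerical spot-check, not a proof: the sketch below samples random points and tests the convexity inequality $f(\theta \vx + (1-\theta)\vy) \le \theta f(\vx) + (1-\theta) f(\vy)$ for $f(\vx) = \|\vb - \mA \vx\|_2$. Passing proves nothing, but a single violation would disprove convexity; the data and sizes are arbitrary.

```python
# Spot-check convexity of f(x) = ||b - A x||_2 at random points; success
# proves nothing, but any violation would disprove convexity.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)
f = lambda x: np.linalg.norm(b - A @ x)

ok = True
for _ in range(1000):
    x, y = rng.standard_normal(3), rng.standard_normal(3)
    t = rng.random()
    ok &= f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-12
print(ok)   # expect: True
```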

Problem 4: Alternate formulations of Least Squares

Consider the constrained least squares problem:

\MINone{\vr,\vy}{ \| \vr \|_2 } { \vr = \vb - \mC\vy }

where $\vb \in \mathbb{R}^{m}$, $\mC \in \mathbb{R}^{m \times n}$, and $\operatorname{rank}(\mC) = n$.

  1. Convert this problem into the standard constrained least squares form.

  2. Form the augmented system from the Lagrangian as we did in class.

  3. Manipulate this problem to arrive at the normal equations for a least-squares problem: $\mC^T \mC \vy = \mC^T \vb$.
    Discuss any advantages of the systems at intermediate steps.
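
As a hedged numerical check of the last two steps (arbitrary data; the Lagrange multiplier equals $-\vr$ under one sign convention): eliminating the multiplier from the stationarity conditions of the Lagrangian leaves the block system below, and eliminating $\vr = \vb - \mC \vy$ from it recovers the normal equations.

```python
# Augmented system for  min ||r||_2  s.t.  r = b - C y  (multiplier
# eliminated under one sign convention):
#   [ I    C ] [r]   [b]
#   [ C^T  0 ] [y] = [0]
# Substituting r = b - C y into the second block row gives C^T C y = C^T b.
import numpy as np

rng = np.random.default_rng(3)
m, n = 8, 3
C = rng.standard_normal((m, n))
b = rng.standard_normal(m)

K = np.block([[np.eye(m), C],
              [C.T, np.zeros((n, n))]])
rhs = np.concatenate([b, np.zeros(n)])
y_aug = np.linalg.solve(K, rhs)[m:]

y_ne = np.linalg.solve(C.T @ C, C.T @ b)   # normal equations
print(np.allclose(y_aug, y_ne))            # expect: True
```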