Computational Methods in Optimization

David Gleich

Purdue University

Spring 2017

Course number CS-52000

Tuesday and Thursday, 1:30-2:45pm

Location Lawson B134

Homework 3

Please answer the following questions in complete sentences and submit your solution on Blackboard by the due date listed there.

Problem 0: List your collaborators.

Please identify anyone, whether or not they are in the class, with whom you discussed your homework. This problem is worth 1 point, but on a multiplicative scale.

Problem 2: Convexity and least squares

  1. Show that f(x) = ‖Ax − b‖₂² is a convex function. Feel free to use the result proved on the last homework.

  2. Show that the null-space of a matrix is a convex set.
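A quick numerical illustration (not a proof) of the second claim: points in the null space of a matrix remain in the null space under convex combinations. The matrix A below is an arbitrary illustrative choice.

```python
import numpy as np

# Illustration: convex combinations of null-space points stay in the null space.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 6))   # wide matrix, so the null space is nontrivial

# Null-space basis from the SVD: right singular vectors beyond the rank.
_, _, Vt = np.linalg.svd(A)
N = Vt[3:].T                      # columns span null(A)

x = N @ rng.standard_normal(3)    # two points in the null space
y = N @ rng.standard_normal(3)
theta = 0.3
z = theta * x + (1 - theta) * y   # convex combination

print(np.allclose(A @ z, 0))      # z is still in the null space
```

The proof itself is one line: if Ax = 0 and Ay = 0, then A(θx + (1 − θ)y) = θAx + (1 − θ)Ay = 0.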

Problem 3: Ridge Regression

The Ridge Regression problem is a variant of least squares:

  minimize over x:  ‖Ax − b‖₂² + λ‖x‖₂²,  with λ > 0.

This is also known as Tikhonov regularization.

  1. Show that this problem always has a unique solution, for any b, if λ > 0, using the theory discussed in class so far.

    This property is one aspect of the reason that Ridge Regression is used. It is also a common regularization method that can help avoid overfitting in a regression problem.

  2. Use the SVD of A to characterize the solution as a function of λ.

  3. What is the solution when λ → 0?
    What is the solution when λ → ∞?
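As a numerical sanity check for part 2, the sketch below compares the ridge solution computed two ways, assuming the standard formulation minimize ‖Ax − b‖₂² + λ‖x‖₂²; the matrix A, vector b, and value of λ are illustrative choices, not data from the assignment.

```python
import numpy as np

# Illustrative ridge regression instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 4))
b = rng.standard_normal(10)
lam = 0.5

# Direct solution of the ridge normal equations (A'A + lam*I) x = A'b.
x_direct = np.linalg.solve(A.T @ A + lam * np.eye(4), A.T @ b)

# SVD-based solution: with A = U S V', x = V diag(s/(s^2 + lam)) U'b.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((s / (s**2 + lam)) * (U.T @ b))

print(np.allclose(x_direct, x_svd))  # the two characterizations agree
```

The factors σᵢ/(σᵢ² + λ) in the SVD form make the limiting behavior in part 3 easy to read off.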

Problem 4: Alternate formulations of Least Squares

Consider the constrained least squares problem: where , and rank .

  1. Convert this problem into the standard constrained least squares form.

  2. Form the augmented system from the Lagrangian as we did in class.

  3. Manipulate this problem to arrive at the normal equations for a least-squares problem: .
    Discuss any advantages of the systems at intermediate steps.
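Since the specific problem data did not survive in this copy, the following sketch uses the generic equality-constrained form minimize ‖Ax − b‖₂² subject to Cx = d (the names A, b, C, d are assumptions), and solves it through the augmented system that arises from the Lagrangian, as in part 2.

```python
import numpy as np

# Illustrative data: minimize ||A x - b||_2^2 subject to C x = d.
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 5))
b = rng.standard_normal(8)
C = rng.standard_normal((2, 5))   # full row rank with probability 1
d = rng.standard_normal(2)

# Augmented (KKT) system from the Lagrangian L(x, y) = ||Ax - b||^2 + y'(Cx - d):
#   [ A'A  C' ] [x]   [A'b]
#   [ C    0  ] [y] = [ d ]
# (a constant scaling of the multiplier y is absorbed here; x is unchanged).
n, p = A.shape[1], C.shape[0]
K = np.block([[A.T @ A, C.T],
              [C, np.zeros((p, p))]])
rhs = np.concatenate([A.T @ b, d])
z = np.linalg.solve(K, rhs)
x = z[:n]

print(np.allclose(C @ x, d))  # the constraint is satisfied
```

Forming A'A squares the condition number, which is one trade-off worth raising when discussing the intermediate systems in part 3.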