
# Homework 3

Please answer the following questions in complete sentences and submit your solutions on Blackboard by the due date posted there.

• Update 2016-01-28: Removed the old first question that was duplicated; hence questions start at 2.

## Problem 0: List your collaborators.

Please identify anyone, whether or not they are in the class, with whom you discussed your homework. This problem is worth 1 point, but on a multiplicative scale.

## Problem 2: Convexity and least squares

1. Show that $f(\vx) = \| \vb - \mA \vx \|^2$ is a convex function. Feel free to use the result proved on the last homework.

2. Show that the null-space of a matrix is a convex set.
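A quick numerical sanity check (not a proof) can build intuition for part 1: convexity of $f$ means Jensen's inequality $f(t\vx + (1-t)\vy) \le t f(\vx) + (1-t) f(\vy)$ holds for all $t \in [0,1]$. The matrix sizes and random data below are illustrative assumptions, not part of the assignment.

```python
# Sanity-check Jensen's inequality for f(x) = ||b - A x||^2 on random samples.
# Passing this check does not prove convexity; it only fails to refute it.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 4))   # illustrative sizes
b = rng.standard_normal(10)

def f(x):
    r = b - A @ x
    return r @ r

for _ in range(1000):
    x, y = rng.standard_normal(4), rng.standard_normal(4)
    t = rng.uniform()
    # Jensen's inequality, with a small tolerance for floating-point error
    assert f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-9
```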

## Problem 3: Ridge Regression

The Ridge Regression problem is a variant of least squares:
$$ \min_{\vx} \; \| \vb - \mA \vx \|^2 + \lambda \| \vx \|^2. $$
This is also known as Tikhonov regularization.

1. Show that this problem always has a unique solution for any $\mA$ when $\lambda > 0$, using the theory discussed in class so far.

This property is one reason that Ridge Regression is used. It is also a common regularization method that can help avoid overfitting in a regression problem.

2. Use the SVD of $\mA$ to characterize the solution as a function of $\lambda$.

3. What is the solution when $\lambda \to \infty$?
What is the solution when $\lambda \to 0$?
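As a way to check your answers to parts 2 and 3, the SVD-based formula $\vx(\lambda) = \sum_i \frac{\sigma_i}{\sigma_i^2 + \lambda} (\vu_i^T \vb)\, \vv_i$ can be compared numerically against a direct solve of $(\mA^T \mA + \lambda \mI)\vx = \mA^T \vb$. This is a sketch under assumed sizes and random data, not a solution to the proof questions.

```python
# Numerical check of the SVD characterization of the ridge solution.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))   # illustrative sizes
b = rng.standard_normal(8)

def ridge_svd(A, b, lam):
    # x(lam) = sum_i sigma_i / (sigma_i^2 + lam) * (u_i^T b) * v_i
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt.T @ ((s / (s**2 + lam)) * (U.T @ b))

lam = 0.5
x_direct = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ b)
assert np.allclose(ridge_svd(A, b, lam), x_direct)

# lambda -> 0 approaches the minimum-norm least-squares solution ...
assert np.allclose(ridge_svd(A, b, 1e-12), np.linalg.pinv(A) @ b)
# ... and lambda -> infinity drives the solution toward zero.
assert np.linalg.norm(ridge_svd(A, b, 1e12)) < 1e-9
```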

## Problem 4: Alternate formulations of Least Squares

Consider the constrained least squares problem, where $\mC \in \RR^{m \times n}$, $n \le m$, and $\mC$ has rank $n$.

1. Convert this problem into the standard constrained least squares form.

2. Form the augmented system from the Lagrangian as we did in class.

3. Manipulate this problem to arrive at the normal equations for a least-squares problem: $\mC^T \mC \vy = \mC^T \vb$.
Discuss any advantages of the systems at intermediate steps.
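The steps above can be checked numerically. Assuming the augmented system takes the standard form $\begin{pmatrix} \mI & \mC \\ \mC^T & 0 \end{pmatrix} \begin{pmatrix} \vr \\ \vy \end{pmatrix} = \begin{pmatrix} \vb \\ 0 \end{pmatrix}$ with residual $\vr = \vb - \mC\vy$ (your class may use a different sign convention), eliminating $\vr$ recovers the normal equations $\mC^T \mC \vy = \mC^T \vb$. The sizes and random data below are illustrative assumptions.

```python
# Solve the assumed augmented (KKT) system and compare y against a
# standard least-squares solve; the two should agree when C has full rank.
import numpy as np

rng = np.random.default_rng(2)
m, n = 6, 3
C = rng.standard_normal((m, n))   # full column rank with probability 1
b = rng.standard_normal(m)

K = np.block([[np.eye(m), C],
              [C.T, np.zeros((n, n))]])
sol = np.linalg.solve(K, np.concatenate([b, np.zeros(n)]))
r, y = sol[:m], sol[m:]

# Row 1 of K gives r + C y = b, so r is the residual; row 2 gives
# C^T r = 0, which is exactly the normal equations C^T C y = C^T b.
assert np.allclose(r, b - C @ y)
assert np.allclose(y, np.linalg.lstsq(C, b, rcond=None)[0])
```

One advantage of the intermediate augmented system is that it avoids explicitly forming $\mC^T \mC$, whose condition number is the square of that of $\mC$.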