
Matrix Computations

David Gleich

Purdue University

Fall 2017

Course number CS-51500

Tuesday and Thursday, 10:30-11:45am

Location Forney B124


Homework 10

Please answer the following questions in complete sentences in a typed, clearly prepared manuscript. This homework is due by Monday, November 20th, 2017, 4:30am. However, everyone is given an automatic, full-points extension until Wednesday, November 22nd, 2017 at 4pm. This is designed to give you maximum flexibility in planning your time. We highly suggest sticking to the original due date, but there is absolutely no penalty for taking more time.

Problem 0: Homework checklist

Problem 1: Arnoldi and the Krylov Matrix

Denote the $k$th Krylov subspace by $\mathcal{K}_k(A,b) = \operatorname{span}\{b, Ab, A^2 b, \ldots, A^{k-1} b\}$. In $k$ steps of the Arnoldi process we construct an orthogonal matrix $V_k$ whose columns form an orthonormal basis for $\mathcal{K}_k(A,b)$. Define the $k$th Krylov matrix as $K_k = [\, b \;\; Ab \;\; A^2 b \;\; \cdots \;\; A^{k-1} b \,]$. Suppose we compute the QR factorization $K_k = Q_k R_k$. Show that, up to signs, the matrix $V_k$ from Arnoldi and the orthogonal matrix $Q_k$ in the QR factorization are the same.
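The following is a numerical sanity check of this claim (not a proof): it runs a basic Arnoldi iteration, forms the Krylov matrix in the monomial basis, and compares the two orthogonal factors after fixing the column signs. The matrix size, step count, and the use of numpy are illustrative assumptions.

    import numpy as np

    def arnoldi(A, b, k):
        """Run k steps of Arnoldi; return V with k orthonormal columns spanning K_k(A,b)."""
        n = len(b)
        V = np.zeros((n, k + 1))
        H = np.zeros((k + 1, k))
        V[:, 0] = b / np.linalg.norm(b)
        for j in range(k):
            w = A @ V[:, j]
            for i in range(j + 1):          # orthogonalize against previous basis vectors
                H[i, j] = V[:, i] @ w
                w -= H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            V[:, j + 1] = w / H[j + 1, j]
        return V[:, :k]

    np.random.seed(0)
    n, k = 8, 5
    A = np.random.rand(n, n)
    b = np.random.rand(n)

    V = arnoldi(A, b, k)
    K = np.column_stack([np.linalg.matrix_power(A, j) @ b for j in range(k)])  # Krylov matrix
    Q, R = np.linalg.qr(K)

    # Fix the sign ambiguity column by column, then compare the two bases.
    signs = np.sign(np.sum(V * Q, axis=0))
    print(np.allclose(V, Q * signs))        # expect True (up to rounding)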

Problem 2: GMRES and polynomials

It turns out that the GMRES solution to $Ax = b$ at step $k$, $x^{(k)}$, can be viewed as constructing a degree $k-1$ polynomial $q$ such that $\|b - A q(A) b\|_2$ is minimized. Think of it this way: since $x^{(k)}$ is in $\mathcal{K}_k(A,b)$, we know $x^{(k)} = c_1 b + c_2 Ab + \cdots + c_k A^{k-1} b$ for some scalars $c_1, \ldots, c_k$. If we define the polynomial $q(z) = c_1 + c_2 z + \cdots + c_k z^{k-1}$, then we can express $x^{(k)} = q(A) b$.

Then since GMRES constructed $x^{(k)}$ to minimize $\|b - A x^{(k)}\|_2$, we know that $q$ is the degree $k-1$ polynomial that minimizes $\|b - A q(A) b\|_2$, since $b - A x^{(k)} = b - A q(A) b$.
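A small illustration of this polynomial viewpoint appears below. It is an assumed setup, not the course's reference implementation: rather than running Arnoldi-based GMRES, it solves the equivalent (but numerically less stable) least-squares problem in the monomial Krylov basis, recovers the coefficients $c_1, \ldots, c_k$ of $q$, and confirms that $x^{(k)} = q(A) b$.

    import numpy as np

    np.random.seed(1)
    n, k = 10, 4
    A = np.random.rand(n, n) + n * np.eye(n)
    b = np.random.rand(n)

    # Krylov matrix K_k = [b, Ab, ..., A^{k-1} b]; any x in K_k(A,b) is x = K_k c.
    K = np.column_stack([np.linalg.matrix_power(A, j) @ b for j in range(k)])

    # Solve min_c ||b - A K_k c||_2; the entries of c are the coefficients of q.
    c, *_ = np.linalg.lstsq(A @ K, b, rcond=None)

    # Evaluate q(A) b by Horner's rule and confirm it matches x^(k) = K_k c.
    qAb = np.zeros(n)
    for ci in reversed(c):
        qAb = A @ qAb + ci * b
    print(np.allclose(K @ c, qAb))            # same iterate, two ways
    print(np.linalg.norm(b - A @ (K @ c)))    # the minimized residual norm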

  1. Suppose GMRES converged to the exact solution of the system $Ax = b$ in $k$ iterations, i.e. it constructed $x^{(k)}$ in $\mathcal{K}_k(A,b)$ such that $x^{(k)}$ is the exact solution to $Ax = b$. Show that there exists a degree $k$ polynomial $p$ with $p(0) = 1$ such that $p(A) b = 0$.

  2. Let $A = I + e e^T$ be a square matrix (where $e$ is the column vector of all 1s of appropriate dimensions). Suppose we use GMRES to solve the system $Ax = b$. In exact arithmetic, how many steps will GMRES take to converge to the exact solution? (Hint: figure out how to use part 1.) Be sure to explain how you determine the answer. Note, your result should be self-contained and not utilize the theory discussed in class -- this is good practice for the final; if you do use a theorem to justify your answer, you may lose a few points.

  3. Write the solution from part 2 in terms of $b$ and $e$. Explain how you determine it. A brief numerical sketch of this setup follows the list.
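The sketch below checks the reading of parts 2 and 3 used above; it assumes the matrix is $A = I + e e^T$ with $e$ the all-ones vector, so treat it as an illustration rather than the graded answer. It shows that the Krylov subspace stops growing after two vectors and that a closed form in $b$ and $e$ (via the Sherman-Morrison formula) solves the system.

    import numpy as np

    n = 6
    e = np.ones(n)
    A = np.eye(n) + np.outer(e, e)
    b = np.random.default_rng(2).random(n)

    # Part 2 flavor: A^2 b is already a combination of b and A b,
    # so the Krylov matrix [b, Ab, A^2 b] has rank 2.
    K3 = np.column_stack([b, A @ b, A @ A @ b])
    print(np.linalg.matrix_rank(K3))      # expect 2

    # Part 3 flavor: closed-form solution in terms of b and e
    # (Sherman-Morrison applied to I + e e^T).
    x = b - (e @ b) / (1 + n) * e
    print(np.allclose(A @ x, b))          # expect True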

Problem 3: Krylov-ish methods

Read about the steepest descent method in Golub and Van Loan, Section 11.3.2. Then consider Algorithm 11.3.1. Notice that the algorithm computes the residual at each iteration, although it appears there under a different name than $r$. If $r^{(k)}$ is the residual before a step and $r^{(k+1)}$ is the next residual computed, show that $r^{(k)}$ and $r^{(k+1)}$ are orthogonal.
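Below is a short sketch of steepest descent on a symmetric positive definite system (written from the standard description of the method, not copied from Golub and Van Loan's Algorithm 11.3.1); it checks numerically that consecutive residuals are orthogonal. The test matrix and iteration count are arbitrary choices.

    import numpy as np

    np.random.seed(3)
    n = 8
    M = np.random.rand(n, n)
    A = M @ M.T + n * np.eye(n)        # symmetric positive definite test matrix
    b = np.random.rand(n)

    x = np.zeros(n)
    r = b - A @ x                      # initial residual
    for k in range(10):
        Ar = A @ r
        alpha = (r @ r) / (r @ Ar)     # exact line search along the residual direction
        x = x + alpha * r
        r_new = b - A @ x              # next residual
        print(k, abs(r @ r_new))       # expect values near zero: consecutive residuals orthogonal
        r = r_new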