$\newcommand{\eps}{\varepsilon} \newcommand{\kron}{\otimes} \DeclareMathOperator{\diag}{diag} \DeclareMathOperator{\trace}{trace} \DeclareMathOperator{\rank}{rank} \DeclareMathOperator*{\minimize}{minimize} \DeclareMathOperator*{\maximize}{maximize} \DeclareMathOperator{\subjectto}{subject to} \newcommand{\mat}[1]{\boldsymbol{#1}} \renewcommand{\vec}[1]{\boldsymbol{\mathrm{#1}}} \newcommand{\vecalt}[1]{\boldsymbol{#1}} \newcommand{\conj}[1]{\overline{#1}} \newcommand{\normof}[1]{\|#1\|} \newcommand{\onormof}[2]{\|#1\|_{#2}} \newcommand{\MIN}[2]{\begin{array}{ll} \minimize_{#1} & {#2} \end{array}} \newcommand{\MINone}[3]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\MINthree}[5]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \\ & {#4} \\ & {#5} \end{array}} \newcommand{\MAX}[2]{\begin{array}{ll} \maximize_{#1} & {#2} \end{array}} \newcommand{\MAXone}[3]{\begin{array}{ll} \maximize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\itr}[2]{#1^{(#2)}} \newcommand{\itn}[1]{^{(#1)}} \newcommand{\prob}{\mathbb{P}} \newcommand{\probof}[1]{\prob\left\{ #1 \right\}} \newcommand{\pmat}[1]{\begin{pmatrix} #1 \end{pmatrix}} \newcommand{\bmat}[1]{\begin{bmatrix} #1 \end{bmatrix}} \newcommand{\spmat}[1]{\left(\begin{smallmatrix} #1 \end{smallmatrix}\right)} \newcommand{\sbmat}[1]{\left[\begin{smallmatrix} #1 \end{smallmatrix}\right]} \newcommand{\RR}{\mathbb{R}} \newcommand{\CC}{\mathbb{C}} \newcommand{\eye}{\mat{I}} \newcommand{\mA}{\mat{A}} \newcommand{\mB}{\mat{B}} \newcommand{\mC}{\mat{C}} \newcommand{\mD}{\mat{D}} \newcommand{\mE}{\mat{E}} \newcommand{\mF}{\mat{F}} \newcommand{\mG}{\mat{G}} \newcommand{\mH}{\mat{H}} \newcommand{\mI}{\mat{I}} \newcommand{\mJ}{\mat{J}} \newcommand{\mK}{\mat{K}} \newcommand{\mL}{\mat{L}} \newcommand{\mM}{\mat{M}} \newcommand{\mN}{\mat{N}} \newcommand{\mO}{\mat{O}} \newcommand{\mP}{\mat{P}} \newcommand{\mQ}{\mat{Q}} \newcommand{\mR}{\mat{R}} \newcommand{\mS}{\mat{S}} 
\newcommand{\mT}{\mat{T}} \newcommand{\mU}{\mat{U}} \newcommand{\mV}{\mat{V}} \newcommand{\mW}{\mat{W}} \newcommand{\mX}{\mat{X}} \newcommand{\mY}{\mat{Y}} \newcommand{\mZ}{\mat{Z}} \newcommand{\mLambda}{\mat{\Lambda}} \newcommand{\mSigma}{\ensuremath{\mat{\Sigma}}} \newcommand{\mPbar}{\bar{\mP}} \newcommand{\ones}{\vec{e}} \newcommand{\va}{\vec{a}} \newcommand{\vb}{\vec{b}} \newcommand{\vc}{\vec{c}} \newcommand{\vd}{\vec{d}} \newcommand{\ve}{\vec{e}} \newcommand{\vf}{\vec{f}} \newcommand{\vg}{\vec{g}} \newcommand{\vh}{\vec{h}} \newcommand{\vi}{\vec{i}} \newcommand{\vj}{\vec{j}} \newcommand{\vk}{\vec{k}} \newcommand{\vl}{\vec{l}} \newcommand{\vm}{\vec{m}} \newcommand{\vn}{\vec{n}} \newcommand{\vo}{\vec{o}} \newcommand{\vp}{\vec{p}} \newcommand{\vq}{\vec{q}} \newcommand{\vr}{\vec{r}} \newcommand{\vs}{\vec{s}} \newcommand{\vt}{\vec{t}} \newcommand{\vu}{\vec{u}} \newcommand{\vv}{\vec{v}} \newcommand{\vw}{\vec{w}} \newcommand{\vx}{\vec{x}} \newcommand{\vy}{\vec{y}} \newcommand{\vz}{\vec{z}} \newcommand{\vpi}{\vecalt{\pi}} \newcommand{\vlambda}{\vecalt{\lambda}}$

# Homework 11

Please answer the following questions in complete sentences in a typed, clearly prepared manuscript. This homework is due by Monday, December 4th, 2017, 4:30am. However, everyone is given an automatic, full-points extension until Wednesday, December 6th, 2017 at 4am if you want your homework graded by December 7th in class. This is designed to give you maximum flexibility in order to plan your time. We highly suggest sticking to the original due date, but there is absolutely no penalty for taking more time.

## Problem 0: Homework checklist

• Please identify anyone, whether or not they are in the class, with whom you discussed your homework. This problem is worth 1 point, but on a multiplicative scale.

• Make sure you have included your source-code and prepared your solution according to the most recent Piazza note on homework submissions.

## Problem 1: Using GMRES

This problem is based on an assignment from Michael Saunders' iterative methods class.

Download the poisson2D.mat file from Tim Davis's sparse matrix repository (the SuiteSparse Matrix Collection).
From Julia, you can use the following code to set up $\mA$ and $\vb$ from the linear system.

    using MAT
    file = matread("poisson2D.mat")
    A = file["Problem"]["A"]
    A = (A + A')./2 # make it numerically symmetric
    b = file["Problem"]["b"]
    b = vec(b)


The matrix $\mA$ is symmetric positive definite. But we are going to look at solving $(\mA - \sigma \mI) \vx = \vb$ where $\sigma = 1.7 \times 10^{-2}$. We'll declare a method converged when the relative residual has $2$-norm less than $10^{-8}$.
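As a sketch of the setup, here is how the shifted system and the convergence criterion look in code. The tridiagonal matrix below is only a small stand-in so the snippet runs on its own; in the homework you would use the downloaded `A` and `b` instead, and the direct solve is only a reference point for what the iterative methods should achieve.

    using SparseArrays, LinearAlgebra

    # Small stand-in SPD matrix; replace with the poisson2D A and b.
    n = 100
    A = spdiagm(-1 => -ones(n-1), 0 => 2*ones(n), 1 => -ones(n-1))
    b = ones(n)

    σ = 1.7e-2
    As = A - σ*I        # the shifted matrix A - σI (still sparse)

    x = As \ b          # direct reference solve
    relres = norm(b - As*x) / norm(b)
    # the iterative methods should be run until relres < 1e-8

Note that the shift can make `As` indefinite even though `A` is positive definite, which is exactly why the behavior of MINRES and GMRES on this system is interesting.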

1. Plot the relative residuals for 100 iterations of the MINRES method (minres from IterativeSolvers.jl) and the GMRES method (gmres from IterativeSolvers.jl) restarted every 30 iterations on a semilogy scale. Describe what you observe; in particular, do the methods converge? For convenience, here is some code to help you get started:

    using IterativeSolvers
    x = minres(A, b, tol = mytol, maxiter = mymaxiter)
    x = gmres(A, b, restart = myrestart, tol = mytol, maxiter = mymaxiter)
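To plot residuals you need the per-iteration history, not just the final solution. In IterativeSolvers this is available through the `log = true` keyword, which returns a `ConvergenceHistory` alongside the solution; the keyword and the `:resnorm` key follow the current package documentation and may differ in older versions. A sketch on a small stand-in system:

    using IterativeSolvers, SparseArrays, LinearAlgebra

    # Small stand-in system; replace with A - σ*I and b from the problem.
    n = 100
    A = spdiagm(-1 => -ones(n-1), 0 => 2*ones(n), 1 => -ones(n-1))
    b = ones(n)

    # log = true returns a ConvergenceHistory along with the solution
    x, hist = gmres(A, b; restart = 30, maxiter = 100, log = true)
    resnorms = hist[:resnorm]        # residual norms at each iteration
    relres = resnorms ./ norm(b)     # relative residuals for the semilogy plot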

2. If you use an incomplete Cholesky (ichol) or incomplete LU (ilu) preconditioner with GMRES, it will converge. How many iterations does it take? Does MINRES benefit from an incomplete Cholesky preconditioner? We provide you with the ichol and ilu code: download the file hw11_functions.jl from http://www.cs.purdue.edu/homes/dgleich/cs515-2017/julia/hw11_functions.jl. The IterativeSolvers package still has some rough edges, so we will help you set that up. Here is some code to help you get started:

    using IterativeSolvers
    include("hw11_functions.jl")
    Lchol = ichol(A)
    M = MyPreconditionerType(Lchol)
    x = gmres(A, b, restart = 30, tol = mytol, maxiter = mymaxiter, Pl = M)


For MINRES, it turns out that the minres function in IterativeSolvers still doesn't accept preconditioners, so we use another package to test minres:

    # Note that the Krylov package is not a registered Julia package
    # yet, but you can still add it as follows.
    Pkg.clone("https://github.com/JuliaSmoothOptimizers/Krylov.jl.git")
    using Krylov
    using LinearOperators
    L = ichol(A)
    P1 = opInverse(LowerTriangular(L))
    Mprecond = P1'*P1
    x = Krylov.minres(A, b, M = Mprecond, itmax = mymaxiter)[1]

3. Show the eigenvalues of the matrix before and after preconditioning. Do you observe any clustering?
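To see what "before and after preconditioning" means concretely, here is a sketch on a small dense stand-in matrix. Because hw11_functions.jl is not assumed available here, the sketch substitutes an exact Cholesky factor for `ichol`; with the exact factor the preconditioned spectrum collapses to a single point at $1$, while an incomplete factor would instead give a cluster near $1$.

    using LinearAlgebra, SparseArrays

    # Small stand-in matrix; replace with the shifted poisson2D matrix.
    n = 50
    T = spdiagm(-1 => -ones(n-1), 0 => 2*ones(n), 1 => -ones(n-1))
    As = Matrix(T)        # dense copy so we can call eigvals

    # Stand-in preconditioner: an exact Cholesky factor L of As;
    # with hw11_functions.jl you would use L = ichol(A) instead.
    L = cholesky(As).L

    λ_before = eigvals(Symmetric(As))
    # eigenvalues of the symmetrically preconditioned matrix L⁻¹ As L⁻ᵀ
    λ_after  = eigvals(Symmetric(L \ As / L'))

Plotting `λ_before` and `λ_after` side by side (e.g. with `scatter` on a log scale) makes any clustering easy to see.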

## Problem 2: (Trefethen and Bau, 28.3)

A real symmetric matrix $\mA$ has an eigenvalue $1$ of multiplicity $8$, while the rest of the eigenvalues are all $\le 0.1$ in absolute value. Describe an algorithm for finding an orthonormal basis of the 8-dimensional eigenspace corresponding to the dominant eigenvalue.