$\newcommand{\eps}{\varepsilon} \newcommand{\kron}{\otimes} \DeclareMathOperator{\diag}{diag} \DeclareMathOperator{\trace}{trace} \DeclareMathOperator{\rank}{rank} \DeclareMathOperator*{\minimize}{minimize} \DeclareMathOperator*{\maximize}{maximize} \DeclareMathOperator{\subjectto}{subject to} \newcommand{\mat}[1]{\boldsymbol{#1}} \renewcommand{\vec}[1]{\boldsymbol{\mathrm{#1}}} \newcommand{\vecalt}[1]{\boldsymbol{#1}} \newcommand{\conj}[1]{\overline{#1}} \newcommand{\normof}[1]{\|#1\|} \newcommand{\onormof}[2]{\|#1\|_{#2}} \newcommand{\MIN}[2]{\begin{array}{ll} \minimize_{#1} & {#2} \end{array}} \newcommand{\MINone}[3]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\MINthree}[5]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \\ & {#4} \\ & {#5} \end{array}} \newcommand{\MAX}[2]{\begin{array}{ll} \maximize_{#1} & {#2} \end{array}} \newcommand{\MAXone}[3]{\begin{array}{ll} \maximize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\itr}[2]{#1^{(#2)}} \newcommand{\itn}[1]{^{(#1)}} \newcommand{\prob}{\mathbb{P}} \newcommand{\probof}[1]{\prob\left\{ #1 \right\}} \newcommand{\pmat}[1]{\begin{pmatrix} #1 \end{pmatrix}} \newcommand{\bmat}[1]{\begin{bmatrix} #1 \end{bmatrix}} \newcommand{\spmat}[1]{\left(\begin{smallmatrix} #1 \end{smallmatrix}\right)} \newcommand{\sbmat}[1]{\left[\begin{smallmatrix} #1 \end{smallmatrix}\right]} \newcommand{\RR}{\mathbb{R}} \newcommand{\CC}{\mathbb{C}} \newcommand{\eye}{\mat{I}} \newcommand{\mA}{\mat{A}} \newcommand{\mB}{\mat{B}} \newcommand{\mC}{\mat{C}} \newcommand{\mD}{\mat{D}} \newcommand{\mE}{\mat{E}} \newcommand{\mF}{\mat{F}} \newcommand{\mG}{\mat{G}} \newcommand{\mH}{\mat{H}} \newcommand{\mI}{\mat{I}} \newcommand{\mJ}{\mat{J}} \newcommand{\mK}{\mat{K}} \newcommand{\mL}{\mat{L}} \newcommand{\mM}{\mat{M}} \newcommand{\mN}{\mat{N}} \newcommand{\mO}{\mat{O}} \newcommand{\mP}{\mat{P}} \newcommand{\mQ}{\mat{Q}} \newcommand{\mR}{\mat{R}} \newcommand{\mS}{\mat{S}} 
\newcommand{\mT}{\mat{T}} \newcommand{\mU}{\mat{U}} \newcommand{\mV}{\mat{V}} \newcommand{\mW}{\mat{W}} \newcommand{\mX}{\mat{X}} \newcommand{\mY}{\mat{Y}} \newcommand{\mZ}{\mat{Z}} \newcommand{\mLambda}{\mat{\Lambda}} \newcommand{\mSigma}{\ensuremath{\mat{\Sigma}}} \newcommand{\mPbar}{\bar{\mP}} \newcommand{\ones}{\vec{e}} \newcommand{\va}{\vec{a}} \newcommand{\vb}{\vec{b}} \newcommand{\vc}{\vec{c}} \newcommand{\vd}{\vec{d}} \newcommand{\ve}{\vec{e}} \newcommand{\vf}{\vec{f}} \newcommand{\vg}{\vec{g}} \newcommand{\vh}{\vec{h}} \newcommand{\vi}{\vec{i}} \newcommand{\vj}{\vec{j}} \newcommand{\vk}{\vec{k}} \newcommand{\vl}{\vec{l}} \newcommand{\vm}{\vec{l}} \newcommand{\vn}{\vec{n}} \newcommand{\vo}{\vec{o}} \newcommand{\vp}{\vec{p}} \newcommand{\vq}{\vec{q}} \newcommand{\vr}{\vec{r}} \newcommand{\vs}{\vec{s}} \newcommand{\vt}{\vec{t}} \newcommand{\vu}{\vec{u}} \newcommand{\vv}{\vec{v}} \newcommand{\vw}{\vec{w}} \newcommand{\vx}{\vec{x}} \newcommand{\vy}{\vec{y}} \newcommand{\vz}{\vec{z}} \newcommand{\vpi}{\vecalt{\pi}} \newcommand{\vlambda}{\vecalt{\lambda}}$

# Homework 5

Please answer the following questions in complete sentences in a clearly prepared manuscript and submit the solution by the due date on Blackboard (around Sunday, September 30th, 2018).

Remember that this is a graduate class. There may be elements of the problem statements that require you to fill in appropriate assumptions. You are also responsible for determining what evidence to include. An answer alone is rarely sufficient, but an overly verbose description is not required either. Use your judgment to focus your discussion on the most interesting pieces. The answer to "should I include 'something' in my solution?" will almost always be: Yes, if you think it helps support your answer.

## Problem 0: Homework checklist

• Please identify anyone, whether or not they are in the class, with whom you discussed your homework. This problem is worth 1 point, but on a multiplicative scale.

• Make sure you have included your source-code and prepared your solution according to the most recent Piazza note on homework submissions.

## Problem 1: Show that the Jacobi method converges on a diagonally dominant system.

One of the common classes of matrices is the class of diagonally dominant matrices. They have, or can be permuted to have, entries on the diagonal such that:

$$ |A_{ii}| \ge \sum_{j \not= i} |A_{ij}| \quad \text{for all rows } i. $$

Suppose that $\mA \vx = \vb$ is a diagonally dominant system. Show that the Jacobi iteration will converge to the solution when, for at least one row $k$, we have the strict inequality:

$$ |A_{kk}| > \sum_{j \not= k} |A_{kj}|. $$
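For intuition (not a proof), here is a small numerical sketch in Python/NumPy; the matrix below is a hypothetical strictly diagonally dominant example, not part of the assignment:

```python
import numpy as np

# Hypothetical example: |A[i,i]| > sum of |A[i,j]| over j != i in every row
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])
b = np.array([6.0, 8.0, 4.0])

# Jacobi: x_{k+1} = D^{-1} (b - (A - D) x_k), where D = diag(A)
d = np.diag(A)
x = np.zeros(3)
for _ in range(100):
    x = (b - (A @ x - d * x)) / d

print(np.allclose(A @ x, b))  # the iterates converge to the true solution
```

Here the infinity norm of the Jacobi iteration matrix $\mD^{-1}(\mA - \mD)$ is $0.6 < 1$, so the error shrinks geometrically.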

## Problem 2: Implement Gauss-Seidel and Jacobi for a sparse system of equations

For each iteration through all $n$ rows, you must use work proportional to the number of non-zeros in the matrix. Stop your iteration when the relative residual drops below $10^{-4}$.

Show how many iterations the methods take on the Candyland linear system of equations.
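As a sketch of the cost requirement in Python (assumptions: SciPy's CSR format stands in for whatever sparse structure you use, and the tridiagonal test matrix is a hypothetical stand-in, not the Candyland system):

```python
import numpy as np
import scipy.sparse as sp

def jacobi_sparse(A, b, tol=1e-4, maxit=10000):
    """Jacobi iteration; each sweep costs O(nnz(A)) via one sparse mat-vec."""
    d = A.diagonal()
    x = np.zeros_like(b)
    normb = np.linalg.norm(b)
    for it in range(maxit):
        r = b - A @ x                 # sparse mat-vec: work proportional to nnz
        if np.linalg.norm(r) / normb < tol:
            return x, it
        x = x + r / d                 # equivalent to x_{k+1} = D^{-1}(b - (A - D)x_k)
    return x, maxit

# hypothetical test problem: a sparse, diagonally dominant tridiagonal matrix
n = 100
A = sp.diags([-1.0, 3.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x, iters = jacobi_sparse(A, b)
print(iters, np.linalg.norm(b - A @ x) / np.linalg.norm(b) < 1e-4)
```

Gauss-Seidel needs a row-by-row sweep over the stored non-zeros instead of a single mat-vec, but the same $O(\text{nnz})$ accounting applies.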

## Problem 3: Eigenvalue Review

Prove or disprove!

1. The eigenvalues of an $n \times n$ real-valued matrix are always real.

2. The solution to $(\mA^T \mA + \gamma \eye) \vx = \vb$ is unique for any $\gamma > 0$.

3. Every $n \times n$ matrix can be written as the sum of two non-singular matrices.

4. An $n \times n$ real-valued matrix has an orthogonal set of eigenvectors.

5. An $n \times n$ matrix $\mA$ and its transpose $\mA^T$ have the same set of eigenvalues.

## Problem 4: The power method, and beyond!

We'll show that the humble power-method is really rather more interesting than we saw in class! It's a flexible starting point that serves as the basis for almost all of the eigenvalue algorithms.

1. Consider the power method written as Julia code:

 ```julia
 using LinearAlgebra
 # assumes the matrix A and a unit-norm starting vector x are already defined
 maxit = 1000
 lam = 0
 for i = 1:maxit
     x = A*x          # apply the matrix
     x = x./norm(x)   # renormalize
     lam = x'*A*x     # Rayleigh quotient estimate of the eigenvalue
 end
 ```


Let $\vx\itn{i}$ be the value of x at the start of the $i$th iteration of this loop, and let $\lambda\itn{i}$ be the value of lam at the start of the $i$th iteration. Suppose that $\vx$ and $\lambda$ are the true eigenvector, eigenvalue pair to which we are converging. If $\normof{\vx - \vx\itn{i}} = \eps$, show that $|\lambda - \lambda\itn{i}|$ is $O(\eps^2)$ (where the constant may depend on the matrix $\mA$).

2. Show that if $\mA$ is invertible and $\vx$ is an eigenvector of $\mA$ with eigenvalue $\lambda$, then $\vx$ is an eigenvector of $\mA^{-1}$. What is the eigenvalue?
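A quick numerical sanity check of this relationship in Python (the small symmetric matrix is a hypothetical example):

```python
import numpy as np

# hypothetical invertible symmetric matrix
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

lam, V = np.linalg.eigh(A)       # eigenvalues and orthonormal eigenvectors
x = V[:, 0]                      # eigenvector of A with eigenvalue lam[0]

y = np.linalg.solve(A, x)        # y = A^{-1} x, without forming the inverse
print(np.allclose(y, x / lam[0]))  # A^{-1} x equals (1/lambda) x
```

This is exactly the observation behind inverse iteration: applying $\mA^{-1}$ amplifies the eigenvector with the eigenvalue of smallest magnitude.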

## Problem 5: A matrix where Jacobi will not converge.

Show that the Jacobi iteration will not converge for any choice of $\mM$ for the following system.

## Problem 6: Find and read a proof.

Find and read two different proofs that Gauss-Seidel will converge for any symmetric positive definite system $\mA$. Give the references. List any steps that were unclear. Which proof do you prefer and why?